Azure Data Explorer ingestion error

Sergey 0 Reputation points

Following a recent update on October 24, 2023, we encountered a substantial increase in ingestion errors. This issue affected both of our connectors, namely IotHub and EventHub.

To illustrate these ingestion problems, we have provided a graph below. The blue line represents successful ingestion counts, while the orange line represents failures. The timestamps are in the EST timezone (04:00 - 04:28 UTC).

[Graph omitted: image.png]

We found the following event in the logs (additional fields omitted for conciseness):

"channels": "Operation",
"correlationId": "89a39032-7e53-45d9-b295-053ada8ae8f4",
"description": "Version deployment on the Azure Data Explorer cluster",
"eventTimestamp": "2023-10-24T08:20:53.7074828Z"

Error returned by the .show ingestion failures command:

BadRequest_InvalidBlob: hr: '2161770525' 'ExpectedObjectKey at character 499 ('}')'

We identified two breaking changes introduced by this update:

  1. Trailing commas are no longer permitted. The previous version accepted payloads like this:
     {"following_comma":"was previously accepted",}
  2. Newlines within a record are also no longer accepted, which poses a significant problem for the devices and services that are currently deployed:
     "this":"is a valid json, but won't work"
Azure Data Explorer
An Azure data analytics service for real-time analysis on large volumes of data streaming from sources including applications, websites, and internet of things devices.

1 answer

  1. Sander van de Velde 25,581 Reputation points MVP

    Hello @Sergey,

    welcome to this moderated Azure community forum.

    I'm sorry to read that the changed ingestion constraints are causing these ingestion errors.

    Although I understand how the message is constructed, the trailing comma is not valid JSON.

    JSON ingestion supports only valid JSON, so this conversion/mapping will not work.

    If you are not able to change the logic on the devices, you could fix it by adding custom logic between the gateway and Azure Data Explorer (e.g., an Azure Function or Azure Stream Analytics job).
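    As a minimal sketch of such in-between logic (the function name and regex approach are my own assumptions, not an official recipe): strip trailing commas, then round-trip through a strict parser so the payload is re-emitted as single-line RFC 8259 JSON before it reaches the cluster.

    ```python
    import json
    import re

    # Hypothetical helper: remove a comma that directly precedes a closing
    # brace or bracket. Note: this regex would also alter ",}" or ",]"
    # occurring inside string values, so it suits payloads where that
    # cannot appear.
    _TRAILING_COMMA = re.compile(r',\s*([}\]])')

    def sanitize_payload(raw: str) -> str:
        cleaned = _TRAILING_COMMA.sub(r'\1', raw)
        # Round-trip through a strict parser: validates the payload and
        # collapses any embedded newlines into a single-line record.
        return json.dumps(json.loads(cleaned), separators=(',', ':'))
    ```

    An Azure Function between the gateway and the cluster could apply this to each message before forwarding; a safer variant would use a lenient JSON parser instead of a regex.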

    Otherwise, you could ingest the raw data into a dynamic column in a 'source' table and use a Table Update Policy to transform the message.

    If the response helped, do "Accept Answer". If it doesn't work, please let us know the progress. All community members with similar issues will benefit by doing so. Your contribution is highly appreciated.

    1 person found this answer helpful.