Hello Josh Madakor
The issue occurs because Azure Monitor custom JSON logs via DCR only ingest data when each log entry is a valid, single‑line JSON object (JSONL) and matches the DCR/table schema requirements. If the JSON file does not meet these requirements, Azure Monitor Agent silently skips ingestion (no explicit error is shown in the portal).
Below are the reasons that might be blocking ingestion:
- JSONL format requirement
  - Azure Monitor does not support JSON arrays or multi-line JSON objects for file-based ingestion.
  - Each line in the file must be a standalone JSON object.
  - Any invalid lines are silently skipped.
- `TimeGenerated` requirement
  - A `TimeGenerated` field is mandatory for custom tables.
  - If the JSON does not contain it, ingestion will not occur unless a DCR transformation adds it.
- Schema mismatch causes silent drops
  - If JSON properties or data types do not exactly match the custom table schema, records are ignored without error.
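To sanity-check a log file against the first requirement before wiring up the DCR, you can verify that every line parses as a standalone JSON object. Here is a minimal Python sketch (the sample lines are illustrative, not from an actual agent log):

```python
import json

def check_jsonl(lines):
    """Count lines that are valid standalone JSON objects vs. lines
    the agent would skip (arrays, scalars, or unparseable fragments)."""
    valid, invalid = 0, 0
    for line in lines:
        line = line.strip()
        if not line:
            continue  # blank lines carry no record
        try:
            obj = json.loads(line)
            # Must be a single JSON object, not an array or scalar.
            if isinstance(obj, dict):
                valid += 1
            else:
                invalid += 1
        except json.JSONDecodeError:
            invalid += 1
    return valid, invalid

# One valid JSONL entry, one array (unsupported), one multi-line fragment.
sample = [
    '{"Time": "2024-05-01T12:00:00Z", "Message": "ok"}',
    '[{"Message": "arrays are not supported"}]',
    '{"Message":',
]
print(check_jsonl(sample))  # → (1, 2)
```

Running this over your real file (e.g. `check_jsonl(open(path, encoding="utf-8"))`) quickly shows whether any lines would be dropped.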
As a resolution, try the workarounds below:
1. Create a minimal custom table first
   - Columns:
     - `TimeGenerated` (datetime)
     - `RawData` (string)
2. Use “Custom JSON Logs” as the DCR data source
   - Not Custom Text Logs
   - The file path must be exact (Windows path + correct wildcard)
3. Add a transformation to guarantee `TimeGenerated`:

   ```kusto
   source
   | extend TimeGenerated = todatetime(now())
   | extend RawData = tostring(source)
   ```

4. Ensure the log file is true JSONL
   - One valid JSON object per line
   - UTF-8 encoded
   - New lines appended after DCR association (the agent does not back-read old data)
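If your application writes the log file itself, the safest way to satisfy step 4 is to serialize each record with a standard JSON library and append it as one UTF-8 line. A small Python sketch (the file path and record fields here are hypothetical; point it at the file your DCR watches):

```python
import json

def append_log_entry(path, record):
    """Append one record as a single UTF-8 JSON line (JSONL).
    json.dumps with default settings never emits newlines, so each
    record stays on exactly one line, as the agent requires."""
    with open(path, "a", encoding="utf-8") as f:
        f.write(json.dumps(record) + "\n")

# Hypothetical path and fields for illustration.
append_log_entry(
    "app_log.json",
    {"Time": "2024-05-01T12:00:00Z", "Level": "INFO", "Message": "heartbeat"},
)
```

Because the file is opened in append mode, new lines land after the DCR association and are picked up by the agent.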
Once these requirements were met, ingestion started successfully and data appeared in the custom table.
You can also refer to the documents below for more details:
- https://learn.microsoft.com/en-us/azure/azure-monitor/vm/data-collection-log-json
- https://learn.microsoft.com/azure/azure-monitor/logs/ingestion-time-transformations
Thanks,
Suchitra.