Having trouble ingesting data into LAW from large custom log file via DCR

Josh Madakor 30 Reputation points
2026-02-09T05:39:47.6666667+00:00

I want to ingest this log file into a Log Analytics workspace. It has a .json extension, but the contents are actually JSONL. I have tried to create the table with an appropriate schema, configure the DCR, and put the file on a virtual machine (windows-target-1) to ingest, but it simply does not ingest into the table. Would you be able to help me with this? Maybe create the script for the table creation or DCR config or whatever we need to do? Here is the log file:

https://drive.google.com/file/d/1C392QOI20p1fyUD1rxCdD4bcPuF39XlC/view?usp=sharing

Azure Monitor

An Azure service that is used to collect, analyze, and act on telemetry data from Azure and on-premises environments.

2 answers

  1. Suchitra Suregaunkar 9,505 Reputation points Microsoft External Staff Moderator
    2026-02-09T06:23:59.4433333+00:00

    Hello Josh Madakor

    The issue occurs because Azure Monitor custom JSON logs via DCR only ingest data when each log entry is a valid, single‑line JSON object (JSONL) and matches the DCR/table schema requirements. If the JSON file does not meet these requirements, Azure Monitor Agent silently skips ingestion (no explicit error is shown in the portal).

    Below are the reasons that might be blocking the ingestion:

    1. JSONL format requirement
      • Azure Monitor does not support JSON arrays or multi‑line JSON objects for file‑based ingestion.
      • Each line in the file must be a standalone JSON object.
      • If even some lines are invalid, they are skipped.
    2. TimeGenerated requirement
      • A TimeGenerated field is mandatory for custom tables.
      • If the JSON does not contain it, ingestion will not occur unless a DCR transformation adds it.
    3. Schema mismatch causes silent drop
      • If JSON properties or data types do not exactly match the custom table schema, records are ignored without error.
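    The format requirement in point 1 is easy to check mechanically. The sketch below (a standalone helper, not part of any Azure tooling; the function name is my own) reads a file and reports every line that is not a standalone JSON object, which is exactly what the agent expects for file-based JSON ingestion:

    ```python
    import json

    def validate_jsonl(path):
        """Check that every non-empty line in `path` is a standalone JSON object.

        Returns a list of (line_number, error) tuples; an empty list means
        the file is valid JSONL.
        """
        errors = []
        with open(path, encoding="utf-8") as f:
            for lineno, line in enumerate(f, start=1):
                line = line.strip()
                if not line:
                    continue  # blank lines carry no record
                try:
                    obj = json.loads(line)
                except json.JSONDecodeError as exc:
                    errors.append((lineno, str(exc)))
                    continue
                if not isinstance(obj, dict):
                    # arrays, strings, numbers etc. are valid JSON but not log records
                    errors.append((lineno, "not a JSON object"))
        return errors
    ```

    Running this against the log file before associating the DCR tells you up front whether the agent will skip lines silently.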

    As a resolution, try the workarounds below:

    1. Create a minimal custom table first
      • Columns:
        • TimeGenerated (datetime)
        • RawData (string)
    2. Use “Custom JSON Logs” as the DCR data source
      • Not Custom Text Logs
      • File path must be exact (Windows path + correct wildcard)
    3. Add a transformation in the DCR to guarantee TimeGenerated

    source
    | extend TimeGenerated = now()
    | extend RawData = tostring(RawData)

    4. Ensure the log file is true JSONL

    • One valid JSON object per line
    • UTF‑8 encoded
    • New lines appended after DCR association (agent does not back‑read old data)

    Once these are aligned, ingestion should start and data will appear in the custom table.
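    If the file turns out to be a single JSON array rather than true JSONL, it can be rewritten line-by-line before the agent picks it up. A minimal sketch (the function name is my own; it assumes the source file holds one top-level JSON array):

    ```python
    import json

    def json_array_to_jsonl(src, dst):
        """Rewrite a file containing one JSON array as JSONL: one object per line."""
        with open(src, encoding="utf-8") as f:
            records = json.load(f)  # expects a top-level JSON array
        with open(dst, "w", encoding="utf-8", newline="\n") as out:
            for rec in records:
                # ensure_ascii=False keeps the file as plain UTF-8
                out.write(json.dumps(rec, ensure_ascii=False) + "\n")
    ```

    Remember the agent only tails new lines, so write the converted file into the monitored path (or append to it) after the DCR association is in place.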

    You can also refer to the documents below for more details:

    1. https://learn.microsoft.com/en-us/azure/azure-monitor/vm/data-collection-log-json
    2. https://learn.microsoft.com/azure/azure-monitor/logs/ingestion-time-transformations

    Thanks,

    Suchitra.


  2. Q&A Assist
    2026-02-09T05:39:56.41+00:00
    AI generated content. This question contains an answer created with AI from Q&A Assist.

    To ingest a JSONL file into a Log Analytics workspace using a Data Collection Rule (DCR), ensure that your setup adheres to the following guidelines:

    1. File Format: Make sure your JSONL file is correctly formatted. Each entry must be a single line of JSON (JSON Line format), and the file should be stored on the local drive of the agent machine in the directory being monitored.
    2. DCR Configuration: When creating the DCR, ensure that you specify the correct data source type as Custom (Log files) and that the stream declaration follows the format Custom-<TableName>. You can only add a single workspace to a DCR for a custom JSON log data source.
    3. Table Creation: If you haven't already, create the custom table in your Log Analytics workspace. The table must have the appropriate schema to match the data being ingested. For JSONL, you typically need columns that correspond to the fields in your JSON.
    4. Permissions: Ensure that you have at least the Contributor role on the relevant Log Analytics workspace to perform these actions.
    5. Monitoring and Troubleshooting: If the ingestion does not occur, check the following:
      • Verify that the DCR is correctly configured and associated with the right Log Analytics workspace.
      • Ensure the JSONL file is being monitored by the Azure Monitor agent.
      • Check for any error messages in the Azure portal that might indicate what went wrong.
      • Refresh the Azure portal to ensure all settings are updated.

    If you continue to experience issues, consider using a PowerShell script to create the custom table and configure the DCR, following the steps outlined in the Azure documentation for creating custom tables and data collection rules.
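    As a sketch of what that scripted table creation looks like, the snippet below builds the request body for the Log Analytics Tables REST API (`PUT .../tables/{tableName}?api-version=2022-10-01`). The subscription, resource group, workspace, and table names are placeholders, and the actual authenticated PUT call (e.g. via the Azure SDK or `az rest`) is left out; this only shows the payload shape:

    ```python
    import json

    # Placeholders -- substitute your own values.
    SUBSCRIPTION = "<subscription-id>"
    RESOURCE_GROUP = "<resource-group>"
    WORKSPACE = "<workspace-name>"
    TABLE = "MyCustomLogs_CL"  # custom table names must end in _CL

    def build_table_payload(columns):
        """Build the body for the Tables Create Or Update REST call.

        `columns` is a list of (name, type) pairs, e.g. ("TimeGenerated", "datetime").
        """
        return {
            "properties": {
                "schema": {
                    "name": TABLE,
                    "columns": [{"name": n, "type": t} for n, t in columns],
                }
            }
        }

    # Minimal schema matching the two-column table suggested above.
    payload = build_table_payload([("TimeGenerated", "datetime"), ("RawData", "string")])
    print(json.dumps(payload, indent=2))
    ```

    The payload would then be sent to `https://management.azure.com/subscriptions/{SUBSCRIPTION}/resourceGroups/{RESOURCE_GROUP}/providers/Microsoft.OperationalInsights/workspaces/{WORKSPACE}/tables/{TABLE}?api-version=2022-10-01` with a bearer token, which is what the PowerShell examples in the Azure documentation do under the hood.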

