How to view failed requests when uploading data via log ingestion endpoint?

42726446 40 Reputation points
2024-06-20T12:49:57.85+00:00

I've been following your tutorials on uploading data to a log analytics workspace:

  1. https://learn.microsoft.com/en-us/azure/azure-monitor/logs/tutorial-logs-ingestion-api?tabs=dcr#create-new-table-in-log-analytics-workspace
  2. https://learn.microsoft.com/en-us/azure/azure-monitor/logs/tutorial-logs-ingestion-code?tabs=python

However, I noticed that when I upload logs that don't exactly match the DCR's column definitions (through a DCE, using the provided Python script), the script doesn't raise an HttpResponseError, but the logs never make it to the Log Analytics workspace.

Is there any way for me to check for errors in the DCE->DCR->workspace pipeline? Right now the only way to tell whether I made a mistake is to wait about 10 minutes and, if the logs don't show up, assume I did something wrong.
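
For reference, my upload code is essentially the tutorial's. A trimmed-down version looks roughly like this (the endpoint, DCR immutable ID, and stream name below are placeholders):

    from azure.core.exceptions import HttpResponseError
    from azure.identity import DefaultAzureCredential
    from azure.monitor.ingestion import LogsIngestionClient

    # Placeholders -- the real values come from my DCE and DCR.
    endpoint = "https://my-dce.westeurope-1.ingest.monitor.azure.com"
    rule_id = "dcr-00000000000000000000000000000000"   # DCR immutable ID
    stream_name = "Custom-MyTable_CL"                   # stream declared in the DCR

    client = LogsIngestionClient(endpoint=endpoint, credential=DefaultAzureCredential())

    logs = [{"Time": "2024-06-20T12:00:00Z", "Computer": "vm-1", "AdditionalContext": "sample"}]

    try:
        client.upload(rule_id=rule_id, stream_name=stream_name, logs=logs)
    except HttpResponseError as e:
        # This never fires when the records merely don't match the DCR's columns --
        # the call succeeds, but the data never shows up in the workspace.
        print(f"Upload failed: {e}")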

Azure Monitor

1 answer

  1. Monalla-MSFT 12,761 Reputation points
    2024-06-20T15:21:42.7333333+00:00

    @42726446 - Welcome to Microsoft Q&A and thanks for reaching out to us.

    When you send data to a Log Analytics workspace through a DCR, the data must match the schema defined in the DCR. If the data does not match the schema, it will not be ingested into the workspace.
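
    To make that concrete, here is a minimal sketch. The column names are hypothetical (modeled on the tutorial's sample table); the point is only the difference between a record that matches the stream declaration and one that does not:

        # Suppose the DCR's stream (e.g. "Custom-MyTable_CL") declares the columns
        # Time (datetime), Computer (string) and AdditionalContext (string).

        # This record matches the declared columns, so it flows through the
        # DCE -> DCR -> workspace pipeline as expected:
        matching_record = {
            "Time": "2024-06-20T12:00:00Z",
            "Computer": "vm-1",
            "AdditionalContext": "sample context",
        }

        # This record uses a column the stream does not declare ("Host"), so it
        # does not end up in the workspace as intended -- yet the upload call
        # still returns success, which is why you never see an HttpResponseError.
        mismatched_record = {
            "Time": "2024-06-20T12:00:00Z",
            "Host": "vm-1",
            "AdditionalContext": "sample context",
        }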

    To check for errors in the pipeline, you can use the Azure Monitor Logs REST API to query the Operation table in your workspace. It contains error messages and warnings that occur in your workspace. You can also create alerts for issues with a level of Warning or Error.

    Here is an example of how to query the Operation table using the Azure Monitor Logs REST API:

        POST https://api.loganalytics.io/v1/workspaces/{workspaceId}/query
        Content-Type: application/json
        Authorization: Bearer {token}

        {
          "query": "Operation | where TimeGenerated > ago(1d) | project TimeGenerated, Category, Operation, Level, Detail"
        }

    Replace {workspaceId} with the ID of your workspace and {token} with an Azure AD access token for an identity that has the Log Analytics Reader role on your workspace.

    This query will return all the records in the Operation table for the last day. You can modify the query to filter for specific categories or levels of issues.
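
    If you would rather check this from the same Python environment as your upload script, the azure-monitor-query package can run an equivalent query. A minimal sketch, assuming a placeholder workspace ID and a simplified query that just pulls the last day of Operation records:

        from datetime import timedelta

        from azure.identity import DefaultAzureCredential
        from azure.monitor.query import LogsQueryClient, LogsQueryStatus

        workspace_id = "00000000-0000-0000-0000-000000000000"   # placeholder workspace ID

        client = LogsQueryClient(DefaultAzureCredential())

        # Pull the last day of Operation records; add a where clause once you
        # know what you are filtering on.
        response = client.query_workspace(
            workspace_id=workspace_id,
            query="Operation | where TimeGenerated > ago(1d)",
            timespan=timedelta(days=1),
        )

        if response.status == LogsQueryStatus.SUCCESS:
            for table in response.tables:
                for row in table.rows:
                    print(row)
        else:
            # Partial results -- inspect what interrupted the query.
            print(response.partial_error)

    The identity behind DefaultAzureCredential still needs the Log Analytics Reader role on the workspace, just like the token in the REST call above.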

    Hope this helps, and please feel free to reach out if you have any further questions.


    Please don't forget to "Accept as Answer" and click "Yes" if the above response is helpful, so it can be beneficial to the community.