@42726446 - Welcome to Microsoft Q&A and thanks for reaching out to us.
When you send data to a Log Analytics workspace through a data collection rule (DCR), the data must match the schema defined in the DCR; records that do not match the schema are not ingested into the workspace.
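For reference, here is a minimal sketch of sending records that line up with a DCR stream, assuming you use the Logs Ingestion API via the `azure-monitor-ingestion` and `azure-identity` Python packages; the endpoint, rule ID, stream name, and field names below are placeholders you would replace with your own:

```python
from azure.identity import DefaultAzureCredential
from azure.monitor.ingestion import LogsIngestionClient

credential = DefaultAzureCredential()
client = LogsIngestionClient(
    endpoint="https://<your-dce>.ingest.monitor.azure.com",  # placeholder: your data collection endpoint
    credential=credential,
)

# Each record's keys must line up with the columns declared for the stream in the DCR.
client.upload(
    rule_id="<dcr-immutable-id>",      # placeholder: the DCR's immutable ID
    stream_name="Custom-MyTable_CL",   # placeholder: the stream declared in the DCR
    logs=[{"TimeGenerated": "2024-01-01T00:00:00Z", "Message": "test record"}],
)
```

If those field names or types drift from the stream's declared columns, that is the first thing to check when data goes missing.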
To check for errors in the ingestion pipeline, you can query the Operation table in your workspace using the Azure Monitor Logs REST API. This table contains the error messages and warnings raised in your workspace, and you can create alerts on records with a Level of Warning or Error.
Here is an example of how to query the Operation table using the Azure Monitor Logs REST API:
```
POST https://api.loganalytics.io/v1/workspaces/{workspaceId}/query
Content-Type: application/json
Authorization: Bearer {token}

{
  "query": "Operation | where TimeGenerated > ago(1d) | project TimeGenerated, Category, Operation, Level, Detail"
}
```
Replace `{workspaceId}` with the ID of your workspace and `{token}` with an Azure AD access token for an identity that has the Log Analytics Reader role on the workspace.
This query will return all the records in the Operation table for the last day. You can modify the query to filter for specific categories or levels of issues.
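If you would rather run the same check from code than call the REST endpoint by hand, here is a rough sketch using the `azure-monitor-query` and `azure-identity` Python packages (DefaultAzureCredential handles the token acquisition); the workspace ID is a placeholder, and the columns follow the query above, narrowed to Warning and Error records:

```python
from datetime import timedelta

from azure.identity import DefaultAzureCredential
from azure.monitor.query import LogsQueryClient, LogsQueryStatus

credential = DefaultAzureCredential()  # handles Azure AD token acquisition
client = LogsQueryClient(credential)

# Same query as above, narrowed to Warning and Error records.
query = """
Operation
| where Level in ('Warning', 'Error')
| project TimeGenerated, Category, Operation, Level, Detail
"""

response = client.query_workspace(
    workspace_id="<your-workspace-id>",  # placeholder: your workspace GUID
    query=query,
    timespan=timedelta(days=1),          # same window as ago(1d) above
)

if response.status == LogsQueryStatus.SUCCESS:
    for table in response.tables:
        for row in table.rows:
            print(row)
else:
    # Partial results come back with an error describing what went wrong.
    print(response.partial_error)
```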
Hope this helps, and please feel free to reach out if you have any further questions.
Please don't forget to "Accept as Answer" and click "Yes" if the above response is helpful, so it can be beneficial to the community.