Is it possible to directly find errors in Application Insights correlated to a Service Bus message ID via code?

Alexander Fritsch 1 Reputation point

What we have:
A service bus topic with dead lettered messages and Application Insights connected to the Service Bus.

What we want to do:
Find the errors/logs (specifically: Transaction diagnostics) correlated to that message by the message ID, preferably. Is there a way to do this? (e.g. Telemetry Client or some REST API call)

Something like this maybe:

var client = someClient(maybeAConnectionString);
var listOfErrors = client.GetErrors(messageId);

After reading through a bunch of documentation from Microsoft, I'm more confused than led in the right direction.

Side info:
We're using C#. Deprecated packages are discouraged, so a solution featuring up-to-date packages would be good.


2 answers

  1. MughundhanRaveendran-MSFT 12,451 Reputation points

    Hi @Alexander Fritsch ,

    Thanks for reaching out to Q&A forum.

    It appears that you are looking for a way to track the requests and find the logs/errors by filtering on a message ID. It is possible to find the errors that correlate to an ID; however, I don't see a way to filter based on the message ID. It can be traced and filtered using the Diagnostic Id instead. This ID is a unique identifier of an external call from producer to the queue. Refer to the W3C Trace-Context traceparent header for the format.

    You can trace message processing based on the Diagnostic Id; please refer to:

    You can query the requests table in the Log Analytics workspace via a Kusto query to find errors by the diagnostic ID. A sample query is shown in the article above.
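    Since you asked for a C# solution, that query can be sketched with the current Azure.Monitor.Query and Azure.Identity packages. This is a sketch only: the workspace ID is a placeholder, and the `'Diagnostic-Id'` key in `customDimensions` is an assumption that may differ in your setup.

    ```csharp
    using System;
    using System.Threading.Tasks;
    using Azure.Identity;
    using Azure.Monitor.Query;

    public static class DiagnosticIdLookup
    {
        // Builds the Kusto query; the 'Diagnostic-Id' custom-dimension key is an assumption.
        public static string BuildQuery(string diagnosticId) =>
            $"requests | where tostring(customDimensions['Diagnostic-Id']) == '{diagnosticId}'";

        public static async Task QueryAsync(string workspaceId, string diagnosticId)
        {
            var client = new LogsQueryClient(new InteractiveBrowserCredential());
            var result = await client.QueryWorkspaceAsync(
                workspaceId,
                BuildQuery(diagnosticId),
                new QueryTimeRange(TimeSpan.FromDays(3))); // limit the scan window
            // Inspect result.Value.Table.Rows for the correlated request entries.
        }
    }
    ```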

    I hope this helps! Feel free to reach out to me if you have any further queries or concerns.

    Please 'Accept as answer' and 'Upvote' if it helped, so that it can help others in the community looking for help on similar topics.


  2. Alexander Fritsch 1 Reputation point

    We found a way that does not use the Diagnostic IDs that @MughundhanRaveendran-MSFT suggested.

    Since our goal was to change as little as possible about our setup and simply attach a tool to run some Insights queries, tracking something new was out of the question, since that would have meant changing the actual system.

    The "smoothest" way we found was converting Application Insights to workspace-based, since that allowed us to use Log Analytics. The traces are put into the "AppTraces" table, the correlated exceptions into the "AppExceptions" table. There is an article from Microsoft showing which properties of the legacy Application Insights tables map to which properties of the new Log Analytics tables. While it is definitely a good solution in the long run, as workspace-based Application Insights will become the standard, it will have to be tested thoroughly before we actually use it.

    For authentication, we went with InteractiveBrowserCredential, since that was the easiest to use after specifying the tenant ID we wanted.
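    A minimal sketch of that credential setup, assuming the Azure.Identity package (the tenant ID is a placeholder):

    ```csharp
    using Azure.Identity;

    // Opens a browser window for an interactive login against the given tenant.
    var credential = new InteractiveBrowserCredential(
        new InteractiveBrowserCredentialOptions
        {
            TenantId = "<your-tenant-id>"
        });
    ```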

    For querying, we used the LogsBatchQuery with two separate queries:

    $"AppTraces | where TimeGenerated between (datetime({dayEarlier}) .. 3d) | where Message has \"{messageId}\""
    for getting the traces; from these we take the Invocation ID out of the Properties field to use in the next query. Limiting the time frame was important for the query to run at reasonable speed.

    $"AppExceptions | where TimeGenerated between (datetime({dayEarlier}) .. 3d) | where Properties has \"{invocationId}\""
    which is where we actually get the exceptions from. We then put it all together into a JSON file per message and could read that out.
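    Putting the two queries together, the batch call can be sketched with Azure.Monitor.Query as below. The workspace ID, the time window, and how exactly the Invocation ID is pulled out of the Properties column are assumptions, not the exact production code.

    ```csharp
    using System;
    using System.Threading.Tasks;
    using Azure;
    using Azure.Identity;
    using Azure.Monitor.Query;
    using Azure.Monitor.Query.Models;

    public static class MessageErrorLookup
    {
        // The two queries from above; dayEarlier bounds the scan to a 3-day window.
        public static string TracesQuery(DateTimeOffset dayEarlier, string messageId) =>
            $"AppTraces | where TimeGenerated between (datetime({dayEarlier:o}) .. 3d) | where Message has \"{messageId}\"";

        public static string ExceptionsQuery(DateTimeOffset dayEarlier, string invocationId) =>
            $"AppExceptions | where TimeGenerated between (datetime({dayEarlier:o}) .. 3d) | where Properties has \"{invocationId}\"";

        public static async Task<LogsTable> QueryTracesAsync(
            string workspaceId, string messageId, DateTimeOffset dayEarlier)
        {
            var client = new LogsQueryClient(new InteractiveBrowserCredential());
            var batch = new LogsBatchQuery();
            string tracesId = batch.AddWorkspaceQuery(
                workspaceId, TracesQuery(dayEarlier, messageId), QueryTimeRange.All);
            Response<LogsBatchQueryResultCollection> response = await client.QueryBatchAsync(batch);
            // Read the Invocation ID from the Properties column of these rows,
            // then run ExceptionsQuery the same way in a second batch.
            return response.Value.GetResult(tracesId).Table;
        }
    }
    ```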