Azure Log Analytics Data Collector managed connector action dropping logs in Bicep-created custom table

Mitch Abel 5 Reputation points
2024-01-24T20:00:45.6266667+00:00

Recently we started using the managed connector for Azure Log Analytics Data Collector to send logs from our Consumption Logic Apps. We find this a lot more useful than the built-in logging, which tends to be noisy and inconsistent with tracked properties.

When a table in Log Analytics does not exist, it is created on first ingestion, which is fine most of the time, but we sometimes have a requirement to provision the table up front, which we can do in Bicep. However, if we create the table first and then send data to it, we get no error in the Logic App, but we see this error in the Log Analytics operations logs:

Data of type TEST_TABLE_BICEP was dropped: Custom log is V2, Workspace cannot be modified

After a bit of testing, I think the issue is that the managed connector uses the now-deprecated HTTP Data Collector API, even though the connector came out of preview not that long ago. Has anyone else struck this or found an elegant way around it? Ideally the product team would update the managed connector to use a non-deprecated API.

One workaround we found was to let ingestion create the table, migrate the table to non-classic in the portal, and then redeploy the table via Bicep, after which sending logs works. But we would prefer to have the table provisioned up front without having to wait for some logs to come in.
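
For reference, this is roughly how we provision the table up front (a minimal sketch with a placeholder workspace name and columns, not our real schema). As far as I can tell, a table created this way is DCR-based (non-classic), which seems to be exactly what the legacy Data Collector API refuses to write to:

```bicep
// Existing Log Analytics workspace (placeholder name).
resource workspace 'Microsoft.OperationalInsights/workspaces@2022-10-01' existing = {
  name: 'my-log-analytics-workspace'
}

// Custom table; the name must end in _CL and the schema must include TimeGenerated.
resource testTable 'Microsoft.OperationalInsights/workspaces/tables@2022-10-01' = {
  parent: workspace
  name: 'TEST_TABLE_BICEP_CL'
  properties: {
    plan: 'Analytics'
    schema: {
      name: 'TEST_TABLE_BICEP_CL'
      columns: [
        { name: 'TimeGenerated', type: 'datetime' }
        { name: 'Message', type: 'string' }
      ]
    }
  }
}
```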

1 answer

  1. Luis Arias 6,461 Reputation points
    2024-02-22T10:49:33.3333333+00:00

    Hi Mitch Abel, you're correct: the managed connector is using a deprecated API. A workaround could be to use the Logs Ingestion API instead:

    1. Create a Microsoft Entra application for API authentication.
    2. Create a Data Collection Endpoint (DCE), a unique connection point for your Azure subscription.
    3. Create a custom table in a Log Analytics workspace (the table you'll be sending data to), along with a Data Collection Rule (DCR) that routes incoming data to it.
    4. Give the Entra application access to the DCR. API calls authenticate with the Application (client) ID and Directory (tenant) ID of the application and the value of an application secret.
    5. Send data using the Logs Ingestion API. The data sent from the Logic App to the API must be formatted as JSON and match the structure expected by the DCR (a Bicep sketch of steps 2 to 4 follows this list):
      1. Add an HTTP action in your Logic App.
      2. Specify the HTTP method (POST), the URI for the Logs Ingestion API, and the headers and body content.
      3. Provide the Application (client) ID, Directory (tenant) ID, and the value of an application secret for authentication (best practice: retrieve the secret from Key Vault).
      4. Save and run your Logic App. The HTTP action will send the log data to the Logs Ingestion API.
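
    Steps 2 to 4 can also be declared up front in Bicep, which fits the deployment model from your question. This is a minimal sketch, not a definitive implementation: the resource names are placeholders, the stream/table name is taken from your question, and appPrincipalId (the object id of the application's service principal) is an assumed parameter:

    ```bicep
    param location string = resourceGroup().location
    // Object id of the Entra application's service principal (assumed parameter).
    param appPrincipalId string

    resource workspace 'Microsoft.OperationalInsights/workspaces@2022-10-01' existing = {
      name: 'my-log-analytics-workspace'
    }

    // Step 2: the Data Collection Endpoint the Logic App will post to.
    resource dce 'Microsoft.Insights/dataCollectionEndpoints@2022-06-01' = {
      name: 'dce-logicapps'
      location: location
      properties: {}
    }

    // Step 3: a Data Collection Rule declaring the incoming stream and routing it
    // to the custom table in the workspace.
    resource dcr 'Microsoft.Insights/dataCollectionRules@2022-06-01' = {
      name: 'dcr-logicapps'
      location: location
      properties: {
        dataCollectionEndpointId: dce.id
        streamDeclarations: {
          'Custom-TEST_TABLE_BICEP_CL': {
            columns: [
              { name: 'TimeGenerated', type: 'datetime' }
              { name: 'Message', type: 'string' }
            ]
          }
        }
        destinations: {
          logAnalytics: [
            { workspaceResourceId: workspace.id, name: 'law' }
          ]
        }
        dataFlows: [
          {
            streams: [ 'Custom-TEST_TABLE_BICEP_CL' ]
            destinations: [ 'law' ]
            outputStream: 'Custom-TEST_TABLE_BICEP_CL'
          }
        ]
      }
    }

    // Step 4: grant the application Monitoring Metrics Publisher on the DCR
    // so it is allowed to send data through it.
    resource dcrRole 'Microsoft.Authorization/roleAssignments@2022-04-01' = {
      name: guid(dcr.id, appPrincipalId, 'monitoring-metrics-publisher')
      scope: dcr
      properties: {
        roleDefinitionId: subscriptionResourceId('Microsoft.Authorization/roleDefinitions', '3913510d-42f4-4e42-8a64-420c390055eb')
        principalId: appPrincipalId
        principalType: 'ServicePrincipal'
      }
    }
    ```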

    I use this approach to send logs from other resources to custom tables and it works perfectly fine.
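
    For step 5, since you already deploy with Bicep, the workflow and its HTTP action can be declared the same way. Again a sketch under assumptions: the parameter names are mine, the stream name matches the DCR above, and in practice the client secret should come from Key Vault rather than a plain parameter:

    ```bicep
    param location string = resourceGroup().location
    param tenantId string = subscription().tenantId
    param clientId string
    @secure()
    param clientSecret string
    // Assumed inputs: the DCE's logs ingestion endpoint and the DCR's immutable id.
    param dceIngestionUri string
    param dcrImmutableId string

    resource sendLogsWorkflow 'Microsoft.Logic/workflows@2019-05-01' = {
      name: 'la-send-logs'
      location: location
      properties: {
        definition: {
          '$schema': 'https://schema.management.azure.com/providers/Microsoft.Logic/schemas/2016-06-01/workflowdefinition.json#'
          contentVersion: '1.0.0.0'
          triggers: {
            manual: { type: 'Request', kind: 'Http' }
          }
          actions: {
            Send_to_Logs_Ingestion_API: {
              type: 'Http'
              runAfter: {}
              inputs: {
                method: 'POST'
                // Logs Ingestion API: {dce}/dataCollectionRules/{immutableId}/streams/{streamName}
                uri: '${dceIngestionUri}/dataCollectionRules/${dcrImmutableId}/streams/Custom-TEST_TABLE_BICEP_CL?api-version=2023-01-01'
                headers: { 'Content-Type': 'application/json' }
                // The body is a JSON array matching the DCR's stream declaration.
                body: [
                  {
                    TimeGenerated: '@{utcNow()}'
                    Message: 'Hello from the Logic App'
                  }
                ]
                authentication: {
                  type: 'ActiveDirectoryOAuth'
                  tenant: tenantId
                  audience: 'https://monitor.azure.com'
                  clientId: clientId
                  secret: clientSecret
                }
              }
            }
          }
          outputs: {}
        }
      }
    }
    ```

    If the DCE and DCR live in the same template, dce.properties.logsIngestion.endpoint and dcr.properties.immutableId can be passed straight into those two parameters.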


    I hope this helps you.

    Luis

    If the information helped address your question, please Accept the answer.

