Creating multiple Data Collection Rules to return different files

Omar Navarro 326 Reputation points
2022-11-21T17:05:30.337+00:00

There are two text files containing log data on an Ubuntu machine. The intention is to send both of them via a data collection rule.

Is the recommended approach to apply two distinct data collection rules to an Arc-enabled device linked to the same Data Collection Endpoint?

Create data collection rule


Accepted answer
    LeelaRajeshSayana-MSFT 13,081 Reputation points
    2022-11-22T00:32:21.84+00:00

    Hi @Omar Navarro, thank you for posting this question.

    Apologies for the delayed feedback on this. I have investigated the implementation of the Data Collection Rule further and found that it is possible to create a single rule to push data from both text files to the same Log Analytics endpoint. Please refer to the Data collection rule for text log section in the article, which provides a custom template sample showing multiple data sources bound to one endpoint. Please find part of the deployment snippet below:

    "dataSources": {  
        "logFiles": [  
            {  
                "streams": [  
                    "Custom-MyLogFileFormat"  
                ],  
                "filePatterns": [  
                    "C:\\JavaLogs\\*.log"  
                ],  
                "format": "text",  
                "settings": {  
                    "text": {  
                        "recordStartTimestampFormat": "ISO 8601"  
                    }  
                },  
                "name": "myLogFileFormat-Windows"  
            },  
            {  
                "streams": [  
                    "Custom-MyLogFileFormat"  
                ],  
                "filePatterns": [  
                    "//var//*.log"  
                ],  
                "format": "text",  
                "settings": {  
                    "text": {  
                        "recordStartTimestampFormat": "ISO 8601"  
                    }  
                },  
                "name": "myLogFileFormat-Linux"  
            }  
        ]  
    },  
    "destinations": {  
        "logAnalytics": [  
            {  
                "workspaceResourceId": "[parameters('workspaceResourceId')]",  
                "name": "[parameters('workspaceName')]"  
            }  
        ]  
    },  
    "dataFlows": [  
        {  
            "streams": [  
                "Custom-MyLogFileFormat"  
            ],  
            "destinations": [  
                "[parameters('workspaceName')]"  
            ],  
            "transformKql": "source",  
            "outputStream": "Custom-MyTable_CL"  
        }  
    ]  
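
    Regarding the Data Collection Endpoint part of your question: a single rule is tied to a single endpoint through the dataCollectionEndpointId property on the rule resource, so both file sources above flow through the same endpoint. The fragment below is only a rough sketch of that shape; the parameter names are hypothetical, and the empty dataSources, destinations and dataFlows sections stand in for the blocks shown above.

    {
        "type": "Microsoft.Insights/dataCollectionRules",
        "apiVersion": "2021-09-01-preview",
        "name": "[parameters('dataCollectionRuleName')]",
        "location": "[parameters('location')]",
        "properties": {
            "dataCollectionEndpointId": "[parameters('dataCollectionEndpointResourceId')]",
            "dataSources": {},
            "destinations": {},
            "dataFlows": []
        }
    }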
    

    Kindly note that, for this to work, the log files would have to follow a similar structure so that they can be transformed with a single "transformKql" before they are written to the destination table.
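
    For illustration only, a slightly more involved transform could look like the sketch below. It assumes, purely as an example, that the incoming text stream exposes a RawData column and that each log line in both files begins with a severity token followed by a space; the destination table (MyTable_CL here) would then need a matching Severity column.

    "dataFlows": [
        {
            "streams": [
                "Custom-MyLogFileFormat"
            ],
            "destinations": [
                "[parameters('workspaceName')]"
            ],
            "transformKql": "source | where RawData !startswith '#' | extend Severity = tostring(split(RawData, ' ')[0])",
            "outputStream": "Custom-MyTable_CL"
        }
    ]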

    Please factor in the Quotas and Limits and the Design limitations and considerations as you create and use the data collection rule.

    Please reach out to us if you have any further questions or need clarification on the information posted.

    ----------

    Kindly accept answer or upvote if the response is helpful so that it would benefit other community members facing the same issue. I highly appreciate your contribution to the community.

