Sample data collection rules (DCRs) in Azure Monitor
This article includes sample data collection rules (DCRs) for different scenarios. For descriptions of each of the properties in these DCRs, see Data collection rule structure.
Note
These samples provide the source JSON of a DCR to use when you create or modify a DCR with an ARM template or the REST API. After creation, the DCR has additional properties, as described in Structure of a data collection rule in Azure Monitor.
Azure Monitor agent - events and performance data
The sample data collection rule below is for virtual machines with Azure Monitor agent and has the following details:
- Performance data
- Collects specific Processor, Memory, Logical Disk, and Physical Disk counters every 15 seconds and uploads every minute.
- Collects specific Process counters every 30 seconds and uploads every 5 minutes.
- Windows events
- Collects Windows security events and uploads every minute.
- Collects critical, error, and warning events from the Windows Application and System logs and uploads every 5 minutes.
- Syslog
- Collects Debug, Critical, and Emergency events from the cron facility.
- Collects Alert, Critical, and Emergency events from the syslog facility.
- Destinations
- Sends all data to a Log Analytics workspace named centralWorkspace.
Note
For an explanation of XPaths that are used to specify event collection in data collection rules, see Limit data collection with custom XPath queries.
{
"location": "eastus",
"properties": {
"dataSources": {
"performanceCounters": [
{
"name": "cloudTeamCoreCounters",
"streams": [
"Microsoft-Perf"
],
"scheduledTransferPeriod": "PT1M",
"samplingFrequencyInSeconds": 15,
"counterSpecifiers": [
"\\Processor(_Total)\\% Processor Time",
"\\Memory\\Committed Bytes",
"\\LogicalDisk(_Total)\\Free Megabytes",
"\\PhysicalDisk(_Total)\\Avg. Disk Queue Length"
]
},
{
"name": "appTeamExtraCounters",
"streams": [
"Microsoft-Perf"
],
"scheduledTransferPeriod": "PT5M",
"samplingFrequencyInSeconds": 30,
"counterSpecifiers": [
"\\Process(_Total)\\Thread Count"
]
}
],
"windowsEventLogs": [
{
"name": "cloudSecurityTeamEvents",
"streams": [
"Microsoft-Event"
],
"scheduledTransferPeriod": "PT1M",
"xPathQueries": [
"Security!*"
]
},
{
"name": "appTeam1AppEvents",
"streams": [
"Microsoft-Event"
],
"scheduledTransferPeriod": "PT5M",
"xPathQueries": [
"System!*[System[(Level = 1 or Level = 2 or Level = 3)]]",
"Application!*[System[(Level = 1 or Level = 2 or Level = 3)]]"
]
}
],
"syslog": [
{
"name": "cronSyslog",
"streams": [
"Microsoft-Syslog"
],
"facilityNames": [
"cron"
],
"logLevels": [
"Debug",
"Critical",
"Emergency"
]
},
{
"name": "syslogBase",
"streams": [
"Microsoft-Syslog"
],
"facilityNames": [
"syslog"
],
"logLevels": [
"Alert",
"Critical",
"Emergency"
]
}
]
},
"destinations": {
"logAnalytics": [
{
"workspaceResourceId": "/subscriptions/00000000-0000-0000-0000-000000000000/resourceGroups/my-resource-group/providers/Microsoft.OperationalInsights/workspaces/my-workspace",
"name": "centralWorkspace"
}
]
},
"dataFlows": [
{
"streams": [
"Microsoft-Perf",
"Microsoft-Syslog",
"Microsoft-Event"
],
"destinations": [
"centralWorkspace"
]
}
]
}
}
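To create a DCR from this JSON with the REST API, send it in a PUT request to the dataCollectionRules resource path. The following Python sketch, using the azure-identity and requests packages, shows one way to do that; the subscription ID, resource group, rule name, and local file name are placeholders.
import json

import requests
from azure.identity import DefaultAzureCredential

# Placeholder values: replace with your own subscription, resource group, and DCR name.
subscription_id = "00000000-0000-0000-0000-000000000000"
resource_group = "my-resource-group"
dcr_name = "my-agent-dcr"

# Acquire an Azure Resource Manager token for the signed-in identity.
token = DefaultAzureCredential().get_token("https://management.azure.com/.default").token

url = (
    f"https://management.azure.com/subscriptions/{subscription_id}"
    f"/resourceGroups/{resource_group}"
    f"/providers/Microsoft.Insights/dataCollectionRules/{dcr_name}"
    "?api-version=2022-06-01"
)

# The request body is the sample DCR JSON above, saved to a local file (hypothetical name).
with open("dcr-agent-sample.json") as f:
    dcr_body = json.load(f)

response = requests.put(
    url,
    headers={"Authorization": f"Bearer {token}"},
    json=dcr_body,
)
response.raise_for_status()
print(response.json()["properties"].get("provisioningState"))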
Azure Monitor agent - text logs
The sample data collection rule below is used to collect text logs by using Azure Monitor agent. The names of streams for custom text logs must begin with the prefix Custom-.
{
"location": "eastus",
"properties": {
"streamDeclarations": {
"Custom-MyLogFileFormat": {
"columns": [
{
"name": "TimeGenerated",
"type": "datetime"
},
{
"name": "RawData",
"type": "string"
}
]
}
},
"dataSources": {
"logFiles": [
{
"streams": [
"Custom-MyLogFileFormat"
],
"filePatterns": [
"C:\\JavaLogs\\*.log"
],
"format": "text",
"settings": {
"text": {
"recordStartTimestampFormat": "ISO 8601"
}
},
"name": "myLogFileFormat-Windows"
},
{
"streams": [
"Custom-MyLogFileFormat"
],
"filePatterns": [
"//var//*.log"
],
"format": "text",
"settings": {
"text": {
"recordStartTimestampFormat": "ISO 8601"
}
},
"name": "myLogFileFormat-Linux"
}
]
},
"destinations": {
"logAnalytics": [
{
"workspaceResourceId": "/subscriptions/00000000-0000-0000-0000-000000000000/resourceGroups/my-resource-group/providers/Microsoft.OperationalInsights/workspaces/my-workspace",
"name": "MyDestination"
}
]
},
"dataFlows": [
{
"streams": [
"Custom-MyLogFileFormat"
],
"destinations": [
"MyDestination"
],
"transformKql": "source",
"outputStream": "Custom-MyTable_CL"
}
]
}
}
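A DCR collects text logs only from machines that it's associated with. The following sketch creates a data collection rule association (DCRA) on a virtual machine by calling the REST API from Python; the VM resource ID, DCR resource ID, and association name are placeholders.
import requests
from azure.identity import DefaultAzureCredential

# Placeholder resource IDs: the virtual machine to collect from and the text-log DCR above.
vm_resource_id = (
    "/subscriptions/00000000-0000-0000-0000-000000000000/resourceGroups/my-resource-group"
    "/providers/Microsoft.Compute/virtualMachines/my-vm"
)
dcr_resource_id = (
    "/subscriptions/00000000-0000-0000-0000-000000000000/resourceGroups/my-resource-group"
    "/providers/Microsoft.Insights/dataCollectionRules/my-text-log-dcr"
)

token = DefaultAzureCredential().get_token("https://management.azure.com/.default").token

# The association is an extension resource created on the virtual machine itself.
url = (
    f"https://management.azure.com{vm_resource_id}"
    "/providers/Microsoft.Insights/dataCollectionRuleAssociations/my-dcr-association"
    "?api-version=2022-06-01"
)

response = requests.put(
    url,
    headers={"Authorization": f"Bearer {token}"},
    json={"properties": {"dataCollectionRuleId": dcr_resource_id}},
)
response.raise_for_status()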
Event Hubs
The sample data collection rule below is used to collect data from an event hub.
{
"location": "eastus",
"properties": {
"streamDeclarations": {
"Custom-MyEventHubStream": {
"columns": [
{
"name": "TimeGenerated",
"type": "datetime"
},
{
"name": "RawData",
"type": "string"
},
{
"name": "Properties",
"type": "dynamic"
}
]
}
},
"dataSources": {
"dataImports": {
"eventHub": {
"consumerGroup": "<consumer-group>",
"stream": "Custom-MyEventHubStream",
"name": "myEventHubDataSource1"
}
}
},
"destinations": {
"logAnalytics": [
{
"workspaceResourceId": "/subscriptions/00000000-0000-0000-0000-000000000000/resourceGroups/my-resource-group/providers/Microsoft.OperationalInsights/workspaces/my-workspace",
"name": "MyDestination"
}
]
},
"dataFlows": [
{
"streams": [
"Custom-MyEventHubStream"
],
"destinations": [
"MyDestination"
],
"transformKql": "source",
"outputStream": "Custom-MyTable_CL"
}
]
}
}
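The data flow in this sample writes to a custom table named MyTable_CL, which must already exist in the workspace with columns that match the output of the transformation. The following sketch creates that table with the Log Analytics tables API; the workspace resource ID is a placeholder, and the schema shown (TimeGenerated, RawData, Properties) is assumed from the stream declaration above because the transformation passes records through unchanged.
import requests
from azure.identity import DefaultAzureCredential

# Placeholder workspace resource ID.
workspace_resource_id = (
    "/subscriptions/00000000-0000-0000-0000-000000000000/resourceGroups/my-resource-group"
    "/providers/Microsoft.OperationalInsights/workspaces/my-workspace"
)

token = DefaultAzureCredential().get_token("https://management.azure.com/.default").token

url = (
    f"https://management.azure.com{workspace_resource_id}"
    "/tables/MyTable_CL?api-version=2022-10-01"
)

# Columns mirror the output of the "source" transformation on Custom-MyEventHubStream.
table = {
    "properties": {
        "schema": {
            "name": "MyTable_CL",
            "columns": [
                {"name": "TimeGenerated", "type": "datetime"},
                {"name": "RawData", "type": "string"},
                {"name": "Properties", "type": "dynamic"},
            ],
        }
    }
}

response = requests.put(url, headers={"Authorization": f"Bearer {token}"}, json=table)
response.raise_for_status()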
Logs ingestion API
The sample data collection rule below is used with the Logs ingestion API. It has the following details:
- Sends data to a table called MyTable_CL in a workspace called my-workspace.
- Applies a transformation to the incoming data.
Note
The Logs ingestion API requires the logsIngestion property, which includes the URL of the endpoint. This property is added to the DCR after it's created.
{
"location": "eastus",
"properties": {
"streamDeclarations": {
"Custom-MyTable": {
"columns": [
{
"name": "Time",
"type": "datetime"
},
{
"name": "Computer",
"type": "string"
},
{
"name": "AdditionalContext",
"type": "string"
}
]
}
},
"destinations": {
"logAnalytics": [
{
"workspaceResourceId": "/subscriptions/00000000-0000-0000-0000-000000000000/resourceGroups/cefingestion/providers/microsoft.operationalinsights/workspaces/my-workspace",
"name": "LogAnalyticsDest"
}
]
},
"dataFlows": [
{
"streams": [
"Custom-MyTable"
],
"destinations": [
"LogAnalyticsDest"
],
"transformKql": "source | extend jsonContext = parse_json(AdditionalContext) | project TimeGenerated = Time, Computer, AdditionalContext = jsonContext, ExtendedColumn=tostring(jsonContext.CounterName)",
"outputStream": "Custom-MyTable_CL"
}
]
}
}
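After the DCR is created, you can send data to it with one of the logs ingestion client libraries. The following Python sketch uses the azure-monitor-ingestion package; the endpoint URL and DCR immutable ID are placeholders that you retrieve from the created DCR, and the sample record is made up to match the Custom-MyTable stream declaration.
from datetime import datetime, timezone

from azure.identity import DefaultAzureCredential
from azure.monitor.ingestion import LogsIngestionClient

# Placeholder values: the ingestion endpoint and the DCR immutable ID come from the created DCR.
endpoint = "https://my-endpoint.eastus-1.ingest.monitor.azure.com"
rule_id = "dcr-00000000000000000000000000000000"

client = LogsIngestionClient(endpoint=endpoint, credential=DefaultAzureCredential())

# Records must match the Custom-MyTable stream declaration: Time, Computer, AdditionalContext.
logs = [
    {
        "Time": datetime.now(timezone.utc).isoformat(),
        "Computer": "web-01",
        "AdditionalContext": '{"CounterName": "AppRequests", "CounterValue": 73}',
    }
]

# The transformKql in the data flow reshapes these records into the MyTable_CL table.
client.upload(rule_id=rule_id, stream_name="Custom-MyTable", logs=logs)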
Workspace transformation DCR
The sample data collection rule below is used as a workspace transformation DCR to transform all data sent to a table called LAQueryLogs.
{
"location": "eastus",
"properties": {
"destinations": {
"logAnalytics": [
{
"workspaceResourceId": "/subscriptions/00000000-0000-0000-0000-000000000000/resourceGroups/my-resource-group/providers/Microsoft.OperationalInsights/workspaces/my-workspace",
"name": "clv2ws1"
}
]
},
"dataFlows": [
{
"streams": [
"Microsoft-Table-LAQueryLogs"
],
"destinations": [
"clv2ws1"
],
"transformKql": "source |where QueryText !contains 'LAQueryLogs' | extend Context = parse_json(RequestContext) | extend Resources_CF = tostring(Context['workspaces']) |extend RequestContext = ''"
}
]
}
}
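A workspace transformation DCR takes effect only after it's linked to the workspace. The following sketch links it by patching the workspace's defaultDataCollectionRuleResourceId property through the REST API; the resource IDs are placeholders, and the property name and API version are assumptions to confirm against the current workspace API reference.
import requests
from azure.identity import DefaultAzureCredential

# Placeholder resource IDs.
workspace_resource_id = (
    "/subscriptions/00000000-0000-0000-0000-000000000000/resourceGroups/my-resource-group"
    "/providers/Microsoft.OperationalInsights/workspaces/my-workspace"
)
transformation_dcr_id = (
    "/subscriptions/00000000-0000-0000-0000-000000000000/resourceGroups/my-resource-group"
    "/providers/Microsoft.Insights/dataCollectionRules/my-workspace-transformation-dcr"
)

token = DefaultAzureCredential().get_token("https://management.azure.com/.default").token

url = f"https://management.azure.com{workspace_resource_id}?api-version=2022-10-01"

# Point the workspace's default DCR at the transformation DCR so the transformation applies.
response = requests.patch(
    url,
    headers={"Authorization": f"Bearer {token}"},
    json={"properties": {"defaultDataCollectionRuleResourceId": transformation_dcr_id}},
)
response.raise_for_status()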