Custom Text Logs is one of the data sources used in a data collection rule (DCR). Details for the creation of the DCR are provided in Collect data with Azure Monitor Agent. This article provides additional details for the text logs type.
Many applications and services will log information to text files instead of standard logging services such as Windows Event log or Syslog. This data can be collected with Azure Monitor Agent and stored in a Log Analytics workspace with data collected from other sources.
The following diagram shows the basic operation of collecting log data from a text file.
The file that the Azure Monitor Agent is monitoring must meet the following requirements:
Adhere to the following recommendations to ensure that you don't experience data loss or performance issues:
Note
Multiline support that uses a time stamp to delimit events is now available. Until support is added in the portal UI, you must deploy the data collection rule with a Resource Manager template.
The incoming stream of data includes the columns in the following table.
| Column | Type | Description |
|---|---|---|
| `TimeGenerated` | datetime | The time the record was generated. This value is automatically populated with the time the record is added to the Log Analytics workspace. You can override this value by using a transformation to set `TimeGenerated` to another value. |
| `RawData` | string | The entire log entry in a single column. You can use a transformation if you want to break down this data into multiple columns before sending it to the table. |
| `FilePath` | string | If you add this column to the incoming stream in the DCR, it's populated with the path to the log file. This column isn't created automatically and can't be added by using the portal. You must manually modify the DCR created by the portal or create the DCR by using another method where you can explicitly define the incoming stream. |
| `Computer` | string | If you add this column to the incoming stream in the DCR, it's populated with the name of the computer with the log file. This column isn't created automatically and can't be added by using the portal. You must manually modify the DCR created by the portal or create the DCR by using another method where you can explicitly define the incoming stream. |
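As a sketch of what that manual modification might look like, the following stream declaration in a DCR adds the optional `FilePath` and `Computer` columns alongside the default ones. The stream name `Custom-MyLogFileFormat` is a placeholder; substitute the stream name used by your DCR.

```json
"streamDeclarations": {
    "Custom-MyLogFileFormat": {
        "columns": [
            { "name": "TimeGenerated", "type": "datetime" },
            { "name": "RawData", "type": "string" },
            { "name": "FilePath", "type": "string" },
            { "name": "Computer", "type": "string" }
        ]
    }
}
```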
Before you can collect log data from a text file, you must create a custom table in your Log Analytics workspace to receive the data. The table schema must match the data you are collecting, or you must add a transformation to ensure that the output schema matches the table.
Warning
To avoid data loss, don't use an existing custom log table that MMA agents are currently writing to. Once any AMA agent writes to an existing custom log table, MMA agents are no longer able to write to that table. Instead, create a new table specifically for AMA agents to ensure a smooth transition from one agent to the next.
For example, you can use the following PowerShell script to create a custom table with the `RawData`, `FilePath`, and `Computer` columns. You wouldn't need a transformation for this table because the schema matches the default schema of the incoming stream.
```powershell
$tableParams = @'
{
    "properties": {
        "schema": {
            "name": "{TableName}_CL",
            "columns": [
                { "name": "TimeGenerated", "type": "DateTime" },
                { "name": "RawData", "type": "String" },
                { "name": "FilePath", "type": "String" },
                { "name": "Computer", "type": "String" }
            ]
        }
    }
}
'@

Invoke-AzRestMethod -Path "/subscriptions/{subscription}/resourcegroups/{resourcegroup}/providers/microsoft.operationalinsights/workspaces/{WorkspaceName}/tables/{TableName}_CL?api-version=2021-12-01-preview" -Method PUT -payload $tableParams
```
Create a data collection rule, as described in Collect data with Azure Monitor Agent. In the Collect and deliver step, select Custom Text Logs from the Data source type dropdown.
| Setting | Description |
|---|---|
| File pattern | Identifies the location and name of log files on the local disk. Use a wildcard for file names that vary, for example, when a new file is created each day with a new name. You can enter multiple file patterns separated by commas. Examples:<br>- `C:\Logs\MyLog.txt`<br>- `C:\Logs\MyLog*.txt`<br>- `C:\App01\AppLog.txt, C:\App02\AppLog.txt`<br>- `/var/mylog.log`<br>- `/var/mylog*.log` |
| Table name | Name of the destination table in your Log Analytics workspace. |
| Record delimiter | Not currently used, but reserved for potential future use allowing delimiters other than the currently supported end of line (`\r\n`). |
| Transform | Ingestion-time transformation to filter records or to format the incoming data for the destination table. Use `source` to leave the incoming data unchanged. |
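If you create the DCR outside the portal, these settings correspond to a `logFiles` data source in the rule definition. The following fragment is a minimal sketch; the stream name `Custom-MyLogFileFormat`, the file pattern, and the data source name `myLogFileDataSource` are placeholder assumptions to adapt to your environment.

```json
"dataSources": {
    "logFiles": [
        {
            "streams": [ "Custom-MyLogFileFormat" ],
            "filePatterns": [ "C:\\Logs\\MyLog*.txt" ],
            "format": "text",
            "name": "myLogFileDataSource"
        }
    ]
}
```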
Many text log files have entries that are delimited by a character such as a comma. To parse this data into separate columns, use a transformation with the split function.
For example, consider a text file with the following comma-delimited data. These fields could be described as `Time`, `Code`, `Severity`, `Module`, and `Message`.
```
2024-06-21 19:17:34,1423,Error,Sales,Unable to connect to pricing service.
2024-06-21 19:18:23,1420,Information,Sales,Pricing service connection established.
2024-06-21 21:45:13,2011,Warning,Procurement,Module failed and was restarted.
2024-06-21 23:53:31,4100,Information,Data,Nightly backup complete.
```
The following transformation parses the data into separate columns. Because `split` returns dynamic data, you must use functions such as `tostring` and `toint` to convert the data to the correct scalar type. You also need to provide a name for each entry that matches the column name in the target table. Note that this example provides a `TimeGenerated` value; if it weren't provided, the ingestion time would be used.
source | project d = split(RawData,",") | project TimeGenerated=todatetime(d[0]), Code=toint(d[1]), Severity=tostring(d[2]), Module=tostring(d[3]), Message=tostring(d[4])
Retrieving this data with a log query then returns each field in its own column rather than a single `RawData` string.
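For instance, a query against the destination table can use the parsed columns directly. This is a sketch; `MyTable_CL` is a hypothetical table name, so substitute your own `{TableName}_CL`.

```kusto
// MyTable_CL is a placeholder for your custom table name.
MyTable_CL
| where TimeGenerated > ago(24h)
| where Severity == "Error"
| project TimeGenerated, Code, Module, Message
```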