Many applications and services will log information to text files instead of standard logging services such as Windows Event log or Syslog. This data can be collected by Azure Monitor using the Custom Text Logs data source in a data collection rule (DCR). Details for the creation of the DCR are provided in Collect data with Azure Monitor Agent. This article provides additional details for the custom text logs type.
Warning
You shouldn't use an existing custom table that the Log Analytics agent writes to. The legacy agent won't be able to write to the table once the first Azure Monitor agent record is written to it. Create a new table for the Azure Monitor agent to use, to prevent Log Analytics agent data loss.
The file that Azure Monitor collects must meet the following requirements:
Adhere to the following recommendations to ensure that you don't experience data loss or performance issues:
The agent watches for any log files on the local disk that match the specified name pattern. Each entry is collected as it's written to the log and sent to the specified table in a Log Analytics workspace. The custom table in the Log Analytics workspace that will receive the data must exist before you create the DCR.
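The watching behavior described above can be sketched in Python. This is an illustrative stand-in for what the agent does, not its actual implementation; the pattern matching and per-file offset tracking are simplified assumptions.

```python
import glob

# Illustrative sketch of tail-style collection: find files matching a name
# pattern and return each complete line appended since the previous call.
# The real agent's internals aren't published; treat this as a model only.
def collect_new_lines(pattern, offsets):
    """Return (file_path, line) pairs for lines appended since the last call.

    offsets is a dict mapping file path -> byte position already processed.
    """
    results = []
    for path in glob.glob(pattern):
        with open(path, "r") as f:
            f.seek(offsets.get(path, 0))
            while True:
                pos = f.tell()
                line = f.readline()
                if not line:
                    break
                if not line.endswith("\n"):
                    # Partial write: rewind so the entry is read once complete.
                    f.seek(pos)
                    break
                results.append((path, line.rstrip("\n")))
            offsets[path] = f.tell()
    return results
```

Calling this repeatedly with the same `offsets` dict returns only entries written since the previous call, which mirrors the "each entry is collected as it's written" behavior.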
The following table describes the required and optional columns in the table. The table can include other columns, but they won't be populated unless you parse the data with a transformation as described in Delimited log files.
Column | Type | Required? | Description |
---|---|---|---|
TimeGenerated | datetime | Yes | This column contains the time the record was generated and is required in all tables. This value is automatically populated with the time the record is added to the Log Analytics workspace. You can override this value by using a transformation to set TimeGenerated to a value from the log entry. |
RawData | string | Yes¹ | The entire log entry in a single column. You can use a transformation if you want to break down this data into multiple columns before sending it to the table. |
Computer | string | No | If the table includes this column, it's populated with the name of the computer the log entry was collected from. |
FilePath | string | No | If the table includes this column, it's populated with the path to the log file the log entry was collected from. |

¹ The table doesn't have to include a RawData column if you use a transformation to parse the data into multiple columns.
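As an illustration of the schema above, the following Python sketch builds one record in the shape the agent would send to the custom table when no transformation is applied. The values are stand-ins; this code doesn't run in Azure.

```python
import socket
from datetime import datetime, timezone

# Illustrative only: the default record shape for a collected log entry.
# Column names match the table described above.
def to_record(line, file_path):
    return {
        "TimeGenerated": datetime.now(timezone.utc).isoformat(),  # ingestion time by default
        "RawData": line,                       # entire log entry in a single column
        "Computer": socket.gethostname(),      # machine the entry was collected from
        "FilePath": file_path,                 # log file the entry was collected from
    }
```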
For example, consider a text file with the following data.
```
2024-06-21 19:17:34,1423,Error,Sales,Unable to connect to pricing service.
2024-06-21 19:18:23,1420,Information,Sales,Pricing service connection established.
2024-06-21 21:45:13,2011,Warning,Procurement,Module failed and was restarted.
2024-06-21 23:53:31,4100,Information,Data,Nightly backup complete.
```
When collected using default settings, this data would appear as follows when retrieved with a log query.
Create a DCR, as described in Collect data with Azure Monitor Agent. In the Collect and deliver step, select Custom Text Logs from the Data source type dropdown.
The options available in the Custom Text Logs configuration are described in the following table.
Setting | Description |
---|---|
File pattern | Identifies the location and name of log files on the local disk. Use a wildcard for filenames that vary, for example when a new file is created each day with a new name. You can enter multiple file patterns separated by commas.<br><br>Examples:<br>- C:\Logs\MyLog.txt<br>- C:\Logs\MyLog*.txt<br>- C:\App01\AppLog.txt, C:\App02\AppLog.txt<br>- /var/mylog.log<br>- /var/mylog*.log |
Table name | Name of the destination table in your Log Analytics workspace. This table must already exist. |
Record delimiter | Indicates the delimiter between log entries. TimeStamp is the only currently allowed value. It looks for a date in the format specified in timeFormat to identify the start of a new record. If no date in the specified format is found, end of line is used. |
timeFormat | The following time formats are supported:<br>- yyyy-MM-ddTHH:mm:ssk (2024-10-29T18:28:34)<br>- YYYY-MM-DD HH:MM:SS (2024-10-29 18:28:34)<br>- M/D/YYYY HH:MM:SS AM/PM (10/29/2024 06:28:34 PM)<br>- Mon DD, YYYY HH:MM:SS (October 29, 2024 18:28:34)<br>- yyMMdd HH:mm:ss (241029 18:28:34)<br>- ddMMyy HH:mm:ss (291024 18:28:34)<br>- MMM d HH:mm:ss (Oct 29 18:28:34)<br>- dd/MMM/yyyy:HH:mm:ss zzz (14/Oct/2024:18:28:34 -00) |
Transform | Ingestion-time transformation to filter records or to format the incoming data for the destination table. Use source to leave the incoming data unchanged and send it to the RawData column. |
Many text log files have entries with columns delimited by a character such as a comma. Instead of sending the entire entry to the RawData column, you can parse the data into separate columns so that each can be populated in the destination table. Use a transformation with the split function to perform this parsing.

For example, consider a text file with the following comma-delimited data. These fields could be described as: Time, Code, Severity, Module, and Message.
```
2024-06-21 19:17:34,1423,Error,Sales,Unable to connect to pricing service.
2024-06-21 19:18:23,1420,Information,Sales,Pricing service connection established.
2024-06-21 21:45:13,2011,Warning,Procurement,Module failed and was restarted.
2024-06-21 23:53:31,4100,Information,Data,Nightly backup complete.
```
The following transformation parses the data into separate columns. Because split returns dynamic data, you must use functions such as tostring and toint to convert the data to the correct scalar type. You also need to provide a name for each entry that matches the column name in the target table. Note that this example provides a TimeGenerated value; if it didn't, the ingestion time would be used.

```kusto
source
| project d = split(RawData, ",")
| project TimeGenerated = todatetime(d[0]), Code = toint(d[1]), Severity = tostring(d[2]), Module = tostring(d[3]), Message = tostring(d[4])
```
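For readers less familiar with KQL, the effect of the transformation on a single entry can be sketched in Python. This is purely illustrative; the real transformation runs in the Azure Monitor ingestion pipeline, not on the agent.

```python
from datetime import datetime

# Python sketch of the KQL split transformation applied to one log entry.
# Keys match the destination table's column names.
def transform(raw_data):
    d = raw_data.split(",", 4)  # at most 5 fields; commas in Message are preserved
    return {
        "TimeGenerated": datetime.strptime(d[0], "%Y-%m-%d %H:%M:%S"),  # todatetime(d[0])
        "Code": int(d[1]),       # toint(d[1])
        "Severity": d[2],        # tostring(d[2])
        "Module": d[3],          # tostring(d[3])
        "Message": d[4],         # tostring(d[4])
    }
```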
Retrieving this data with a log query would return the following results.
If you aren't collecting the data that you expect from the text log, go through the following steps.
Learn more about: