Many applications and services log information to text files instead of standard logging services such as Windows Event log or Syslog. If this data is stored in JSON format, it can be collected by Azure Monitor using the Custom JSON Logs data source in a data collection rule (DCR). Details on creating the DCR are provided in Collect data with Azure Monitor Agent. This article provides additional details for the JSON logs type.
Warning

You shouldn't use an existing custom table that the Log Analytics agent writes to. The legacy agents won't be able to write to the table once the first Azure Monitor agent writes to it. Create a new table for the Azure Monitor agent to use to prevent Log Analytics agent data loss.
The file that the Azure Monitor agent collects must be stored on the local disk of the machine that's being monitored, and each entry must be a complete JSON object on a single line; multi-line (pretty-printed) JSON bodies aren't supported. To avoid data loss or performance issues, append new records to the end of the file instead of overwriting earlier ones, and clean up old log files regularly.
Following is a sample of a typical JSON log file that can be collected by Azure Monitor. This includes the fields `TimeGenerated`, `Code`, `Severity`, `Module`, and `Message`.
{"TimeGenerated":"2024-06-21 19:17:34","Code":"1423","Severity":"Error","Module":"Sales","Message":"Unable to connect to pricing service."}
{"TimeGenerated":"2024-06-21 19:18:23","Code":"1420","Severity":"Information","Module":"Sales","Message":"Pricing service connection established."}
{"TimeGenerated":"2024-06-21 21:45:13","Code":"2011","Severity":"Warning","Module":"Procurement","Message":"Module failed and was restarted."}
{"TimeGenerated":"2024-06-21 23:53:31","Code":"4100","Severity":"Information","Module":"Data","Message":"Nightly backup complete."}
The agent watches for any JSON files on the local disk that match the specified name pattern. Each entry is collected as it's written to the log and then parsed before being sent to the specified table in a Log Analytics workspace. The custom table in the Log Analytics workspace that will receive the data must exist before you create the DCR.

Any columns in the table that match the name of a field in the parsed JSON data are populated with the value from the log entry. The following table describes the required and optional columns in the table in addition to the columns identified in the JSON data.
| Column | Type | Required? | Description |
|---|---|---|---|
| `TimeGenerated` | datetime | Yes | This column contains the time the record was generated and is required in all tables. This value is automatically populated with the time the record is added to the Log Analytics workspace. You can override this value by using a transformation to set `TimeGenerated` to a value from the log entry. |
| `Computer` | string | No | If the table includes this column, it's populated with the name of the computer the log entry was collected from. |
| `FilePath` | string | No | If the table includes this column, it's populated with the path to the log file the log entry was collected from. |
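Because the destination table must exist before you create the DCR, you can create it in advance with the Log Analytics Tables API. The following request body is a minimal sketch for a hypothetical custom table named `MyJsonLog_CL` that matches the sample log entries above; the table name and column types are assumptions for illustration. Custom table names must end in `_CL`.

```json
{
  "properties": {
    "schema": {
      "name": "MyJsonLog_CL",
      "columns": [
        { "name": "TimeGenerated", "type": "datetime" },
        { "name": "Code", "type": "string" },
        { "name": "Severity", "type": "string" },
        { "name": "Module", "type": "string" },
        { "name": "Message", "type": "string" }
      ]
    }
  }
}
```

Send this body in a PUT request to the table's resource path under your workspace (for example, `.../workspaces/<workspace>/tables/MyJsonLog_CL?api-version=2022-10-01`).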
Create a data collection rule, as described in Collect data with Azure Monitor Agent. In the Collect and deliver step, select Custom JSON Logs from the Data source type dropdown.
The options available in the Custom JSON Logs configuration are described in the following table.
| Setting | Description |
|---|---|
| File pattern | Identifies the location and name of log files on the local disk. Use a wildcard for filenames that vary, for example when a new file is created each day with a new name. You can enter multiple file patterns separated by commas.<br><br>Examples:<br>- C:\Logs\MyLog.txt<br>- C:\Logs\MyLog*.txt<br>- C:\App01\AppLog.txt, C:\App02\AppLog.txt<br>- /var/mylog.log<br>- /var/mylog*.log |
| Table name | Name of the destination table in your Log Analytics workspace. |
| Transform | Ingestion-time transformation to filter records or to format the incoming data for the destination table. Use `source` to leave the incoming data unchanged. |
| JSON Schema | Columns to collect from the JSON log file and send to the destination table. `TimeGenerated` and the optional columns described earlier (`Computer`, `FilePath`) don't need to be included, since they're populated automatically. |
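If you create the DCR from an ARM template instead of the portal, these settings map to a stream declaration, a `logFiles` data source with `"format": "json"`, and a data flow. The following is a minimal sketch of those sections, reusing the hypothetical `MyJsonLog_CL` table and sample schema from above; the stream, destination, and file-pattern values are placeholders.

```json
{
  "streamDeclarations": {
    "Custom-Json-MyJsonLog": {
      "columns": [
        { "name": "TimeGenerated", "type": "datetime" },
        { "name": "Code", "type": "string" },
        { "name": "Severity", "type": "string" },
        { "name": "Module", "type": "string" },
        { "name": "Message", "type": "string" }
      ]
    }
  },
  "dataSources": {
    "logFiles": [
      {
        "streams": [ "Custom-Json-MyJsonLog" ],
        "filePatterns": [ "C:\\Logs\\MyLog*.json" ],
        "format": "json",
        "name": "myJsonLogFileDataSource"
      }
    ]
  },
  "dataFlows": [
    {
      "streams": [ "Custom-Json-MyJsonLog" ],
      "destinations": [ "myWorkspaceDestination" ],
      "transformKql": "source",
      "outputStream": "Custom-MyJsonLog_CL"
    }
  ]
}
```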
Retrieving this data with a log query returns one record per log entry, with the parsed JSON fields as columns.
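For example, a query like the following returns the error entries from the sample data, assuming the hypothetical destination table `MyJsonLog_CL` used throughout this article (the table name is an illustration, not a fixed name):

```kusto
// Return recent error entries; each JSON field appears as a column.
MyJsonLog_CL
| where TimeGenerated > ago(7d)
| where Severity == "Error"
| project TimeGenerated, Code, Module, Message
```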
The transformation potentially modifies the incoming stream to filter records or to modify the schema to match the target table. If the schema of the incoming stream is the same as the target table, you can use the default transformation of `source`. If not, modify the `transformKql` section of the ARM template with a KQL query that returns the required schema.
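For example, a transformation such as the following could drop informational entries while projecting the columns that the destination table expects. This is a sketch based on the sample schema above, not the only possible transformation; in the ARM template, the query is written as a single-line string in `transformKql`.

```kusto
// Drop informational entries and keep only the destination table's columns.
source
| where Severity != "Information"
| project TimeGenerated, Code, Severity, Module, Message
```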
If you aren't collecting the data that you expect from the JSON log, verify that the application is writing entries to the log file, that the file pattern in the DCR matches the file's location, and that the agent is sending heartbeats from the machine.
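For the heartbeat check, a query like the following confirms that the Azure Monitor agent on the machine is reporting to the workspace. This is a minimal sketch; the computer name is a placeholder.

```kusto
// Confirm that the Azure Monitor agent is sending heartbeats to the workspace.
Heartbeat
| where TimeGenerated > ago(24h)
| where Computer == "my-vm"  // placeholder: the monitored machine's name
| sort by TimeGenerated desc
```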
Learn more about Azure Monitor Agent and data collection rules.