Whenever you update a table schema, be sure to update any data collection rules that send data to the table. The table schema you define in your data collection rule determines how Azure Monitor streams data to the destination table. Azure Monitor does not update data collection rules automatically when you make table schema changes.
All tables in a Log Analytics workspace must have a column named TimeGenerated. If your sample data has a column named TimeGenerated, Azure Monitor uses that value to identify the ingestion time of the record. If not, a TimeGenerated column is added to the transformation in your DCR for the table. For information about the TimeGenerated format, see supported datetime formats.
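For illustration only, the transformation added in that case is conceptually similar to the following KQL, shown here as a PowerShell variable because the DCR stores the transformation as a string in its transformKql property. The exact query generated for your DCR may differ.

```powershell
# Illustration only: a transformation that stamps each incoming record with
# the current time because the source data has no TimeGenerated column.
# The query the portal generates for your table may differ.
$transformKql = 'source | extend TimeGenerated = now()'
```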
Create a custom table
Azure tables have predefined schemas. To store log data in a different schema, use data collection rules to define how to collect, transform, and send the data to a custom table in your Log Analytics workspace. To create a custom table with the Auxiliary plan, see Set up a table with the Auxiliary plan (Preview).
Important
Custom tables have a suffix of _CL; for example, tablename_CL. The Azure portal adds the _CL suffix to the table name automatically. When you create a custom table using a different method, you need to add the _CL suffix yourself. The tablename_CL in the DataFlows Streams properties in your data collection rules must match the tablename_CL name in the Log Analytics workspace.
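As a rough sketch of that relationship, a dataFlows entry in a DCR that sends data to a custom table named MyTable_CL might look like the following fragment, shown as a PowerShell here-string. The stream name, destination name, and overall layout are assumptions based on the Logs Ingestion API scenario and may differ in your rule.

```powershell
# Hypothetical dataFlows fragment from a DCR. The Custom-MyTable_CL stream and
# the destination name are placeholders; what matters is that the table name
# after "Custom-" matches the MyTable_CL table in the Log Analytics workspace.
$dataFlows = @'
"dataFlows": [
  {
    "streams": [ "Custom-MyTable_CL" ],
    "destinations": [ "myLogAnalyticsWorkspace" ],
    "transformKql": "source",
    "outputStream": "Custom-MyTable_CL"
  }
]
'@
```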
Warning
Table names are used for billing purposes so they should not contain sensitive information.
From the Log Analytics workspaces menu, select Tables.
Select Create and then New custom log (DCR-based).
Specify a name and, optionally, a description for the table. You don't need to add the _CL suffix to the custom table's name; the portal adds it automatically to the name you specify.
Select an existing data collection rule from the Data collection rule dropdown, or select Create a new data collection rule and specify the Subscription, Resource group, and Name for the new data collection rule.
The transformation editor lets you create a transformation for the incoming data stream. This is a KQL query that runs against each incoming record. Azure Monitor Logs stores the results of the query in the destination table. For a sample transformation, see the sketch after these steps.
Select Run to view the results.
Select Apply to save the transformation and view the schema of the table that's about to be created. Select Next to proceed.
Verify the final details and select Create to save the custom log.
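To make the transformation step concrete, here's a minimal sketch of the kind of query you might enter in the editor, assuming a hypothetical incoming stream with a single comma-delimited RawData column. It's wrapped in a PowerShell here-string so you can also reuse it later from Cloud Shell.

```powershell
# Illustrative transformation only; the RawData, AppName, and Message columns
# are hypothetical. The query splits the raw value into two columns and adds
# a TimeGenerated value for each record.
$transformKql = @'
source
| extend TimeGenerated = now()
| parse RawData with AppName ',' Message
| project TimeGenerated, AppName, Message
'@
```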
Use the Tables - Update PATCH API to create a custom table with the PowerShell code below. This code creates a table called MyTable_CL with two columns. Modify this schema as needed to collect data for a different table.
Select the Cloud Shell button in the Azure portal and ensure the environment is set to PowerShell.
Copy the following PowerShell code and replace the Path parameter with the appropriate values for your workspace in the Invoke-AzRestMethod command. Paste it into the Cloud Shell prompt to run it.
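As a minimal sketch of such a call (not the exact code from this article), the following creates a MyTable_CL table with two columns by calling the Tables - Create Or Update operation with PUT; if you follow the Tables - Update (PATCH) route described above, change the method accordingly. The placeholders in -Path and the api-version are assumptions to verify against your environment and the current Tables API reference.

```powershell
# Define the schema for the custom table. The table name MyTable_CL and its
# two columns are illustrative; adjust them to the data you plan to collect.
$tableParams = @'
{
  "properties": {
    "schema": {
      "name": "MyTable_CL",
      "columns": [
        { "name": "TimeGenerated", "type": "datetime" },
        { "name": "RawData", "type": "string" }
      ]
    }
  }
}
'@

# Replace the subscription ID, resource group, and workspace placeholders in
# -Path with your own values, and confirm the api-version against the current
# Tables API reference before running.
Invoke-AzRestMethod `
  -Path "/subscriptions/{subscriptionId}/resourceGroups/{resourceGroup}/providers/Microsoft.OperationalInsights/workspaces/{workspace}/tables/MyTable_CL?api-version=2022-10-01" `
  -Method PUT `
  -Payload $tableParams
```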
Azure Monitor Logs has several types of tables. You can delete any table that isn't an Azure table, but what happens to the data when you delete the table depends on the table type.
Select the Cloud Shell button in the Azure portal and ensure the environment is set to PowerShell.
Copy the following PowerShell code and replace the Path parameter with the appropriate values for your workspace in the Invoke-AzRestMethod command. Paste it into the Cloud Shell prompt to run it.
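A sketch of the delete call, under the same assumptions about placeholders and api-version as the create example above; the table name MyTable_CL is illustrative.

```powershell
# Delete a table by calling the Tables - Delete operation. Replace the
# placeholders in -Path with your own subscription ID, resource group,
# workspace, and table name, and confirm the api-version before running.
Invoke-AzRestMethod `
  -Path "/subscriptions/{subscriptionId}/resourceGroups/{resourceGroup}/providers/Microsoft.OperationalInsights/workspaces/{workspace}/tables/MyTable_CL?api-version=2022-10-01" `
  -Method DELETE
```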
You can modify the schema of custom tables and add custom columns to, or delete columns from, a standard table.
Note
Column names must start with a letter and can consist of up to 45 alphanumeric characters and underscores (_). _ResourceId, id, _SubscriptionId, TenantId, Type, UniqueId, and Title are reserved column names.
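As a hedged sketch of a schema change made through the same API, the following PATCH call adds a column named Application to the MyTable_CL custom table created earlier. The table name, column names, and api-version are illustrative and should be checked against the Tables API reference.

```powershell
# Illustrative schema update: the payload lists the custom table's columns,
# including the new Application column. Names are hypothetical; adjust them
# to your own schema.
$tableParams = @'
{
  "properties": {
    "schema": {
      "name": "MyTable_CL",
      "columns": [
        { "name": "TimeGenerated", "type": "datetime" },
        { "name": "RawData", "type": "string" },
        { "name": "Application", "type": "string" }
      ]
    }
  }
}
'@

# Replace the placeholders in -Path with your own values before running.
Invoke-AzRestMethod `
  -Path "/subscriptions/{subscriptionId}/resourceGroups/{resourceGroup}/providers/Microsoft.OperationalInsights/workspaces/{workspace}/tables/MyTable_CL?api-version=2022-10-01" `
  -Method PATCH `
  -Payload $tableParams
```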