What is the $logs folder in a storage account, and does Log Analytics use this $logs folder?

Rohit Boddu 461 Reputation points

Hi Team,

I want to know the difference between diagnostic settings (preview) and classic diagnostic settings.

When we enable classic diagnostic settings, a $logs container gets created. Is there any dependency between this $logs folder and a Log Analytics workspace?

Where does a Log Analytics workspace store all its log data? Does it use our storage account, or is the storage provided from Microsoft's side?

What is the use of enabling classic diagnostics?


1 answer

  1. Sumarigo-MSFT 42,516 Reputation points Microsoft Employee

    @Rohit Boddu Firstly, apologies for the delay in responding here and any inconvenience this issue may have caused.

    • All logs are stored in block blobs in a container named $logs, which is automatically created when Storage Analytics is enabled for a storage account. The $logs container is located in the blob namespace of the storage account, for example: http://<accountname>.blob.core.windows.net/$logs. This container cannot be deleted once Storage Analytics has been enabled, though its contents can be deleted. If you use your storage-browsing tool to navigate to the container directly, you will see all the blobs that contain your logging data.
    • The $logs container is not displayed when a container listing operation is performed, such as the List Containers operation. It must be accessed directly. For example, you can use the List Blobs operation to access the blobs in the $logs container.
    • As requests are logged, Storage Analytics will upload intermediate results as blocks. Periodically, Storage Analytics will commit these blocks and make them available as a blob. It can take up to an hour for log data to appear in the blobs in the $logs container because of the frequency with which the storage service flushes the log writers. Duplicate records may exist for logs created in the same hour. You can determine whether a record is a duplicate by checking its RequestId and Operation number.
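The duplicate check described in the last bullet can be sketched as follows. This is a minimal illustration, not the real parsing step: actual $logs entries are semicolon-delimited text lines, and the dict shape below is a hypothetical stand-in for records you have already parsed.

```python
def dedupe_log_records(records):
    """Drop duplicate Storage Analytics log records.

    Logs created in the same hour may contain duplicates; a record is
    identified by the pair (RequestId, Operation number). `records` is
    an iterable of dicts with "RequestId" and "Operation" keys -- a
    hypothetical, already-parsed shape, not the raw $logs line format.
    """
    seen = set()
    unique = []
    for rec in records:
        key = (rec["RequestId"], rec["Operation"])
        if key not in seen:
            seen.add(key)
            unique.append(rec)
    return unique
```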

    1. What is the use of enabling classic diagnostics? You can collect diagnostic data such as application logs, performance counters, etc. from a Cloud Service using the Azure Diagnostics extension.

    2. Where is Log Analytics data stored? Data is stored in the region where the Log Analytics workspace is located. There is no way to access the data other than by executing a query.
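Since the data is reachable only by running queries against the workspace, here is a minimal sketch of doing that programmatically, assuming the azure-monitor-query and azure-identity Python packages; the workspace ID placeholder and the StorageBlobLogs table name are assumptions to substitute for your own environment. The network call is wrapped in a function and not invoked here.

```python
from datetime import timedelta


def build_kql(table: str, limit: int) -> str:
    """Build a minimal KQL query string (illustrative only)."""
    return f"{table} | take {limit}"


def run_query(workspace_id: str):
    """Query a Log Analytics workspace.

    Requires the azure-monitor-query and azure-identity packages and
    credentials with access to the workspace; the table name below is
    an assumption for storage diagnostics.
    """
    from azure.identity import DefaultAzureCredential
    from azure.monitor.query import LogsQueryClient

    client = LogsQueryClient(DefaultAzureCredential())
    return client.query_workspace(
        workspace_id=workspace_id,
        query=build_kql("StorageBlobLogs", 10),
        timespan=timedelta(days=1),
    )
```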

    3. The $logs folder and the Log Analytics workspace: Log Analytics is a tool in the Azure portal used to edit and run log queries against data collected by Azure Monitor Logs and to interactively analyze the results.

    • Metrics are converted to log form. This option may not be available for all resource types. Sending them to the Azure Monitor Logs store (which is searchable via Log Analytics) lets you integrate them into queries, alerts, and visualizations alongside existing log data.
    • By default, only the metrics emitted by the virtualization host are available. To see more metrics and collect more information, you need to install agents such as:
      Log Analytics agent: collects logs and sends data to a Log Analytics workspace.
      Dependency agent: collects data about the processes running on the virtual machine and their dependencies.
      Azure Diagnostics extension: collects guest performance data, such as memory metrics.
      Telegraf agent: collects performance data from Linux VMs.

    Please let us know if you have any further queries. I’m happy to assist you further.


    Please do not forget to “up-vote” wherever the information provided helps you, as this can be beneficial to other community members.
