@MrFlinstone Firstly, apologies for the delayed response!
Welcome to the Microsoft Q&A Forum, and thank you for posting your query here!
To identify storage accounts that still use classic Azure Storage analytics, you can follow these steps:
- Understand How Storage Analytics Is Enabled: Storage Analytics is enabled individually for each service you want to monitor. This can be done from the Azure portal, programmatically via the REST API, or by using the client library, through the Set Blob Service Properties, Set Queue Service Properties, Set Table Service Properties, and Set File Service Properties operations.
- Check the Status: You can check the status of Storage Analytics logs (classic) for your storage accounts, either through the Azure portal or programmatically. For more details, you can refer to the Microsoft Q&A thread listed in the references.
- Migrate Classic Storage Accounts: Microsoft will retire classic storage accounts on August 31, 2024. To preserve the data in any classic storage accounts, you must migrate them to the Azure Resource Manager deployment model by that date. This migration also lets you take advantage of the benefits of the Azure Resource Manager deployment model.
- Monitor and Manage Logs: You can use the Azure portal to configure and manage logs for your storage accounts. For more information on enabling and managing Azure Storage Analytics logs (classic), refer to the Microsoft Learn article listed in the references.
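As a sketch of the programmatic status check above (assuming the Az.Storage module and an active Azure login; classic analytics logging applies to the Blob, Queue, and Table services), you can read each service's classic logging properties:

```powershell
# Sketch: list the classic Storage Analytics logging status for every storage
# account in the current subscription (assumes the Az.Storage module).
Connect-AzAccount

foreach ($sa in Get-AzStorageAccount) {
    foreach ($svc in "Blob", "Queue", "Table") {
        $logging = Get-AzStorageServiceLoggingProperty -ServiceType $svc -Context $sa.Context
        # Any LoggingOperations value other than "None" means classic analytics
        # logging is still enabled for that service.
        "{0} {1}: {2}" -f $sa.StorageAccountName, $svc, $logging.LoggingOperations
    }
}
```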
References
- get Azure Storage Analytics logs (classic) status - Microsoft Q&A
- How to migrate your classic storage accounts to Azure Resource Manager - Azure Storage | Microsoft Learn
- Enable and manage Azure Storage Analytics logs (classic) | Microsoft Learn
Option 2: To identify storage accounts that still use classic Azure Storage analytics, check the Diagnostic settings (classic) blade for each storage account. If Diagnostic settings (classic) is set to ON, that storage account is still using classic Azure Storage analytics.
Steps to Complete the Migration
Here are the steps to complete the migration from classic Azure storage analytics to Azure Monitor metrics:
- Identify Classic Storage Accounts: Check the Diagnostic settings (classic) for each storage account. If set to ON, the storage account is using classic Azure Storage analytics.
- Enable Diagnostic Settings for Each Resource Type: On the storage account blade, ensure that diagnostic settings are enabled for blob, file, queue, and table, and configure them to send data to the new Log Analytics workspace.
- Disable Diagnostic Settings (Classic): Set the Diagnostic settings (classic) to OFF.
- Retain Logs: Ensure that the Delete data checkbox is selected and set the number of days for log data retention.
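The identify-and-disable steps above can be sketched as follows for a single account (assuming the Az.Storage module; the resource group and account names are placeholders):

```powershell
# Sketch: disable classic Storage Analytics logging for every service on one
# storage account (assumes the Az.Storage module and an active Azure login).
$sa = Get-AzStorageAccount -ResourceGroupName "<resource-group-name>" -Name "<storage-account-name>"

foreach ($svc in "Blob", "Queue", "Table") {
    # Setting LoggingOperations to None is the programmatic equivalent of
    # switching Diagnostic settings (classic) to OFF for that service.
    Set-AzStorageServiceLoggingProperty -ServiceType $svc -LoggingOperations None -Context $sa.Context
}
```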
Script to Complete the Migration
You can use Azure PowerShell or Azure CLI to create a script that completes these steps for a storage account. Here is an example using Azure PowerShell:
```powershell
# Sign in to your Azure subscription
Connect-AzAccount

# Set the context to your subscription
Set-AzContext -Subscription "<subscription-id>"

# Get the storage account context
$storageAccount = Get-AzStorageAccount -ResourceGroupName "<resource-group-name>" -Name "<storage-account-name>"
$ctx = $storageAccount.Context

# Enable logging for read, write, and delete requests in the Queue service with retention set to five days
Set-AzStorageServiceLoggingProperty -ServiceType Queue -LoggingOperations Read,Write,Delete -RetentionDays 5 -Context $ctx

# Disable logging for the Table service
Set-AzStorageServiceLoggingProperty -ServiceType Table -LoggingOperations None -Context $ctx
```

This script helps you automate enabling and disabling classic logging settings for a storage account.
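To confirm the result, you can read the classic logging properties back; this sketch assumes $ctx still holds the storage account context from the script above:

```powershell
# Sketch: verify the classic logging settings after running the script above.
Get-AzStorageServiceLoggingProperty -ServiceType Queue -Context $ctx
Get-AzStorageServiceLoggingProperty -ServiceType Table -Context $ctx
```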
If you have any further questions or need additional assistance, feel free to ask!
You can use the Azure PowerShell module to automate the above steps for multiple storage accounts. Here's an example PowerShell script that shows how to enable diagnostic settings for a storage account and send the diagnostic data to a Log Analytics workspace. Note that this uses the New-AzDiagnosticSetting cmdlet from the Az.Monitor module; the Set-AzStorageServiceLoggingProperty cmdlet only configures classic logging and cannot send data to a workspace:

```powershell
# Set the subscription and storage account names
$subscriptionName = "<your_subscription_name>"
$resourceGroupName = "<your_resource_group_name>"
$storageAccountName = "<your_storage_account_name>"
$logAnalyticsWorkspaceId = "<your_log_analytics_workspace_resource_id>"

# Connect to Azure and select the subscription
Connect-AzAccount
Set-AzContext -Subscription $subscriptionName

# Get the storage account
$storageAccount = Get-AzStorageAccount -ResourceGroupName $resourceGroupName -Name $storageAccountName

# Build the log settings for the resource log categories
$logs = foreach ($category in "StorageRead", "StorageWrite", "StorageDelete") {
    New-AzDiagnosticSettingLogSettingsObject -Enabled $true -Category $category
}

# Create a diagnostic setting for each service that sends its logs to the workspace
foreach ($service in "blobServices", "fileServices", "queueServices", "tableServices") {
    New-AzDiagnosticSetting -Name "send-to-log-analytics" `
        -ResourceId "$($storageAccount.Id)/$service/default" `
        -WorkspaceId $logAnalyticsWorkspaceId `
        -Log $logs
}
```
Please let us know if you have any further queries; I'm happy to assist.
Please do not forget to "Accept the answer" and "Up-vote" wherever the information provided helps you, as this can be beneficial to other community members.