Periodically, you can export logs and then upload them to Azure. Exporting and uploading logs also creates and updates the data controller, SQL managed instance, and PostgreSQL server resources in Azure.
Examples in this article use angle brackets < ... > to identify values that you need to replace before you run the script. Replace the brackets and the values inside the brackets.
Create a Log Analytics workspace
To create a Log Analytics workspace, run the following command. You'll then set the workspace access information into environment variables so the upload commands can authenticate to the workspace.
Note
Skip this step if you already have a workspace.
Azure CLI
az monitor log-analytics workspace create --resource-group <resource group name> --workspace-name <some name you choose>
With the workspace ID and shared access key set as environment variables, you can upload logs to the log workspace.
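As a minimal sketch of that step, the following commands retrieve the workspace ID and primary shared key and export them as environment variables. The variable names WORKSPACE_ID and WORKSPACE_SHARED_KEY are assumptions about what the upload command reads; adjust them if your tooling expects different names.
Azure CLI
# Assumption: the upload command reads the WORKSPACE_ID and WORKSPACE_SHARED_KEY variables.
# Retrieve the workspace ID (the customerId property) of the Log Analytics workspace.
export WORKSPACE_ID=$(az monitor log-analytics workspace show --resource-group <resource group name> --workspace-name <workspace name> --query customerId -o tsv)
# Retrieve the primary shared key of the workspace.
export WORKSPACE_SHARED_KEY=$(az monitor log-analytics workspace get-shared-keys --resource-group <resource group name> --workspace-name <workspace name> --query primarySharedKey -o tsv)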
Configure automatic upload of logs to Azure Log Analytics Workspace in direct mode using az CLI
In the direct connected mode, logs upload can only be set up in automatic mode. This automatic upload of logs can be set up either during deployment or after deployment of the Azure Arc data controller.
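If you deploy the data controller yourself with the Azure CLI, the sketch below shows how you might enable the upload at deployment time. It assumes az arcdata dc create accepts the --auto-upload-logs flag in direct connected mode; confirm this against your CLI version, and supply the rest of the create parameters that your environment requires.
Azure CLI
# Assumption: `az arcdata dc create` accepts --auto-upload-logs when deploying in direct connected mode.
# Other required create parameters for your environment are omitted from this sketch.
az arcdata dc create --name <name of datacontroller> --resource-group <resource group> --connectivity-mode direct --auto-upload-logs true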
Enable automatic upload of logs to Azure Log Analytics Workspace
If the automatic upload of logs was disabled during Azure Arc data controller deployment, run the following command to enable automatic upload of logs.
Azure CLI
az arcdata dc update --name <name of datacontroller> --resource-group <resource group> --auto-upload-logs true

#Example
az arcdata dc update --name arcdc --resource-group <myresourcegroup> --auto-upload-logs true
Disable automatic upload of logs to Azure Log Analytics Workspace
If the automatic upload of logs was enabled during Azure Arc data controller deployment, run the following command to disable automatic upload of logs.
Azure CLI
az arcdata dc update --name <name of datacontroller> --resource-group <resource group> --auto-upload-logs false
#Example
az arcdata dc update --name arcdc --resource-group <myresourcegroup> --auto-upload-logs false
Configure automatic upload of logs to Azure Log Analytics Workspace in direct mode using kubectl CLI
Enable automatic upload of logs to Azure Log Analytics Workspace
To configure automatic upload of logs using kubectl:
Ensure the Log Analytics workspace is created as described in the earlier section.
Create a Kubernetes secret for the Log Analytics workspace using the WorkspaceID and SharedAccessKey (see the sketch after this list).
In the data controller specification, update the autoUploadLogs property to "true", and save the file.
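A minimal sketch of the last two steps follows. It assumes the data controller looks for a secret named log-workspace-secret with workspaceId and primaryKey entries, and that autoUploadLogs is a property of the data controller specification; verify both against your data controller version.
Console
# Assumption: the expected secret name and data keys are log-workspace-secret, workspaceId, and primaryKey.
kubectl create secret generic log-workspace-secret --namespace <data controller namespace> --from-literal=workspaceId=<Log Analytics workspace ID> --from-literal=primaryKey=<Log Analytics workspace shared access key>
# Open the data controller specification and set the autoUploadLogs property to "true".
kubectl edit datacontroller <name of datacontroller> --namespace <data controller namespace>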
Upload logs to Azure Monitor in indirect mode
To upload logs for SQL Managed Instance enabled by Azure Arc and Azure Arc-enabled PostgreSQL servers, run the following CLI commands:
Export all logs to the specified file:
Note
Exporting usage/billing information, metrics, and logs using the command az arcdata dc export requires bypassing SSL verification for now. You will be prompted to bypass SSL verification, or you can set the AZDATA_VERIFY_SSL=no environment variable to avoid the prompt. There is currently no way to configure an SSL certificate for the data controller export API.
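For example, in a bash shell you could set the variable before running the export (bash syntax assumed; adjust for your shell):
Console
# Skip the SSL verification prompt for az arcdata dc export.
export AZDATA_VERIFY_SSL=no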
Azure CLI
az arcdata dc export --type logs --path logs.json --k8s-namespace arc
Upload the logs to an Azure Monitor Log Analytics workspace:
Azure CLI
az arcdata dc upload --path logs.json
View your logs in Azure portal
Once your logs are uploaded, you should be able to query them using the log query explorer as follows:
Open the Azure portal, search for your workspace by name in the search bar at the top, and then select it.
Select Logs in the left panel.
Select Get Started. If this is your first time using Log Analytics, follow the links on the Getting Started page or the tutorial to learn more about it.
Expand Custom Logs at the bottom of the list of tables and you will see a table called 'sql_instance_logs_CL' or 'postgresInstances_postgresql_logs_CL'.
Select the 'eye' icon next to the table name.
Select the 'View in query editor' button.
You'll now have a query in the query editor that will show the most recent 10 events in the log.
From here, you can experiment with querying the logs using the query editor, set alerts, etc.
Automating uploads (optional)
If you want to upload metrics and logs on a scheduled basis, you can create a script and run it on a timer every few minutes. Below is an example of automating the uploads using a Linux shell script.
In your favorite text/code editor, add the following commands to a file, and save it as an executable script file, such as .sh (Linux/macOS) or .cmd, .bat, or .ps1 (Windows).
Azure CLI
az arcdata dc export --type logs --path logs.json --force --k8s-namespace arc
az arcdata dc upload --path logs.json
Make the script file executable:
Console
chmod +x myuploadscript.sh
Run the script every 20 minutes:
Console
watch -n 1200 ./myuploadscript.sh
You could also use a job scheduler like cron or Windows Task Scheduler or an orchestrator like Ansible, Puppet, or Chef.
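For example, a crontab entry like the following runs the script every 20 minutes (the script path and log path are assumptions; adjust them to your environment):
Console
# Run the upload script every 20 minutes and append its output to a log file (paths are examples).
*/20 * * * * /path/to/myuploadscript.sh >> /tmp/arcdata-upload.log 2>&1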