Hello @Mohammad Saber
Thanks for the question and using MS Q&A platform.
Azure Databricks provides comprehensive end-to-end diagnostic logs of activities performed by Azure Databricks users, allowing your enterprise to monitor detailed Azure Databricks usage patterns.
For a list of each of these types of events and the associated services, see Events. Some of the events are emitted in audit logs only if verbose audit logs are enabled for the workspace.
By default, all logs are stored on the Azure Databricks cluster itself. If you want to work with them locally, there are a couple of ways to copy the driver logs to your local machine.
Option 1: Cluster Driver Logs:
Go to the Azure Databricks workspace => select the cluster => click Driver Logs => download the log files to your local machine.
The direct print and log statements from your notebooks and libraries go to the driver logs. The logs have three outputs:
- Standard output
- Standard error
- Log4j logs
The log files are rotated periodically. Older log files appear at the top of the page, listed with timestamp information. You can download any of the logs for troubleshooting.
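As a quick sketch of where notebook output lands (assuming a Python notebook; the logger name is illustrative):

```python
import logging
import sys

# print() from a notebook cell ends up in the driver's Standard output
print("hello from the driver")

# Messages written through the logging module go to Standard error by default
logging.basicConfig(stream=sys.stderr, level=logging.INFO)
logging.getLogger("my-notebook").info("this lands in Standard error")
```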
Option 2: Cluster Log Delivery:
When you create a cluster, you can specify a location to deliver Spark driver and worker logs. Logs are delivered every five minutes to your chosen destination. When a cluster is terminated, Databricks guarantees to deliver all logs generated up until the cluster was terminated.
The destination of the logs depends on the cluster ID. If the specified destination is dbfs:/cluster-log-delivery, cluster logs for 0630-191345-leap375 are delivered to dbfs:/cluster-log-delivery/0630-191345-leap375.
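In other words, the final path is just the configured base location plus the cluster ID (the values below are the examples above; driver logs sit under the driver subfolder):

```python
# Example values from above
base = "dbfs:/cluster-log-delivery"
cluster_id = "0630-191345-leap375"

# Driver logs are delivered under the cluster-ID subfolder
print(f"{base}/{cluster_id}/driver")
# → dbfs:/cluster-log-delivery/0630-191345-leap375/driver
```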
To configure the log delivery location:
- On the cluster configuration page, click the Advanced Options toggle.
- At the bottom of the page, click the Logging tab.
- Select a destination type.
- Enter the cluster log path.
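The same log delivery settings can also be supplied when creating a cluster through the Clusters API; a minimal sketch, assuming a DBFS destination (the cluster name and path are illustrative):

```json
{
  "cluster_name": "my-cluster",
  "cluster_log_conf": {
    "dbfs": {
      "destination": "dbfs:/cluster-log-delivery"
    }
  }
}
```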
To download the cluster logs to your local machine:
Install the Databricks CLI, configure it with your Databricks credentials, and use the CLI's dbfs cp command. For example: dbfs cp dbfs:/FileStore/azure.txt ./azure.txt.
If you want to download an entire folder of files, you can use dbfs cp -r <DBFS Path> <LocalPath>.
- Open a command prompt.
- Install Python: https://www.python.org/downloads/
- Install the Databricks CLI:
pip install databricks-cli
- Copy the workspace host and a generated personal access token. For example: host: https://centralus.azuredatabricks.net/ token: 46XXXXXXXXXXXXXXXXXXXXXXXXXXghf12
- Configure the CLI: run
databricks configure
Databricks Host (should begin with https://): https://centralus.azuredatabricks.net/
Username: username@microsoft.com
Password: paste the access token
Repeat for confirmation: paste the access token
- Now run the following command to copy the logs to your local machine:
dbfs cp -r dbfs:/cluster-logs/0731-081420-tees851/driver C:\Users\Azure\Desktop\Logs
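Once the logs are on disk, the rotated log4j files are gzip-compressed, so they can be read in place without unpacking first. A minimal sketch (the local path is the example above; the file-name pattern is an assumption):

```python
import gzip
from pathlib import Path

# Hypothetical local folder where `dbfs cp -r` placed the driver logs
log_dir = Path(r"C:\Users\Azure\Desktop\Logs")

# Rotated log4j files are gzip-compressed; read them without unpacking
for gz in sorted(log_dir.glob("log4j-*.log.gz")):
    with gzip.open(gz, mode="rt", errors="replace") as f:
        print(f"--- {gz.name} ---")
        print(f.read())
```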
For more details, refer to Diagnostic logging in Azure Databricks.
Hope this helps. Do let us know if you have any further queries.
If this answers your query, do click Accept Answer and up-vote Yes for "Was this answer helpful?".