Hello Mohammad Saber
Thank you for posting your query here!
I attempted to reproduce the scenario and did not encounter any issues. Below are the steps I followed, hope this helps.
Prerequisites:
A Databricks Workspace.
A Databricks Cluster.
A new Notebook.
A Storage Account with a container that has blobs added to it.
Now in your Databricks Notebook run the following mount command:
dbutils.fs.mount(
    # Blob Storage container to mount
    source = 'wasbs://container_name@storage_account_name.blob.core.windows.net/',
    # DBFS path where the container will be exposed
    mount_point = '/mnt/your_mount_point',
    # Storage Account Access Key used for authentication
    extra_configs = {'fs.azure.account.key.storage_account_name.blob.core.windows.net': 'key'}
)
Make sure to replace container_name and storage_account_name with the actual values.
For '/mnt/your_mount_point', give a name like '/mnt/blobstorage'.
Replace key with the Access Key of your Storage Account.
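As a side note, if you prefer not to paste the Access Key directly into the notebook, you can store it in a Databricks secret scope and read it with dbutils.secrets.get. This is just a sketch assuming such a scope already exists; the scope name 'blob-scope' and secret name 'storage-account-key' are placeholders for illustration:

# Sketch: fetch the Access Key from a secret scope instead of hardcoding it
# ('blob-scope' and 'storage-account-key' are placeholder names)
storage_account_key = dbutils.secrets.get(scope = 'blob-scope', key = 'storage-account-key')

dbutils.fs.mount(
    source = 'wasbs://container_name@storage_account_name.blob.core.windows.net/',
    mount_point = '/mnt/your_mount_point',
    extra_configs = {'fs.azure.account.key.storage_account_name.blob.core.windows.net': storage_account_key}
)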
This mount command attaches your external Storage Account to the Databricks File System (DBFS).
Once the cell executes successfully, it returns True, which indicates the mount point was created.
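If you want to double-check, you can also list all active mount points; the sketch below simply prints each mount point and its source:

# Optional check: list all mount points currently attached to DBFS
for mount in dbutils.fs.mounts():
    print(mount.mountPoint, '->', mount.source)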
Now run the following command:
dbutils.fs.ls('/mnt/your_mount_point')
Make sure to replace your_mount_point with the actual value.
After the cell is executed, it should list all the files inside the container.
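Once the files are listed, you can read them through the mount point with Spark. The sketch below assumes the container contains a CSV file named sample.csv with a header row; adjust the file name and format to match your own data:

# Sketch: read a CSV file from the mounted container into a DataFrame
# (sample.csv is a placeholder; use a file that exists in your container)
df = spark.read.format('csv').option('header', 'true').load('/mnt/your_mount_point/sample.csv')
display(df)

If you later need to detach the storage, you can run dbutils.fs.unmount('/mnt/your_mount_point').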
Kindly let us know if you have any further queries; I'm happy to assist.
Please do not forget to "Accept the answer" and "Up-vote" wherever the information provided helps you, as this can be beneficial to other community members.