@Pawan Singh - Thanks for the question and using MS Q&A platform.
The error message suggests that the specified filesystem does not exist. This could be due to a few reasons:
- The path to the file is incorrect. Double-check that the path is correct and that the file exists in the specified location.
- The file system is not mounted. If you are using Azure Data Lake Storage Gen2, you need to mount the file system before you can access the files. You can mount it using the dbutils.fs.mount() method. Here's an example:
# Mount an ADLS Gen2 file system using an Azure AD service principal (OAuth).
# Note: abfss:// URIs take the form <file-system-name>@<storage-account-name>,
# and the ABFS (Gen2) driver uses the fs.azure.* OAuth keys, not the older
# fs.adl.* keys (those are for ADLS Gen1 / adl:// paths).
dbutils.fs.mount(
    source="abfss://<file-system-name>@<storage-account-name>.dfs.core.windows.net/",
    mount_point="/mnt/<mount-name>",
    extra_configs={
        "fs.azure.account.auth.type": "OAuth",
        "fs.azure.account.oauth.provider.type": "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
        "fs.azure.account.oauth2.client.id": "<application-id>",
        "fs.azure.account.oauth2.client.secret": "<client-secret>",
        "fs.azure.account.oauth2.client.endpoint": "https://login.microsoftonline.com/<tenant-id>/oauth2/token"
    }
)
Replace the placeholders with your own values. Once the file system is mounted, you can access the files using the mount point.
- The file system is not accessible. Check that you have the necessary permissions to access the file system. You may need to configure access control lists (ACLs) or role-based access control (RBAC) to grant access.
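To keep the OAuth settings for the mount call in one place, you could factor them into a small helper. This is a hypothetical convenience function, not part of dbutils; the fs.azure.* key names are the standard ones the ABFS (Gen2) driver reads:

```python
def build_adls_oauth_configs(application_id: str,
                             client_secret: str,
                             tenant_id: str) -> dict:
    """Assemble the extra_configs dict for mounting ADLS Gen2 via a
    service principal. Hypothetical helper for illustration only."""
    return {
        "fs.azure.account.auth.type": "OAuth",
        "fs.azure.account.oauth.provider.type":
            "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
        "fs.azure.account.oauth2.client.id": application_id,
        "fs.azure.account.oauth2.client.secret": client_secret,
        "fs.azure.account.oauth2.client.endpoint":
            f"https://login.microsoftonline.com/{tenant_id}/oauth2/token",
    }

# Usage (inside a Databricks notebook), e.g.:
# dbutils.fs.mount(source=..., mount_point=...,
#                  extra_configs=build_adls_oauth_configs(app_id, secret, tenant))
```

In practice, prefer pulling the client secret from a secret scope (dbutils.secrets.get) rather than hardcoding it in the notebook.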
Once you have resolved the issue, you should be able to display the DataFrame using the display() method.
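On the first point above (an incorrect path), a quick syntactic check can catch a malformed abfss:// path, or one with the file-system and storage-account segments swapped into invalid names, before you even attempt to read it. A hypothetical helper, where the regex is only a rough approximation of Azure's naming rules:

```python
import re

def looks_like_abfss_path(path: str) -> bool:
    """Rough syntactic check for an ADLS Gen2 path of the form
    abfss://<file-system-name>@<storage-account-name>.dfs.core.windows.net/<path>.
    File system names allow lowercase letters, digits, and hyphens;
    storage account names allow lowercase letters and digits only."""
    pattern = r"^abfss://[a-z0-9][a-z0-9-]*@[a-z0-9]+\.dfs\.core\.windows\.net(?:/.*)?$"
    return re.fullmatch(pattern, path) is not None

print(looks_like_abfss_path("abfss://sales@contosodata.dfs.core.windows.net/2023/orders.csv"))  # True
print(looks_like_abfss_path("abfss://Sales Data@contosodata.dfs.core.windows.net"))             # False
```

This only validates the shape of the path; whether the file system actually exists still has to be checked against storage (for example with dbutils.fs.ls on the mount point).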
For more details, refer to this MS Q&A thread addressing a similar issue: https://learn.microsoft.com/en-us/answers/questions/1341813/i-am-unable-to-mount-containers-using-databricks-a
Hope this helps. Do let us know if you have any further queries.
If this answers your query, do click Accept Answer and Yes for "Was this answer helpful?".