Hello @TS,
Thank you for the question, and for using the Microsoft Q&A platform.
Could you clarify why you want to copy DBFS files to non-standard backing storage such as NFS or AWS S3?
It is a best practice not to store any data elements in the root Azure Blob storage that backs root DBFS access for the workspace; that root DBFS storage is not supported for production customer data. You might, however, keep other objects there, such as libraries, configuration files, and init scripts. For disaster recovery, either develop an automated process to replicate these objects to the secondary deployment, or put manual procedures in place to keep the secondary deployment up to date.
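If it helps, here is a minimal sketch of such an automated replication step, written against the DBFS REST API endpoints (`/api/2.0/dbfs/list`, `/read`, `/put`). The workspace URLs, tokens, and the `/databricks/init-scripts` source path are placeholders for your own values, and the inline `put` call only handles files under 1 MB, which is usually enough for init scripts and configuration files; larger objects or production-grade replication are better handled with the Databricks CLI or infrastructure-as-code tooling.

```python
"""Mirror small workspace-support files (init scripts, configuration files)
from the primary workspace's root DBFS to a secondary DR workspace via the
DBFS REST API. Hosts, tokens, and paths below are placeholders."""
import base64

import requests

PRIMARY = {"host": "https://adb-1111111111111111.1.azuredatabricks.net",
           "token": "<primary-workspace-token>"}
SECONDARY = {"host": "https://adb-2222222222222222.2.azuredatabricks.net",
             "token": "<secondary-workspace-token>"}


def _headers(ws):
    return {"Authorization": f"Bearer {ws['token']}"}


def list_dbfs(ws, path):
    """List a DBFS directory (GET /api/2.0/dbfs/list)."""
    r = requests.get(f"{ws['host']}/api/2.0/dbfs/list",
                     headers=_headers(ws), params={"path": path})
    r.raise_for_status()
    return r.json().get("files", [])


def read_dbfs(ws, path, size):
    """Read a DBFS file in 1 MB chunks (GET /api/2.0/dbfs/read)."""
    data, offset = b"", 0
    while offset < size:
        r = requests.get(f"{ws['host']}/api/2.0/dbfs/read", headers=_headers(ws),
                         params={"path": path, "offset": offset, "length": 1024 * 1024})
        r.raise_for_status()
        chunk = r.json()
        if chunk["bytes_read"] == 0:
            break
        data += base64.b64decode(chunk["data"])
        offset += chunk["bytes_read"]
    return data


def put_dbfs(ws, path, data):
    """Write a small file to DBFS (POST /api/2.0/dbfs/put).

    Inline contents are limited to 1 MB; larger files need the
    create/add-block/close streaming endpoints instead.
    """
    r = requests.post(f"{ws['host']}/api/2.0/dbfs/put", headers=_headers(ws),
                      json={"path": path,
                            "contents": base64.b64encode(data).decode(),
                            "overwrite": True})
    r.raise_for_status()


def mirror(src_dir="/databricks/init-scripts"):
    """Recursively copy src_dir from the primary to the secondary workspace."""
    for f in list_dbfs(PRIMARY, src_dir):
        if f["is_dir"]:
            mirror(f["path"])
        else:
            put_dbfs(SECONDARY, f["path"],
                     read_dbfs(PRIMARY, f["path"], f["file_size"]))


if __name__ == "__main__":
    mirror()
```

You could run a script like this on a schedule (for example from Azure DevOps or a GitHub Actions pipeline) so the secondary deployment is refreshed whenever these support files change in the primary workspace.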
Your solution must replicate the correct data across the control plane, the data plane, and your data sources. Redundant workspaces for disaster recovery must map to different control planes in different regions.
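As a rough illustration of that last point, here is a small sketch that uses the Azure management SDK for Python (`azure-mgmt-databricks`) to confirm the two workspaces are deployed in different Azure regions, and therefore on different control planes. The subscription ID, resource group names, and workspace names are hypothetical.

```python
"""Check that the primary and secondary (DR) workspaces sit in different
Azure regions, i.e. on different Databricks control planes. The subscription
ID, resource groups, and workspace names below are hypothetical."""
from azure.identity import DefaultAzureCredential
from azure.mgmt.databricks import AzureDatabricksManagementClient

client = AzureDatabricksManagementClient(DefaultAzureCredential(),
                                         subscription_id="<subscription-id>")

# Look up each workspace resource and compare the Azure regions they run in.
primary = client.workspaces.get("rg-dr-primary", "dbw-prod-primary")
secondary = client.workspaces.get("rg-dr-secondary", "dbw-prod-secondary")

print(f"Primary region:   {primary.location}")
print(f"Secondary region: {secondary.location}")

if primary.location == secondary.location:
    raise RuntimeError("Both workspaces are in the same region, so they share "
                       "a control plane; deploy the DR workspace elsewhere.")
```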
For more details, refer to Azure Databricks - Disaster recovery.
Hope this helps. Please let us know if you have any further questions.
------------------------------
- Please don't forget to click on the "Accept Answer" or upvote button whenever the information provided helps you. Original posters help the community find answers faster by identifying the correct answer. Here is how
- Want a reminder to come back and check responses? Here is how to subscribe to a notification
- If you are interested in joining the VM program and helping shape the future of Q&A: Here is how you can be part of Q&A Volunteer Moderators