Hello Ashwini Gaikwad,
There are a few different methods for synchronizing tables between two Azure Databricks workspaces:
- Create an external Hive metastore that multiple Databricks workspaces can share. Each workspace registers and uses this common metastore, so they all see the same databases and table definitions. The document below explains several scenarios, and a minimal cluster-config sketch follows the link.
Reference document: https://techcommunity.microsoft.com/t5/fasttrack-for-azure/sharing-metadata-across-different-databricks-workspaces-using/ba-p/3679757
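For orientation, here is a minimal sketch of the Spark configuration such a cluster needs, expressed as the `spark_conf` dict you might pass to the Databricks Clusters API. Every `<...>` value is a placeholder for your own metastore database (assumed here to be Azure SQL Database), not a setting taken from your environment:

```python
# Sketch only: cluster Spark conf for attaching to a shared external
# Hive metastore. Fill in every <...> placeholder before use.
spark_conf = {
    # JDBC connection to the shared metastore database (Azure SQL assumed)
    "spark.hadoop.javax.jdo.option.ConnectionURL":
        "jdbc:sqlserver://<sql-server>.database.windows.net:1433;database=<metastore-db>",
    "spark.hadoop.javax.jdo.option.ConnectionUserName": "<user>",
    # In practice, store the password in a secret scope rather than inline
    "spark.hadoop.javax.jdo.option.ConnectionPassword": "<password>",
    "spark.hadoop.javax.jdo.option.ConnectionDriverName":
        "com.microsoft.sqlserver.jdbc.SQLServerDriver",
    # Hive metastore client version; 2.3.9 can use the built-in jars
    "spark.sql.hive.metastore.version": "2.3.9",
    "spark.sql.hive.metastore.jars": "builtin",
}
```

Apply the same configuration to clusters in every workspace that should share the metastore; each of them will then resolve tables against the same catalog.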
- If you have tables in one workspace that you want to make available in another, consider using external tables backed by shared storage, as discussed in the Databricks community forum; a sketch of the idea follows.
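As a rough sketch of that approach (the table and storage names below are made up for illustration, and `spark` is predefined in Databricks notebooks):

```python
# In the source workspace: write the table's data to storage that both
# workspaces can reach, then register an external Delta table over it.
path = "abfss://shared@<storage-account>.dfs.core.windows.net/tables/orders"  # placeholder

spark.table("sales_db.orders").write.format("delta").mode("overwrite").save(path)

spark.sql(f"""
    CREATE TABLE IF NOT EXISTS sales_db.orders_ext
    USING DELTA
    LOCATION '{path}'
""")

# In the target workspace: register a table over the same location.
# No data is copied; both workspaces now read the same files.
spark.sql(f"""
    CREATE TABLE IF NOT EXISTS sales_db.orders_ext
    USING DELTA
    LOCATION '{path}'
""")
```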
- You can also use the DBSync project, an object synchronization tool that backs up, restores, and syncs Databricks workspaces: https://github.com/databrickslabs/databricks-sync
- You can use an Azure Data Factory (ADF) Copy activity to copy data from one Azure Databricks workspace to another; a rough sketch of the activity definition follows.
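To give a feel for what that involves, below is the rough shape of the Copy activity, written as a Python dict mirroring the pipeline JSON. The two datasets and their linked services (one pointing at each workspace) must be defined separately, and every name here is a placeholder:

```python
# Sketch of an ADF Copy activity moving a Delta table between two
# Databricks workspaces. Dataset and linked-service definitions are
# omitted; all names are placeholders for illustration.
copy_activity = {
    "name": "CopyOrdersBetweenWorkspaces",
    "type": "Copy",
    "inputs": [
        {"referenceName": "SourceWorkspaceDeltaTable", "type": "DatasetReference"}
    ],
    "outputs": [
        {"referenceName": "TargetWorkspaceDeltaTable", "type": "DatasetReference"}
    ],
    "typeProperties": {
        # ADF's Azure Databricks Delta Lake connector types
        "source": {"type": "AzureDatabricksDeltaLakeSource"},
        "sink": {"type": "AzureDatabricksDeltaLakeSink"},
    },
}
```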
I hope this helps. Please let me know if you have any further questions.