Databricks to Table Storage data load

Imran Mondal 246 Reputation points
2021-03-31T09:36:03.387+00:00

Hi Team,

Currently, I have Databricks Spark jobs that load data from Blob Storage, process it in Databricks, and then write the cleaned data to another Blob Storage container.

Now, I would like to write the cleaned data to Table Storage directly from the Databricks job, instead of dumping it to Blob Storage as before.

Please suggest an approach.

Azure Storage Accounts
Globally unique resources that provide access to data management services and serve as the parent namespace for the services.

Azure Databricks
An Apache Spark-based analytics platform optimized for Azure.

Accepted answer
  1. MartinJaffer-MSFT 26,096 Reputation points
    2021-04-01T19:01:25.263+00:00

    Hello @Imran Mondal and welcome to Microsoft Q&A.

    To my knowledge, Databricks does not natively support Azure Table Storage as a data source or sink.

    To write to Table Storage from Databricks, you would need to use one of the following:

    • The Azure Table Storage Python SDK
    • Installing a third-party library on the cluster
    • Implementing a custom writer class (e.g., with Structured Streaming foreach)
    • The Table Storage REST API

    https://learn.microsoft.com/en-us/azure/cosmos-db/table-support
    https://learn.microsoft.com/en-us/azure/cosmos-db/table-storage-how-to-use-python
    https://learn.microsoft.com/en-us/rest/api/storageservices/table-service-rest-api
    https://learn.microsoft.com/en-us/azure/databricks/spark/latest/structured-streaming/foreach
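
    As a rough sketch of the first option (Python SDK), the `azure-data-tables` package can be installed on the cluster (`%pip install azure-data-tables`) and used to upsert rows of a cleaned Spark DataFrame as entities. The connection string, table name, and the `region`/`id` key columns below are placeholder assumptions, not details from this thread:

    ```python
    # Sketch: writing a cleaned Spark DataFrame to Azure Table Storage with the
    # azure-data-tables SDK. Install on the cluster first:
    #   %pip install azure-data-tables
    # Key column names and the connection string are placeholders.

    def row_to_entity(row_dict, partition_field="region", row_field="id"):
        """Map a DataFrame row (as a dict) to a Table Storage entity.

        Every entity must carry string-valued PartitionKey and RowKey columns;
        the source fields chosen as keys are removed from the payload.
        """
        entity = dict(row_dict)
        entity["PartitionKey"] = str(entity.pop(partition_field))
        entity["RowKey"] = str(entity.pop(row_field))
        return entity

    def write_df_to_table(df, conn_str, table_name):
        """Stream rows to Table Storage from the driver (fine for modest volumes)."""
        from azure.data.tables import TableServiceClient  # Azure-only dependency

        service = TableServiceClient.from_connection_string(conn_str)
        table = service.create_table_if_not_exists(table_name)
        for row in df.toLocalIterator():  # avoids collecting the whole DataFrame
            table.upsert_entity(row_to_entity(row.asDict()))
    ```

    For larger volumes, the same upsert logic can be moved into `df.foreachPartition(...)` so that each executor opens its own client and writes its partition in parallel, which corresponds to the "custom writer class" option above.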

    1 person found this answer helpful.

1 additional answer

  1. Pranam K 6 Reputation points
    2021-03-31T18:13:17.233+00:00
