Copy files from a DBFS path in Databricks to a SharePoint location

AZ_Krishna Seshadry Gadepalli 20 Reputation points
2024-02-12T12:15:46.57+00:00

I have Databricks code that loads data from a DataFrame into a DBFS path. I am looking for Python code to copy the files available in that DBFS path to a SharePoint folder so that users can download the CSV file extracts.

Azure Databricks
Microsoft 365 and Office SharePoint

Accepted answer
  ShaikMaheer-MSFT 38,546 Reputation points Microsoft Employee Moderator
    2024-02-13T05:26:33.24+00:00

    Hi AZ_Krishna Seshadry Gadepalli, thank you for posting your query on the Microsoft Q&A platform.

    Consider using the Office365 SDKs for Python, which will help you work with SharePoint. You can write Python code to read the files from DBFS and then load them to SharePoint sites, either by using the Python SDKs or by making REST API calls to SharePoint from Python.

    The link below has a sample of uploading a file to SharePoint using the Office365 SDK in Python: https://plainenglish.io/blog/how-to-upload-files-to-sharepoint-using-python Hope this helps. Please let me know how it goes.
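    As an illustration, here is a minimal sketch using the Office365-REST-Python-Client package (pip install Office365-REST-Python-Client). It assumes an Azure AD app registration that has access to the site; the site URL, folder path, DBFS path, and credentials below are placeholders you would replace with your own values.

    ```python
    # Hypothetical sketch: copy a CSV file from DBFS to a SharePoint document library.
    # Site URL, folder, DBFS path, and client id/secret are placeholders (assumptions).
    from office365.runtime.auth.client_credential import ClientCredential
    from office365.sharepoint.client_context import ClientContext

    site_url = "https://contoso.sharepoint.com/sites/Reports"   # SharePoint site
    target_folder_url = "Shared Documents/Extracts"             # library folder
    dbfs_file = "/dbfs/mnt/exports/extract.csv"                 # DBFS path via the /dbfs local mount on the cluster

    # Authenticate with an Azure AD app registration (client credentials flow)
    ctx = ClientContext(site_url).with_credentials(
        ClientCredential("client-id", "client-secret")
    )

    # Read the file from DBFS and upload it to the SharePoint folder
    with open(dbfs_file, "rb") as f:
        target_folder = ctx.web.get_folder_by_server_relative_url(target_folder_url)
        target_folder.upload_file("extract.csv", f.read()).execute_query()

    print("Uploaded extract.csv to", target_folder_url)
    ```

    You can run this directly in a Databricks notebook cell after installing the package on the cluster; for larger files, the library also supports chunked uploads.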


    Please consider hitting the Accept Answer button. Accepted answers help the community as well. Thank you.

