How to make ADF pull SharePoint files to Azure Blob Storage

Sudha Achuthan 20 Reputation points
2024-06-18T11:14:45.5633333+00:00

hi

Can someone tell me the steps to copy files that have been uploaded to SharePoint into Azure Blob Storage?

anything that is easy to implement would be highly appreciated.

regards

sudha

Azure Blob Storage
An Azure service that stores unstructured data in the cloud as blobs.
Azure Data Factory
An Azure service for ingesting, preparing, and transforming data at scale.
Microsoft 365 and Office | SharePoint | For business | Windows

1 answer

  1. Nehruji R 8,181 Reputation points Microsoft External Staff Moderator
    2024-06-18T11:34:44.4466667+00:00

    Hello Sudha Achuthan,

    Greetings! Welcome to Microsoft Q&A Platform.

    To transfer uploaded files from SharePoint to Azure Blob Storage, there are several methods you can consider, as outlined in Microsoft's documentation:

    You can use Microsoft Power Automate: Power Automate can copy files from a SharePoint folder to an Azure Blob container, though it has limitations. Power Automate enforces a maximum size for individual files; the limit varies by plan, but it is typically around 100 MB to 250 MB per file.

    You can also use Azure Data Factory: It is a cloud-based data integration service that lets you create data pipelines to move and transform data between various sources and destinations, including SharePoint and Azure Blob Storage. You can create a pipeline that retrieves data from SharePoint using the SharePoint connector and then uses the Azure Blob Storage connector to write it to Azure Blob Storage. This approach requires some configuration and coding, but it provides more flexibility and scalability than Power Automate.
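    For reference, a minimal Copy activity in ADF's JSON authoring format might look like the sketch below. This is an illustrative fragment, not a complete pipeline: the dataset names are placeholders, and it assumes you have already created a SharePoint Online List linked service (with app registration credentials) and a Blob Storage linked service. Note that the SharePoint Online List connector copies list data; for binary files, see the binary-dataset approach covered in the articles further down.

    ```json
    {
        "name": "CopySharePointListToBlob",
        "type": "Copy",
        "inputs":  [ { "referenceName": "SharePointOnlineListDataset", "type": "DatasetReference" } ],
        "outputs": [ { "referenceName": "BlobOutputDataset", "type": "DatasetReference" } ],
        "typeProperties": {
            "source": { "type": "SharePointOnlineListSource" },
            "sink":   { "type": "DelimitedTextSink" }
        }
    }
    ```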

    Be aware of extremely large files when copying from a SharePoint library: SharePoint has a default buffer-size limit of 100 MB, and the Get File Content action doesn't natively support chunking.

    You can refer to this article on using a binary dataset if you just want to copy the full file rather than read the data: https://datasavvy.me/2021/12/07/copying-large-files-from-sharepoint-online/
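    If you want to see what the binary-copy approach does under the hood (or script the transfer yourself), the flow is: request the file bytes from SharePoint's REST API with an Azure AD bearer token, then PUT them to Blob Storage. Below is a minimal Python sketch using only the standard library. The tenant, site, path, and SAS-token values are illustrative assumptions, not values from this thread, and acquiring the bearer token (e.g. via an app registration) is left out.

    ```python
    import urllib.request

    def sharepoint_file_url(tenant: str, site: str, server_relative_path: str) -> str:
        """Build the SharePoint REST endpoint that returns a file's raw bytes."""
        return (f"https://{tenant}.sharepoint.com/sites/{site}"
                f"/_api/web/GetFileByServerRelativeUrl('{server_relative_path}')/$value")

    def blob_put_url(account: str, container: str, blob_name: str, sas_token: str) -> str:
        """Build a Blob Storage Put Blob URL authorized by a SAS token."""
        return f"https://{account}.blob.core.windows.net/{container}/{blob_name}?{sas_token}"

    def copy_sharepoint_file_to_blob(sp_url: str, bearer_token: str, blob_url: str) -> int:
        """Download the file from SharePoint, then upload it as a block blob.

        Note: this reads the whole file into memory, so it is only suitable
        for files that fit comfortably in RAM.
        """
        req = urllib.request.Request(
            sp_url, headers={"Authorization": f"Bearer {bearer_token}"})
        with urllib.request.urlopen(req) as resp:
            data = resp.read()
        put = urllib.request.Request(
            blob_url, data=data, method="PUT",
            headers={"x-ms-blob-type": "BlockBlob"})
        with urllib.request.urlopen(put) as resp:
            return resp.status  # 201 Created on success
    ```

    A Web activity in ADF can perform the same token request, after which a Copy activity with an HTTP linked service and a binary dataset streams the file without this memory limitation, which is essentially what the linked article describes.
    
    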

    Another reference that might help: https://sharepains.com/2022/11/15/copy-large-files-sharepoint-azure-blob/

    Also check Azure Data Box: It is a physical device provided by Microsoft for offline data transfer. This method is suitable for large data sets and can significantly reduce the time it takes to transfer the data.

    You may also consider using third-party tools such as GS RichCopy 360, ShareGate, and GoodSync.

    Note: The third-party articles are recommended to help you find alternatives; Microsoft does not verify whether these third-party workarounds work completely.

    Hope this answer helps! Please do let us know if you have any further queries. I’m happy to assist you further.

    Please do not forget to "Accept the answer" and "up-vote" wherever the information provided helps you; this can be beneficial to other community members.

