Transfer documents stored as varbinary in SQL Server to Azure Blob Storage

Reema DSouza 40 Reputation points
2024-05-23T16:42:11.0466667+00:00

I am looking to move a large set of documents out of a SQL Server Managed Instance database, where the documents are stored in a varbinary column on a SQL Server table.

The ADF dataflow I created doesn't let me select an Azure Blob 'Binary' dataset as the sink type.

Is there any alternative to achieve this?

I have also tried adding a standalone Binary dataset and pointing the sink to it, but the Binary dataset doesn't populate.

Any pointers on how I can achieve this will be helpful.


Tags: SQL Server, Azure Data Factory

Accepted answer
  1. Harishga 5,990 Reputation points Microsoft Vendor
    2024-05-24T08:02:56.5133333+00:00

    Hi Reema DSouza,

    Welcome to the Microsoft Q&A platform, and thanks for posting your question here.

    Convert Documents to Files in Blob Storage:

    • Retrieve the documents from your SQL Server database and convert them into individual files.
    • Upload these files to Azure Blob storage as blobs.
    • You can still use Azure Data Factory (ADF) to orchestrate the copy from SQL Server to Azure Blob storage, but handle the conversion of the varbinary values into files outside of ADF.
    • This approach gives you more control over the conversion process.
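    The steps above can be sketched in Python using the `pyodbc` and `azure-storage-blob` packages (`pip install pyodbc azure-storage-blob`). The table and column names (`dbo.Documents`, `DocId`, `FileName`, `DocData`) and the environment-variable names are assumptions — substitute your own:

```python
import os


def blob_name(doc_id, file_name):
    """Build a blob name from a row's key and original file name.

    Prefixing with the row key keeps documents with duplicate
    file names from overwriting each other.
    """
    return f"{doc_id}/{os.path.basename(file_name)}"


def export_documents(cursor, container_client):
    """Stream each varbinary row out of SQL Server and into Blob storage."""
    cursor.execute("SELECT DocId, FileName, DocData FROM dbo.Documents")
    for doc_id, file_name, doc_data in cursor:
        blob_client = container_client.get_blob_client(blob_name(doc_id, file_name))
        # varbinary comes back as bytes; upload it as-is, replacing any existing blob.
        blob_client.upload_blob(bytes(doc_data), overwrite=True)


def main():
    # Wiring for a real run -- connection strings are read from the environment.
    import pyodbc
    from azure.storage.blob import BlobServiceClient

    conn = pyodbc.connect(os.environ["SQL_CONNECTION_STRING"])
    service = BlobServiceClient.from_connection_string(os.environ["BLOB_CONNECTION_STRING"])
    export_documents(conn.cursor(), service.get_container_client("documents"))
```

    For a large document set, consider batching the SELECT (for example by key range) so a single failed run does not have to restart from the beginning.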

    Use Azure Table Storage Instead:

    • If you don’t specifically need to store the documents as binary blobs, consider using Azure Table Storage instead of Azure Blob storage.
    • In this case, you can directly use ADF to import data from your SQL database into Azure Table Storage.
    • This approach is simpler than the first one, but Table Storage limits each property to 64 KB and each entity to 1 MB, so it is not suitable for larger documents or when you need to store them as binary blobs.


    Custom Solution with Azure Functions or Logic Apps:

    • In this approach, you would create an Azure Function or Logic App that retrieves the documents from SQL Server and uploads them to Azure Blob storage.
    • You can trigger this function or app periodically or based on specific events.
    • This approach gives you more flexibility in handling the conversion and upload process.
    • It requires more development effort than the other two alternatives.
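    A minimal sketch of the worker such a timer-triggered Azure Function could run: copy only rows added since the last run, tracked by a watermark timestamp. The table and column names (`dbo.Documents`, `UploadedUtc`, and so on) are assumptions, and persisting the watermark between runs is left to you:

```python
def rows_to_copy(rows, watermark):
    """Client-side filter: keep rows whose UploadedUtc (index 3) is past the watermark."""
    return [row for row in rows if row[3] > watermark]


def copy_new_documents(cursor, container_client, watermark):
    """One timer run: upload rows added since the last run; return the new watermark."""
    cursor.execute("SELECT DocId, FileName, DocData, UploadedUtc FROM dbo.Documents")
    latest = watermark
    for doc_id, file_name, doc_data, uploaded in rows_to_copy(cursor.fetchall(), watermark):
        # ContainerClient.upload_blob(name, data) creates the blob in one call.
        container_client.upload_blob(f"{doc_id}/{file_name}", bytes(doc_data), overwrite=True)
        latest = max(latest, uploaded)
    # Persist this (e.g. in a control table or blob) as the next run's watermark.
    return latest
```

    In production you would push the watermark filter into the SQL WHERE clause rather than filtering client-side; it is shown here as a separate function only to keep the logic visible.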

    AzCopy (azcopy):

    • Use the azcopy command-line tool to copy files to Azure Blob storage. Note that azcopy copies files, not database rows, so you must first export the varbinary data to local files (for example with bcp or custom code).
    • In the azcopy copy command, replace <filepath> with the path to your exported file, <containerpath> with the destination container path, and <SAStoken> with a shared access signature token for the container.
    • This approach is suitable if you prefer a command-line tool for the upload step.
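    The answer references the azcopy placeholders without showing the invocation; this sketch assembles and runs an `azcopy copy` command from Python. The account, container, and token values are placeholders to substitute with your own:

```python
import subprocess


def azcopy_command(filepath, account, containerpath, sas_token):
    """Build the azcopy invocation that uploads one exported file to Blob storage."""
    destination = f"https://{account}.blob.core.windows.net/{containerpath}?{sas_token}"
    return ["azcopy", "copy", filepath, destination]


def upload(filepath, account, containerpath, sas_token):
    # check=True raises CalledProcessError if azcopy reports a failure.
    subprocess.run(azcopy_command(filepath, account, containerpath, sas_token), check=True)
```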

    Please evaluate these options based on factors like security, performance, and scalability to choose the best fit for your specific scenario.

    Reference:
    https://stackoverflow.com/questions/51866703/copy-image-field-data-from-sql-server-to-azure-blob-storage-as-blob-blocks

    https://stackoverflow.com/questions/75235138/sql-server-varbinary-data-output-to-azure-blob

    https://learn.microsoft.com/en-us/azure/data-factory/tutorial-hybrid-copy-portal

    I hope this information helps you. Let me know if you have any further questions or concerns.


0 additional answers
