Hi @rajanisqldev-42

You can handle this with either Azure Data Factory (ADF) or Microsoft Fabric pipelines by exporting the column (a varchar(max) holding Base64-encoded content, or ideally a varbinary(max) holding the raw bytes) to Azure Blob Storage or Data Lake Storage as individual files.
Here’s an approach using ADF:
Linked Services: Set up connections to both your SQL Server and Blob Storage. If your SQL Server is on-premises, make sure to use a Self-hosted Integration Runtime.
Datasets:
- Source: A SQL dataset that includes `DocumentBody`, `Filename`, and `MimeType` (see the sample query after this list).
- Sink: A binary dataset pointing to your target container in Blob Storage or Data Lake.
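For the source dataset, a query that returns just those three columns is usually enough. The sketch below uses an illustrative table name (`dbo.Documents`), so adjust it to your schema:

```sql
-- Illustrative table name; adjust to match your schema.
-- DocumentBody is the varchar(max) (Base64) or varbinary(max) column.
SELECT
    DocumentBody,
    Filename,
    MimeType
FROM dbo.Documents;
```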
Copy Activity:
- Map the `DocumentBody` column as binary content.
- Use the `Filename` column to dynamically name your files, for example: `@concat(item().Filename, '.pdf')` (you can also adjust the extension based on the `MimeType` if needed).
Note: If you have different file types (such as PDFs, images, etc.), you can use an If Condition or Switch activity to set the correct file extension or path based on the `MimeType` value.
Also, if the content is stored as Base64 rather than binary, you’ll need to decode it before writing it out. This can be done using either an Azure Function to convert Base64 to binary, or a Mapping Data Flow with a Derived Column for inline decoding.
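If you would rather not add an Azure Function or Data Flow just for the decode, one alternative is to decode the Base64 directly in the source query so the Copy activity already receives varbinary(max). This is only a sketch, again using the illustrative `dbo.Documents` table name; test it against your own data:

```sql
-- Decode a Base64-encoded varchar(max) column to varbinary(max) in the source query.
-- dbo.Documents is an illustrative table name; adjust to your schema.
SELECT
    Filename,
    MimeType,
    CAST(N'' AS XML).value(
        'xs:base64Binary(sql:column("DocumentBody"))',
        'VARBINARY(MAX)'
    ) AS DocumentBody
FROM dbo.Documents;
```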
Hope this helps. If this answers your query, please click Accept Answer and Yes for "Was this answer helpful". If you have any further questions, do let us know.