@Rohit Kulkarni - Thanks for the question and using MS Q&A platform.
To transfer data from Azure Blob Storage to external tables in a Dedicated SQL Pool using Azure Data Factory, you can follow the steps below (a code sketch of the same pipeline follows the list):
- Create a new pipeline in Azure Data Factory.
- Add a new "Copy Data" activity to the pipeline.
- In the "Source" tab of the "Copy Data" activity, select "Azure Blob Storage" as the source data store.
- Configure the connection to your Azure Blob Storage account and select the blob container and file(s) you want to copy data from.
- In the "Sink" tab of the "Copy Data" activity, select "Azure Synapse Analytics" as the sink data store.
- Configure the connection to your Dedicated SQL Pool and select the external table(s) you want to copy data to.
- In the "Mapping" tab of the "Copy Data" activity, map the source columns to the corresponding sink columns.
- Configure any additional settings for the "Copy Data" activity, such as staging, fault tolerance, or performance tuning.
- Save and publish the pipeline.
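
If you prefer to define the pipeline programmatically rather than through the portal UI, here is a minimal sketch using the Python management SDK (`azure-mgmt-datafactory`), following the pattern from the official quickstart sample. The subscription, resource group, factory, dataset, and pipeline names below are placeholders, and it assumes the linked services and datasets for the blob source and the Synapse sink have already been created in the factory:

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    PipelineResource,
    CopyActivity,
    DatasetReference,
    DelimitedTextSource,
    SqlDWSink,
)

# Placeholder identifiers -- replace with your own values.
subscription_id = "<subscription-id>"
resource_group = "<resource-group>"
factory_name = "<data-factory-name>"

client = DataFactoryManagementClient(DefaultAzureCredential(), subscription_id)

# Copy activity: read delimited files from the blob dataset and load them
# into the Synapse (Dedicated SQL Pool) dataset.
copy_activity = CopyActivity(
    name="CopyBlobToSqlPool",
    inputs=[DatasetReference(reference_name="BlobDelimitedTextDataset", type="DatasetReference")],
    outputs=[DatasetReference(reference_name="SqlPoolTableDataset", type="DatasetReference")],
    source=DelimitedTextSource(),
    sink=SqlDWSink(allow_copy_command=True),  # bulk-load with the COPY statement
)

# Create (or update) the pipeline in the factory, then start a run.
pipeline = PipelineResource(activities=[copy_activity])
client.pipelines.create_or_update(resource_group, factory_name, "BlobToSqlPoolPipeline", pipeline)

run = client.pipelines.create_run(resource_group, factory_name, "BlobToSqlPoolPipeline")
print(f"Started pipeline run: {run.run_id}")
```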
For more detailed information on copying data from Azure Blob Storage into a Dedicated SQL Pool with Azure Data Factory, you can refer to the Microsoft documentation on the Copy activity and the Azure Synapse Analytics connector.
I hope this helps! Let me know if you have any further questions.