Thanks for reaching out on Microsoft Q&A.
Extract Data from SQL Server On-Premises: You can use Azure Data Factory to create a pipeline that copies data from your on-premises SQL Server database to Azure Blob Storage. You will need to create a self-hosted integration runtime (SHIR), which enables data movement between on-premises and cloud data stores.
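As an illustration, the SQL Server linked service is pointed at the self-hosted integration runtime through its connectVia property. The names and connection string below are placeholders for your own values:

```json
{
    "name": "OnPremSqlServerLinkedService",
    "properties": {
        "type": "SqlServer",
        "typeProperties": {
            "connectionString": "Server=myOnPremServer;Database=myDatabase;Integrated Security=True"
        },
        "connectVia": {
            "referenceName": "MySelfHostedIR",
            "type": "IntegrationRuntimeReference"
        }
    }
}
```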
Historical Load: To load only historical data, add a filter to your source query so that it selects just the records dated before 2023-12-17.
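For example, the Copy activity's source could use a query along these lines. The table and date column here are hypothetical; substitute whichever column tracks record dates in your schema:

```json
"source": {
    "type": "SqlServerSource",
    "sqlReaderQuery": "SELECT * FROM dbo.Sales WHERE ModifiedDate < '2023-12-17'"
}
```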
Dynamic Folder Creation: Azure Blob Storage doesn't natively support folders. However, you can create a virtual directory structure by prefixing the blob name with the desired folder path. For example, you can name your blobs like year/month/day/tablename.parquet, which gives the appearance of a folder hierarchy.
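One common way to do this is to parameterize the sink dataset and have the pipeline pass the path in, for instance via an expression like @{formatDateTime(utcnow(),'yyyy/MM/dd')}. A minimal sketch of such a Parquet dataset (the linked service and container names are placeholders) could look like:

```json
{
    "name": "BlobParquetSink",
    "properties": {
        "type": "Parquet",
        "linkedServiceName": {
            "referenceName": "AzureBlobStorageLinkedService",
            "type": "LinkedServiceReference"
        },
        "parameters": {
            "folderPath": { "type": "string" },
            "fileName": { "type": "string" }
        },
        "typeProperties": {
            "location": {
                "type": "AzureBlobStorageLocation",
                "container": "raw",
                "folderPath": { "value": "@dataset().folderPath", "type": "Expression" },
                "fileName": { "value": "@dataset().fileName", "type": "Expression" }
            }
        }
    }
}
```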
Full Load with Multiple Tables: You can use a Lookup activity to retrieve the list of table names from your control table, then use a ForEach activity to iterate over that list. Inside the ForEach loop, place a Copy activity that copies the current table from SQL Server to Azure Blob Storage (see the pipeline sketch after the next point).
Lookup and ForEach Activities: The Lookup activity can retrieve a dataset from any data source supported by Azure Data Factory and return the content of a configuration file or table. The ForEach activity defines a repeating control flow in your pipeline. It iterates over a collection and executes specified activities in a loop.
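Putting it together, a trimmed-down pipeline sketch might look like the following. The dataset, control table, and column names are assumptions, and the full definition would also include the Copy activity's inputs and outputs dataset references:

```json
{
    "name": "FullLoadPipeline",
    "properties": {
        "activities": [
            {
                "name": "LookupTableList",
                "type": "Lookup",
                "typeProperties": {
                    "source": {
                        "type": "SqlServerSource",
                        "sqlReaderQuery": "SELECT SchemaName, TableName FROM dbo.ControlTable"
                    },
                    "dataset": { "referenceName": "OnPremSqlDataset", "type": "DatasetReference" },
                    "firstRowOnly": false
                }
            },
            {
                "name": "ForEachTable",
                "type": "ForEach",
                "dependsOn": [
                    { "activity": "LookupTableList", "dependencyConditions": [ "Succeeded" ] }
                ],
                "typeProperties": {
                    "items": {
                        "value": "@activity('LookupTableList').output.value",
                        "type": "Expression"
                    },
                    "activities": [
                        {
                            "name": "CopyTableToBlob",
                            "type": "Copy",
                            "typeProperties": {
                                "source": {
                                    "type": "SqlServerSource",
                                    "sqlReaderQuery": {
                                        "value": "SELECT * FROM @{item().SchemaName}.@{item().TableName} WHERE ModifiedDate < '2023-12-17'",
                                        "type": "Expression"
                                    }
                                },
                                "sink": { "type": "ParquetSink" }
                            }
                        }
                    ]
                }
            }
        ]
    }
}
```

Setting firstRowOnly to false makes the Lookup return all rows, which the ForEach then consumes through @activity('LookupTableList').output.value.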
Hope this helps. If this answers your query, do click Accept Answer and Yes for "Was this answer helpful". And, if you have any further queries, do let us know.