Data Factory Pipeline Update

Nathan Carns · 176 Reputation points
Hi.
I have a Data Factory pipeline with a MariaDB database as the source and an Azure SQL database as the destination. I want to be able to add new tables from the source without having to recreate the entire pipeline. How would I do this? Tables and columns keep being added to the MariaDB database, and every time I'm having to delete and recreate the pipeline, as well as delete the tables inside the SQL DB. Is there an easy way to clear all the tables and views out of the database?
Please help with this.
Thanks.
Accepted answer
It depends on what you're trying to do: is it every table within your database, or just a predefined subset? If you do a Google search for something like "data factory dynamic pipeline", you'll find a number of people who have blogged about approaches. In short, you'll need a pre-step within the pipeline that identifies the tables to pull; you then use a ForEach activity to loop through the list of tables, passing one table at a time down to your transfer process.
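As a minimal sketch of that pre-step: a Lookup activity could run a query against MariaDB's information_schema to enumerate the source tables. The schema name 'mydb' below is a placeholder for your own schema:

```sql
-- Hypothetical Lookup query: enumerate every base table in the source schema.
-- Replace 'mydb' with the name of your MariaDB schema.
SELECT table_name
FROM information_schema.tables
WHERE table_schema = 'mydb'
  AND table_type = 'BASE TABLE';
```

The ForEach activity would then iterate over the Lookup output (e.g. @activity('LookupTables').output.value, where 'LookupTables' is whatever you named the Lookup activity), and inside the loop a Copy activity reads from a parameterized dataset whose table name is set to @item().table_name. New tables in MariaDB then get picked up on the next run without touching the pipeline definition.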
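On the cleanup part of the question: there's no single built-in "empty database" command, but a rough T-SQL sketch along these lines (assuming only user tables, views, and foreign keys need removing) can clear out the Azure SQL side:

```sql
-- Rough sketch: drop every user view and table in the current database.
-- Destructive by design; run it only against the staging/destination DB.
DECLARE @sql NVARCHAR(MAX) = N'';

-- Views first (they may depend on the tables).
SELECT @sql += N'DROP VIEW ' + QUOTENAME(s.name) + N'.' + QUOTENAME(v.name) + N';'
FROM sys.views v
JOIN sys.schemas s ON s.schema_id = v.schema_id
WHERE v.is_ms_shipped = 0;

-- Drop foreign key constraints so tables can be dropped in any order.
SELECT @sql += N'ALTER TABLE ' + QUOTENAME(s.name) + N'.' + QUOTENAME(t.name)
             + N' DROP CONSTRAINT ' + QUOTENAME(fk.name) + N';'
FROM sys.foreign_keys fk
JOIN sys.tables t ON t.object_id = fk.parent_object_id
JOIN sys.schemas s ON s.schema_id = t.schema_id;

-- Then the tables themselves.
SELECT @sql += N'DROP TABLE ' + QUOTENAME(s.name) + N'.' + QUOTENAME(t.name) + N';'
FROM sys.tables t
JOIN sys.schemas s ON s.schema_id = t.schema_id
WHERE t.is_ms_shipped = 0;

EXEC sp_executesql @sql;
```

An alternative is to skip the database-wide wipe entirely and have each Copy activity truncate or drop/recreate just its own destination table via the sink's pre-copy script.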
I want to extract and update all the tables from a MariaDB database. Every time a new table is created, a new column is added, or rows are appended, I want the source dataset to update automatically (or, worst case, I update the activity manually somehow).
I'll look into dynamic pipelines. Thanks.