I'm fairly new to Azure Data Factory and have run up against a limitation with destination datasets: each sink dataset must point to a single table. I have about 300 tables to handle overall, with various relationships and idiosyncrasies, so a parameterised ForEach loop doesn't seem viable; I also have additional sources coming soon and don't really want to have to set up potentially 1000 sink datasets.
To manage this, I've been attempting to set up a stored procedure in an Azure SQL Database that will connect to an on-premises SQL Server instance, query a number of tables in the on-prem DB, and populate/update a number of other tables in the Azure DB. My thinking is that if the SP is running in the Azure DB, it should be able to access and update those "local" tables, and all I need to work out is how to connect to the source (on-prem) DB from there.
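In rough T-SQL, the shape I'm after is something like this (all names are placeholders, and the four-part remote reference is the part I don't know how to do from Azure SQL):

```sql
CREATE PROCEDURE dbo.SyncCustomers
AS
BEGIN
    -- The target table lives in this (Azure) database, so this side is fine.
    MERGE dbo.Customers AS tgt
    USING (
        -- This is the unknown: how do I query the on-prem source from here?
        SELECT CustomerID, Name, ModifiedDate
        FROM OnPremServer.SourceDb.dbo.Customer  -- linked-server-style reference?
    ) AS src
        ON tgt.CustomerID = src.CustomerID
    WHEN MATCHED THEN
        UPDATE SET tgt.Name = src.Name,
                   tgt.ModifiedDate = src.ModifiedDate
    WHEN NOT MATCHED THEN
        INSERT (CustomerID, Name, ModifiedDate)
        VALUES (src.CustomerID, src.Name, src.ModifiedDate);
END;
```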
I've got some basic ADF pipeline interaction happening - I've queried an on-prem table and populated a destination Azure table - but this is all within ADF, where the source and destination datasets are set explicitly.
Are there any examples of an Azure SP querying the equivalent of a linked server? (OPENQUERY...?)
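For reference, this is the sort of thing I'd write on a regular on-premises instance, and what I'm hoping has an Azure equivalent (the linked server and table names are just placeholders):

```sql
-- On-premises SQL Server: query across a previously defined linked server.
SELECT *
FROM OPENQUERY(OnPremLinked, 'SELECT CustomerID, Name FROM SourceDb.dbo.Customer');

-- Or the four-part naming form:
SELECT CustomerID, Name
FROM OnPremLinked.SourceDb.dbo.Customer;
```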
Thanks,
Phil