Hi Scott Klein,
Thank you for using the Microsoft Q&A platform and for posting your query here.
As I understand it, you want to split the columns of a table in your on-premises SQL Server database and load them into multiple files in Azure Blob Storage using a mapping data flow.
Yes, your understanding is correct: mapping data flows don't support the self-hosted integration runtime. However, you can use an inline dataset in the data flow, which allows you to connect to an on-premises SQL Server.
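For illustration only, an inline-dataset source in data flow script might look roughly like the sketch below. The column list, schema/table names, and the store/format property values are assumptions for the sketch, not values from your environment; the exact property names may differ, so check the script the data flow UI generates for you:

```
source(output(
        id as integer,
        name as string,
        email as string
    ),
    allowSchemaDrift: true,
    validateSchema: false,
    store: 'sqlserver',
    format: 'table',
    schemaName: 'dbo',
    tableName: 'Customers') ~> source1
```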
Another workaround is to first load the table into Blob Storage using a Copy activity (which does support the self-hosted IR). Then, in a data flow, create multiple branches from the source transformation, use a Select transformation in each branch to pick the columns you need, and attach each branch to its own sink to write the different files, as sketched below.
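As a minimal sketch of the split itself, using made-up columns (id, name, email) and hypothetical stream names, the data flow script produced by branching the source and selecting different columns per branch would look along these lines:

```
source(output(
        id as integer,
        name as string,
        email as string
    ),
    allowSchemaDrift: true,
    validateSchema: false) ~> blobSource
blobSource select(mapColumn(
        id,
        name
    )) ~> SelectNames
blobSource select(mapColumn(
        id,
        email
    )) ~> SelectEmails
SelectNames sink(allowSchemaDrift: true,
    validateSchema: false) ~> NamesSink
SelectEmails sink(allowSchemaDrift: true,
    validateSchema: false) ~> EmailsSink
```

Each Select transformation reads from the same source stream (which is what "New branch" does in the UI), keeps only the columns it maps, and feeds its own sink, so each sink writes a separate file with its own subset of columns.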
Relevant document: https://techcommunity.microsoft.com/t5/azure-data-factory-blog/new-data-flow-connector-sql-server-as-source-and-sink/ba-p/2406213
I hope this helps. If it does, kindly accept the answer by clicking the Accept Answer button. Thank you!