Can we have 2 sequential ForEach activities in an ADF pipeline?

Kalantri, Payal 0 Reputation points
2023-07-05T10:24:39.27+00:00

Hi Team ,

We are trying to load multiple tables from Oracle to PostgreSQL in one go. However, we are facing case-sensitivity issues; we are trying a sequential ForEach activity for this scenario, but we are still not able to run the entire pipeline. Please suggest how we can overcome this. Loading 200 tables one by one definitely does not sound good. Your quick help is much appreciated.

Thanks,

Payal Kalantri

Azure Data Factory
An Azure service for ingesting, preparing, and transforming data at scale.

1 answer

  1. KranthiPakala-MSFT 46,602 Reputation points Microsoft Employee
    2023-07-05T23:27:56.8933333+00:00

    @Kalantri, Payal Welcome to the Microsoft Q&A forum and thanks for reaching out here.

    As per my understanding, there are two asks here:

    1. **We are trying to load multiple tables from Oracle to PostgreSQL in one go. Loading 200 tables one by one is definitely not sounding good. Your quick help is much appreciated.**

    Yes, you can use two sequential ForEach activities for loading the data from multiple tables, but I don't see a need for that here unless you have different schemas in your source. If you do have multiple schemas, then you may use multiple ForEach activities in your pipeline.

    In addition, I'm not sure what the specific need for a sequential ForEach copy is here. If there is no specific need, I recommend using a single ForEach activity with parallel execution (the default batch count is 20, and you can increase it up to 50).
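    For illustration only, here is a minimal sketch of what such a ForEach activity's JSON definition could look like with parallel execution enabled. The pipeline parameter name `tableList`, the inner `CopyTable` activity, and the dataset names are placeholders for the example, not taken from your pipeline:

    ```json
    {
        "name": "ForEachTable",
        "type": "ForEach",
        "typeProperties": {
            "isSequential": false,
            "batchCount": 50,
            "items": {
                "value": "@pipeline().parameters.tableList",
                "type": "Expression"
            },
            "activities": [
                {
                    "name": "CopyTable",
                    "type": "Copy",
                    "inputs": [ { "referenceName": "OracleSourceDataset", "type": "DatasetReference" } ],
                    "outputs": [ { "referenceName": "PostgreSqlSinkDataset", "type": "DatasetReference" } ],
                    "typeProperties": {
                        "source": { "type": "OracleSource" },
                        "sink": { "type": "AzurePostgreSqlSink" }
                    }
                }
            ]
        }
    }
    ```

    Here `batchCount` controls the degree of parallelism; if you omit it, up to 20 iterations run in parallel by default.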

    2. **However, we are facing case-sensitivity issues; we are trying a sequential ForEach activity for this scenario, but we are still not able to run the entire pipeline.**

    Since you haven't shared the error message here, we are not sure what the exact issue is. I would appreciate it if you could share the complete error message.

    If you are running into a column-name case-sensitivity issue (assuming you are using auto mapping), then I highly recommend using dynamic mapping in your copy activity for each table in the ForEach iteration to avoid the issue. Alternatively, you may explore using a Mapping Data Flow to transform the data to the desired or supported shape before loading it into your sink.
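    As a rough sketch of the dynamic mapping idea (the property name `mapping`, the table name, and the column names below are assumptions for illustration, not taken from your tables), you can parameterize the copy activity's translator so that each ForEach iteration passes an explicit column mapping that preserves the exact casing your PostgreSQL sink expects:

    ```json
    {
        "name": "CopyTable",
        "type": "Copy",
        "typeProperties": {
            "source": { "type": "OracleSource" },
            "sink": { "type": "AzurePostgreSqlSink" },
            "translator": {
                "value": "@item().mapping",
                "type": "Expression"
            }
        }
    }
    ```

    Each item in the table array you feed to the ForEach would then carry its own `TabularTranslator`, for example:

    ```json
    {
        "table": "CUSTOMERS",
        "mapping": {
            "type": "TabularTranslator",
            "mappings": [
                { "source": { "name": "CUSTOMER_ID" }, "sink": { "name": "customer_id" } },
                { "source": { "name": "FIRST_NAME" }, "sink": { "name": "first_name" } }
            ]
        }
    }
    ```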

    Hope this helps. Do let me know if you have further questions.


    Please don’t forget to Accept Answer and mark Yes for "was this answer helpful" wherever the information provided helps you, as this can be beneficial to other community members.

