generic transformation

arkiboys 9,686 Reputation points
2021-09-16T20:19:05.223+00:00

Hello,
At present, the pipeline works fine: it reads a config file and, based on the config values, goes to the required sources and lands the data into blob storage containers accordingly.
For example, based on the parameter passed, around 50 filtered tables get transferred to blob storage containers...

What I would like to have is as follows:

From the first container, each table should be transferred to a second container once the required fields have gone through the necessary transformations, if any are needed...
Doing the above would perhaps mean creating a separate dataflow for each table, each with the necessary checks to see whether its fields need any transformations, etc.
Question: how can I do these checks/transformations generically, so that I do not end up with so many dataflows but rather have one that handles all the transformations?
A long while back I did exactly the same thing in SSIS using a C# script task, but I am not sure how similar tasks can be done in Data Factory.
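What I am picturing, very roughly (only a sketch, and the details below are illustrative rather than tested), is one schema-drift-enabled mapping data flow, called from the existing config-driven loop with the table name passed in as a parameter, and a Derived Column with a column pattern so the same rule applies to whichever columns arrive, e.g. trim every string column:

    match:      type == 'string'
    column:     $$
    expression: trim(toString($$))

but I would like to hear whether that is a sensible direction or whether there is a better generic approach.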

Any suggestions?

Thank you

Azure Data Factory
An Azure service for ingesting, preparing, and transforming data at scale.

1 answer

  1. arkiboys 9,686 Reputation points
    2021-09-20T16:49:01.3+00:00

    Any transformation, for example trimming, applying some logic, or even checking for datatypes, etc.
    Thanks
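    To make the datatype check concrete, the sort of thing I have in mind (illustrative only, assuming the same column-pattern approach in a Derived Column) is an expression along these lines:

        iif(isNull(toInteger($$)), $$, toString(toInteger($$)))

    i.e. try the conversion and keep the original value when it does not parse; the same idea should work with toDate() or similar conversion functions.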
