Hello @Avishek Chowdhury ,
Thanks for the question and using MS Q&A platform.
Comments inline:
1. Can it work with a file as a source and an API as the destination? Meaning, read from files and call the API after massaging the data. (I know it can read from Blob file storage, but API as a destination?)
2. Let's say the answer to the above question is yes. After it has made the API call(s) successfully, based on the API response some calls might succeed and some might fail (not a technical failure, but a business validation failure). Can it create segregated success and failure reports/files? (i.e., the source file has 10 records, 5 of them failed and 5 succeeded; create one success file with 5 records and one failure file with 5 records)
Yes, it can. You can use a mapping data flow, e.g. read a JSON file and then pass the JSON to an API via the external call transformation. It also looks like you should go through the video from Mark: https://learn.microsoft.com/en-us/azure/data-factory/data-flow-external-call
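To make the success/failure segregation from question 2 concrete, here is a minimal Python sketch of the same logic. The API call is stubbed out, and the "amount must be positive" business rule is purely hypothetical; in a real pipeline the external call transformation (or an Azure Function) would make the HTTP call and return the validation result.

```python
import json

def call_api(record):
    # Stubbed API call. Hypothetical business validation: assume the
    # downstream API rejects records with a missing or non-positive "amount".
    ok = record.get("amount") is not None and float(record["amount"]) > 0
    return {"status": "success" if ok else "failure", "record": record}

def segregate(records):
    """Call the API per record and split results into success/failure lists."""
    successes, failures = [], []
    for rec in records:
        resp = call_api(rec)
        (successes if resp["status"] == "success" else failures).append(rec)
    return successes, failures

records = [{"id": i, "amount": a} for i, a in enumerate([10, 0, 25, None, 5])]
ok, bad = segregate(records)

# Write one success file and one failure file, as asked in question 2.
for name, rows in (("success.json", ok), ("failure.json", bad)):
    with open(name, "w") as f:
        json.dump(rows, f)
```

In a mapping data flow the equivalent would be a conditional split on the response column after the external call, with two sink files.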
3. Chunking capabilities:
- Does ADF support chunking?
- For example, let's say the source file contains 10k records, and I want to:
  - Process 1000 records as batch processing while reading and massaging,
  - Chunk it further into 100 records while sending to the downstream API; basically, chunking at every step.
I think we will have to build some pipelines to implement this.
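The two-level chunking described above (batches of 1000 for read/massage, sub-chunks of 100 per API call) can be sketched as follows. This is illustrative logic only, not ADF code; in ADF you would typically express it with a ForEach activity over partitions or ranges.

```python
def chunks(seq, size):
    """Yield successive fixed-size slices of seq."""
    for i in range(0, len(seq), size):
        yield seq[i:i + size]

records = list(range(10_000))          # stand-in for the 10k source records
sent_batches = 0
for batch in chunks(records, 1_000):   # outer batch for read/massage
    massaged = [r for r in batch]      # placeholder transformation
    for sub in chunks(massaged, 100):  # inner chunk per downstream API send
        sent_batches += 1              # here you would POST `sub` downstream
print(sent_batches)                    # 100 API calls of 100 records each
```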
4. Does it also support low-code basic transformations and validations, like: if it is null/empty, replace with 0, something like that?
Yes, it does: https://learn.microsoft.com/en-us/azure/data-factory/data-flow-transformation-overview
5. Saving the last processed record at the source and starting from the immediate next one in the event of failure:
- For example, there are 1000 records in a source file and ADF processed 200 before it crashed for some reason; after the issue is resolved and it starts running again, it should start from record #201 rather than from 1 again.
Data processing is initiated by triggers, and yes, if a run fails after some X records, you have the option to rerun the pipeline from the last failure point.
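Note that ADF's rerun works at the activity level; record-level resume like "start from #201" typically means persisting a watermark/checkpoint yourself (e.g. in a control table or file) and filtering the source on it. A minimal sketch of that pattern, with the checkpoint file name and crash simulation being assumptions for illustration:

```python
import json
import os

CHECKPOINT = "checkpoint.json"  # hypothetical watermark store

def load_checkpoint():
    """Return the index of the last successfully processed record (-1 if none)."""
    if os.path.exists(CHECKPOINT):
        with open(CHECKPOINT) as f:
            return json.load(f)["last_index"]
    return -1

def save_checkpoint(i):
    with open(CHECKPOINT, "w") as f:
        json.dump({"last_index": i}, f)

def process(records, fail_at=None):
    """Process records, resuming after the stored checkpoint;
    optionally simulate a crash at index fail_at."""
    start = load_checkpoint() + 1
    handled = []
    for i in range(start, len(records)):
        if i == fail_at:
            raise RuntimeError(f"simulated crash at record {i}")
        handled.append(records[i])
        save_checkpoint(i)
    return handled

records = list(range(1000))
try:
    process(records, fail_at=200)  # first run crashes after 200 records
except RuntimeError:
    pass
resumed = process(records)         # second run resumes at record #201
print(len(resumed))                # 800 records processed on resume
```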
Also, is there any other solution available in Azure which could potentially be a better-fit candidate? If yes, why?
ADF / Mapping Data Flow does take care of many of these asks, but Azure Functions may also be worth exploring.
Please do let me know if you have any queries.
Thanks
Himanshu
- Please don't forget to click on the Accept Answer or upvote button whenever the information provided helps you. Original posters help the community find answers faster by identifying the correct answer. Here is how
- Want a reminder to come back and check responses? Here is how to subscribe to a notification
- If you are interested in joining the VM program and help shape the future of Q&A: Here is how you can be part of Q&A Volunteer Moderators
Hello @Avishek Chowdhury ,
We haven't heard from you on the last response and were just checking back to see if you have a resolution yet. In case you have a resolution, please do share it with the community, as it can be helpful to others.
If you have any question relating to the current thread, please do let us know and we will try our best to help you.
In case you have any other question on a different issue, we request you to open a new thread.
Thanks
Himanshu