Post data into REST API using ADF

Dadheech, Raveesh 41 Reputation points
2023-02-21T12:11:18.72+00:00

Hi Team,

My requirement is to insert bulk records into a REST API using ADF. I created a pipeline and succeeded using these three steps:

  1. Web activity — fetch data from another API.
  2. Set Variable — store the output of the Web activity in an Array-type variable.
  3. ForEach — inside it, a new Web activity processes one element of the variable per iteration. Done.

However, this approach takes a long time when the record count is large. (I understand the limits of the variable and Web activity; I just need help speeding up the data transfer.)

Question 1: Can we post the whole output of the Array variable in the POST request body in a single attempt? The destination API supports taking bulk records.

Question 2: Can we pick specific fields from the response of step 1? Since it is an array, it cannot be stored in a String variable, so I am not able to work with the response before pushing it into the step 3 activity as @variables('data').

Note: I also tried a Copy activity previously, but to my surprise the pipeline executed successfully while no records were written to the destination.

Thanks for your acknowledgment.
    

Your suggestions would be appreciated; I have been struggling with this issue for the last 5-7 weeks.
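For question 2, each element of the array is exposed inside the ForEach as @item(), so specific fields can be picked per iteration without ever storing the array in a String variable. A minimal sketch of a per-iteration Web activity body (the field name A and the overall body shape are assumptions based on the sample request below):

```json
{
  "operation": "create",
  "resourceName": "DestinationtableRest",
  "data": [ { "A": "@{item().A}" } ]
}
```

Here @{item().A} is ADF string interpolation, which substitutes the field value of the current ForEach item into the body at runtime.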

Sample request body for the destination API:

{
  "operation": "create",
  "bulk": true,
  "resourceName": "DestinationtableRest",
  "data":
    [
      { "A": "1", ... },
      { "A": "2" }
    ]
}
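For question 1, a body in exactly this shape can be assembled in the Web activity's dynamic-content editor directly from the array variable, so the whole payload goes out in a single POST. A minimal sketch using only built-in ADF expression functions (the variable name data is an assumption; note that plain @{...} interpolation would serialize the array as a quoted string, which is why the body is rebuilt with json() and concat()):

```
@json(concat(
  '{"operation":"create","bulk":true,"resourceName":"DestinationtableRest","data":',
  string(variables('data')),
  '}'))
```

Whether a single request is acceptable then depends only on the destination API's payload size limit, not on ADF.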
Azure Data Factory
An Azure service for ingesting, preparing, and transforming data at scale.

1 answer

  1. Dadheech, Raveesh 41 Reputation points
    2023-04-06T08:35:46.6966667+00:00

    Finally I got the solution by implementing the pipeline with the help of several activities. I'm posting it here so it can help others with the same requirement. (I temporarily stored the data in Blob Storage and deleted it once the pipeline execution completed.) Activities:

    1. Copy Data activity — fetch the data and store it in Blob Storage (sink) in CSV format. (Not JSON, because the CSV option provides a way to specify a maximum number of rows per file, which splits the data into chunks inside a temporarily created folder, e.g. Input_1234, that the Get Metadata activity can easily operate on in the next steps.)
    2. Get Metadata activity with childItems in the Field list option, to fetch the details of the CSV files generated by step 1.
    3. ForEach loop whose Items setting is based on the Copy activity output:
       @if(greater(activity('Copyactivity1').output.rowsCopied,0), activity('MetadataActivity').output.childItems, variables('varEmptyArr'))

       *varEmptyArr is a declared variable containing an empty array, used in case no records are received from the source.
    
    4. Inside the ForEach, linked step 3 to a new Copy activity that converts the CSV data to JSON and stores it in Blob Storage in a different folder (e.g. output_1234).
    5. After step 4, added a Lookup activity, which picks the records from the JSON file so they can be used in the next Web activity's body option.
    6. Added a Web activity to post the data to the REST API, using the output of step 5 inside the body option.
    7. Used a Delete activity to delete each JSON file whose records have been pushed to the REST API in step 6.
    8. Outside the ForEach, finally used another Delete activity to delete all the CSV files from blob storage.
    9. It's done. I also used a couple of variables for holding parameter values; how you use them in your pipelines is up to you.
    10. So finally I can say that using Data Factory you can transfer data from one SAP/REST API to another REST API. :)
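    The Lookup-to-Web hand-off in steps 5-6 can be sketched like this, assuming the Lookup activity is named Lookup1 and "First row only" is unchecked so that output.value holds the full array of records (the activity name and body shape are assumptions):

    ```
    @json(concat(
      '{"operation":"create","bulk":true,"resourceName":"DestinationtableRest","data":',
      string(activity('Lookup1').output.value),
      '}'))
    ```

    Rebuilding the body with json() and concat() keeps data as a JSON array; interpolating the Lookup output directly into a string body would serialize it as quoted text instead.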
