Move data from an SAP API (SAP ECC) to a REST API using Data Factory without storing it in any storage

Dadheech, Raveesh 41 Reputation points
2022-12-09T11:16:10.887+00:00

Hi All,

Is there any way to move data from one API (an SAP API) to another API using Data Factory or Logic Apps, without storing the source API data anywhere (the data is sensitive)?

I tried it using a Copy Data activity, but I did not understand how the POST request will be processed, since there is no option for a request body on the sink side.

Seeking your comments on whether this is feasible.

Azure Data Factory
An Azure service for ingesting, preparing, and transforming data at scale.

2 answers

  1. ShaikMaheer-MSFT 38,546 Reputation points Microsoft Employee Moderator
    2022-12-12T09:05:36.273+00:00

    Hi @Dadheech, Raveesh ,

    Thank you for posting your query on the Microsoft Q&A platform.

    If I understand correctly, you are using an SAP ECC dataset as the source and a REST dataset as the sink, and you are not sure how the request body is sent to the API in the sink. Correct me if I am wrong.

    Please note, the REST connector as sink works with REST APIs that accept JSON. The data will be sent in JSON with the following pattern. As needed, you can use the copy activity schema mapping to reshape the source data to conform to the payload expected by the REST API.

    [  
        { <data object> },  
        { <data object> },  
        ...  
    ]  
    

    So, kindly use the Mapping tab to reshape the data that gets sent to the API; a rough sketch of such a copy activity follows. See the REST connector documentation to learn more about REST as a sink.
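
    As a minimal sketch only, such a copy activity might look roughly like the pipeline JSON below. The dataset names (SapEccSourceDataset, RestSinkDataset) and the mapped SAP fields (MATNR, MAKTX) are placeholders for illustration, and exact property names can differ from what your connectors emit.

       {
           "name": "CopySapToRest",
           "description": "Illustrative only: copy from an SAP ECC dataset straight to a REST sink, reshaping columns via the translator",
           "type": "Copy",
           "inputs": [ { "referenceName": "SapEccSourceDataset", "type": "DatasetReference" } ],
           "outputs": [ { "referenceName": "RestSinkDataset", "type": "DatasetReference" } ],
           "typeProperties": {
               "source": { "type": "SapEccSource" },
               "sink": { "type": "RestSink", "requestMethod": "POST", "httpRequestTimeout": "00:01:40" },
               "translator": {
                   "type": "TabularTranslator",
                   "mappings": [
                       { "source": { "name": "MATNR" }, "sink": { "name": "materialNumber" } },
                       { "source": { "name": "MAKTX" }, "sink": { "name": "description" } }
                   ]
               }
           }
       }

    The translator section is what the Mapping tab generates behind the scenes when you map source columns to sink fields.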

    Hope this helps. Please let me know if you have any further queries.

    -------------

    Please consider hitting the Accept Answer button. Accepted answers help the community as well.


  2. Dadheech, Raveesh 41 Reputation points
    2023-04-06T08:37:37.2566667+00:00
    Finally, I got the solution by implementing a pipeline with the help of several activities. I am posting it here so that it can help others with the same requirement. (I did temporarily store the data on blob storage and deleted it once the pipeline execution completed.)
    
    Step 1: A Copy Data activity fetches the data and stores it in blob storage (sink) in CSV format. CSV rather than JSON, because the CSV sink provides a "max rows per file" option, which writes the data in chunks inside a temporarily created folder (e.g. Input_1234) that the Get Metadata activity in the next step can easily enumerate. A rough sketch follows.
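
    A minimal sketch only, assuming placeholder datasets SapEccDataset (SAP ECC source) and CsvStagingDataset (a delimited-text dataset pointing at the temporary folder); the sink's formatSettings names may differ slightly in your exported JSON:

       {
           "name": "Copyactivity1",
           "description": "Illustrative only: stage SAP data as chunked CSV files in a temporary blob folder",
           "type": "Copy",
           "inputs": [ { "referenceName": "SapEccDataset", "type": "DatasetReference" } ],
           "outputs": [ { "referenceName": "CsvStagingDataset", "type": "DatasetReference" } ],
           "typeProperties": {
               "source": { "type": "SapEccSource" },
               "sink": {
                   "type": "DelimitedTextSink",
                   "storeSettings": { "type": "AzureBlobStorageWriteSettings" },
                   "formatSettings": {
                       "type": "DelimitedTextWriteSettings",
                       "fileExtension": ".csv",
                       "maxRowsPerFile": 10000
                   }
               }
           }
       }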
    Step 2: A Get Metadata activity, with "Child items" in its field list, fetches the details of the CSV files generated by step 1. A rough sketch follows.
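
    A minimal sketch, assuming a placeholder folder-level dataset CsvStagingFolder that points at the temporary folder from step 1:

       {
           "name": "MetadataActivity",
           "description": "Illustrative only: list the CSV chunk files produced by Copyactivity1",
           "type": "GetMetadata",
           "dependsOn": [ { "activity": "Copyactivity1", "dependencyConditions": [ "Succeeded" ] } ],
           "typeProperties": {
               "dataset": { "referenceName": "CsvStagingFolder", "type": "DatasetReference" },
               "fieldList": [ "childItems" ],
               "storeSettings": { "type": "AzureBlobStorageReadSettings" }
           }
       }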
    Step 3: A ForEach loop iterates on the basis of the copy activity output, using the following expression for its items:
    
       @if(greater(activity('Copyactivity1').output.rowsCopied,0) , activity('MetadataActivity').output.childItems,variables('varEmptyArr'))
       
       *varEmptyArr is a declared variable containing an empty array, used in case no records are received from the source.
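
    A minimal sketch of the ForEach wrapper; the activities array is left empty here and would contain steps 4-7 below:

       {
           "name": "ForEachCsvFile",
           "description": "Illustrative only: iterate over the CSV chunks, or over an empty array when nothing was copied",
           "type": "ForEach",
           "dependsOn": [ { "activity": "MetadataActivity", "dependencyConditions": [ "Succeeded" ] } ],
           "typeProperties": {
               "isSequential": true,
               "items": {
                   "value": "@if(greater(activity('Copyactivity1').output.rowsCopied, 0), activity('MetadataActivity').output.childItems, variables('varEmptyArr'))",
                   "type": "Expression"
               },
               "activities": []
           }
       }

    Setting isSequential to true keeps the files processed in order; it can be set to false if the target API tolerates parallel posts.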
    
    Step 4: Inside the ForEach, I linked step 3 to a new Copy activity that converts the CSV data into JSON and stores it in blob storage in a different folder (e.g. output_1234). A rough sketch follows.
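
    A minimal sketch, assuming parameterized placeholder datasets CsvStagingFile and JsonOutputFile that each take a fileName parameter:

       {
           "name": "CopyCsvToJson",
           "description": "Illustrative only: convert one CSV chunk to a JSON file in a second temporary folder",
           "type": "Copy",
           "inputs": [
               {
                   "referenceName": "CsvStagingFile",
                   "type": "DatasetReference",
                   "parameters": { "fileName": { "value": "@item().name", "type": "Expression" } }
               }
           ],
           "outputs": [
               {
                   "referenceName": "JsonOutputFile",
                   "type": "DatasetReference",
                   "parameters": { "fileName": { "value": "@replace(item().name, '.csv', '.json')", "type": "Expression" } }
               }
           ],
           "typeProperties": {
               "source": { "type": "DelimitedTextSource" },
               "sink": { "type": "JsonSink" }
           }
       }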
    Step 5: The next activity, after step 4, is a Lookup, which picks up the records from the JSON file; its output can then be used in the next Web activity's Body option. A rough sketch follows.
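
    A minimal sketch, reusing the placeholder JsonOutputFile dataset. Note that the Lookup activity output is capped (around 5,000 rows / 4 MB), so the max-rows-per-file value chosen in step 1 should stay under that limit:

       {
           "name": "LookupJsonRecords",
           "description": "Illustrative only: read the JSON records back so the Web activity can use them as a request body",
           "type": "Lookup",
           "typeProperties": {
               "source": { "type": "JsonSource" },
               "dataset": {
                   "referenceName": "JsonOutputFile",
                   "type": "DatasetReference",
                   "parameters": { "fileName": { "value": "@replace(item().name, '.csv', '.json')", "type": "Expression" } }
               },
               "firstRowOnly": false
           }
       }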
    Step 6: A Web activity posts the data to the REST API, using the output of step 5 inside the Body option. A rough sketch follows.
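
    A minimal sketch; the URL is a placeholder and authentication settings are omitted:

       {
           "name": "PostToRestApi",
           "description": "Illustrative only: POST the looked-up records to the target REST API",
           "type": "WebActivity",
           "typeProperties": {
               "url": "https://target-api.example.com/records",
               "method": "POST",
               "headers": { "Content-Type": "application/json" },
               "body": {
                   "value": "@activity('LookupJsonRecords').output.value",
                   "type": "Expression"
               }
           }
       }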
    Step 7: A Delete activity deletes the JSON file whose records have just been pushed to the REST API in step 6.
    Step 8: Outside the ForEach, a final Delete activity deletes all the CSV files from blob storage. Rough sketches of both Delete activities follow.
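
    Minimal sketches of both Delete activities, again with placeholder datasets; in some pipeline definitions the recursive flag may sit directly under typeProperties rather than under storeSettings:

       {
           "name": "DeleteJsonFile",
           "description": "Illustrative only, inside the ForEach: remove the JSON file that was just posted",
           "type": "Delete",
           "typeProperties": {
               "dataset": {
                   "referenceName": "JsonOutputFile",
                   "type": "DatasetReference",
                   "parameters": { "fileName": { "value": "@replace(item().name, '.csv', '.json')", "type": "Expression" } }
               },
               "enableLogging": false
           }
       }

       {
           "name": "DeleteCsvFolder",
           "description": "Illustrative only, outside the ForEach: clean up all staged CSV chunks",
           "type": "Delete",
           "typeProperties": {
               "dataset": { "referenceName": "CsvStagingFolder", "type": "DatasetReference" },
               "storeSettings": { "type": "AzureBlobStorageReadSettings", "recursive": true },
               "enableLogging": false
           }
       }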
    That's it. I also used a couple of variables for holding parameter values; how you use those in your pipelines is up to you.
    So finally, I can say that using Data Factory you can transfer data from one SAP API/REST API to another REST API. :)
    
    
