How to process a REST payload from a pipeline to a data flow as source, without Blob Storage

VeeraReddy Akkili 0 Reputation points
2024-01-23T01:31:22.91+00:00

Hi, our Azure Functions app triggers a pipeline by posting a payload, and the pipeline receives it. How do we send that payload to a data flow as a source without writing it to Blob Storage / Data Lake? Also, how can we validate the JSON payload for mandatory elements against a JSON schema, rather than with a Lookup activity, given that there are 500 elements in the payload?

Azure Data Factory
An Azure service for ingesting, preparing, and transforming data at scale.

3 answers

Sort by: Most helpful
  1. AnnuKumari-MSFT 34,551 Reputation points Microsoft Employee Moderator
    2024-01-24T09:34:15.1733333+00:00

    Hi VeeraReddy Akkili ,

    Welcome to Microsoft Q&A platform and thanks for posting your query.

    As I understand your query, you want to load the JSON payload received from the Azure Function app into a dataflow without first landing it in Blob Storage or Data Lake. It would help if you could share a screenshot of the pipeline you have created so far.

    To extract the JSON elements without using a Lookup activity, you can use the OPENJSON function in SQL. Point a Script activity or Lookup activity at a SQL dataset and call OPENJSON in the following format:

    DECLARE @json NVARCHAR(MAX) = N'[  
      {  
        "Order": {  
          "Number":"SO43659",  
          "Date":"2011-05-31T00:00:00"  
        },  
        "AccountNumber":"AW29825",  
        "Item": {  
          "Price":2024.9940,  
          "Quantity":1  
        }  
      },  
      {  
        "Order": {  
          "Number":"SO43661",  
          "Date":"2011-06-01T00:00:00"  
        },  
        "AccountNumber":"AW73565",  
        "Item": {  
          "Price":2024.9940,  
          "Quantity":3  
        }  
      }
    ]'  
       
    SELECT *
    FROM OPENJSON ( @json )  
    WITH (   
                  Number   VARCHAR(200)   '$.Order.Number',  
                  Date     DATETIME       '$.Order.Date',  
                  Customer VARCHAR(200)   '$.AccountNumber',  
                  Quantity INT            '$.Item.Quantity',  
                  [Order]  NVARCHAR(MAX)  AS JSON  
     )
    

    Here, instead of the hardcoded JSON, you can pass the Azure Function activity's output expression dynamically into the variable.
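
    As a sketch of that dynamic substitution (the variable name `payload` is an assumption based on the pipeline described later in this thread; the exact output property path of your function activity may differ), the Script activity's query could be built with dynamic content like this:

    ```sql
    -- Dynamic content for the Script activity query: the @{...} ADF expression
    -- is evaluated before the query runs. string() serializes the variable,
    -- and replace(..., '''', '''''') doubles single quotes so the result
    -- stays a valid T-SQL string literal.
    DECLARE @json NVARCHAR(MAX) = N'@{replace(string(variables('payload')), '''', '''''')}';

    SELECT *
    FROM OPENJSON(@json)
    WITH (
        Number   VARCHAR(200) '$.Order.Number',
        Date     DATETIME     '$.Order.Date'
    );
    ```

    Note that very large payloads can hit activity expression size limits, which is one more reason the staging-to-storage pattern is common.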

    Kindly find more details about OPENJSON here: https://youtu.be/hfujOfmiDDM?list=PLsJW07-_K61JkuvlNfCK9gTEft_N8MVsP&t=687

    Hope it helps. Please let us know how it goes. Thank you.


  2. VeeraReddy Akkili 0 Reputation points
    2024-01-25T05:15:03.5733333+00:00

    (attached image: pipeline screenshot)

    1. Receiving the payload from a REST call that invokes the run-pipeline method.
    2. After receiving it, storing the payload in a variable.
    3. Generating a file name with the pipeline runId.
    4. Storing the payload into Blob Storage with no escape characters.
    5. Reading the payload from Blob Storage and writing it to Cosmos DB.

    Questions:

    1. How can we insert into Cosmos DB without storing into Blob Storage? We received the payload from the REST call and don't want to store it in Blob Storage, as that step is redundant and not needed.
    2. Is there any activity that validates JSON against a JSON schema?
    3. We are sending the payload to the data flow as a variable/parameter. Can we use this as the source in the data flow? The payload is already in the pipeline and we don't want to read from Blob Storage as the source in the data flow.

  3. VeeraReddy Akkili 0 Reputation points
    2024-01-25T05:21:13.58+00:00

    Just for validating the payload, do we really have to make a call to SQL? Can't we do this in Data Factory itself, as this is basic functionality?
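
    There is no built-in pipeline activity that validates a payload against a JSON schema; inside a Mapping Data Flow the Assert transformation can enforce row-level rules, but full schema validation is usually done in the Azure Function before it triggers the pipeline. As a minimal sketch of that approach (the function name and the required-path list are illustrative; the third-party `jsonschema` package would give complete JSON Schema support, while this stdlib-only version checks required paths):

    ```python
    import json

    def missing_required(payload: dict, required_paths: list) -> list:
        """Return the dotted paths from required_paths that are absent in payload."""
        missing = []
        for path in required_paths:
            node = payload
            for key in path.split("."):
                if isinstance(node, dict) and key in node:
                    node = node[key]
                else:
                    missing.append(path)
                    break
        return missing

    # Example: validate the sample order payload from the first answer,
    # with "Order.Date" deliberately omitted.
    payload = json.loads('{"Order": {"Number": "SO43659"}, "AccountNumber": "AW29825"}')
    required = ["Order.Number", "Order.Date", "AccountNumber"]
    print(missing_required(payload, required))  # -> ['Order.Date']
    ```

    If the list of missing paths is non-empty, the function can return a 400 response instead of calling the pipeline-run API, so invalid payloads never enter the pipeline at all.
    
    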

