How to pass a JSON payload in Azure Data Factory Data Flow? The payload format from Copy Transform throws parse error

Gautam Warrier 5 Reputation points
2023-10-25T01:29:57.1766667+00:00

Hello,

I have an API to which I can connect and retrieve data from, using the Copy activity/transform in Azure Data Factory. It takes a JSON payload in the request body.

{
    "filters": {
        "from": "2023-10-13T18:00:00.000000Z",
        "to": "2023-10-13T18:59:59.999999Z"
    }
}

However, since the response is a complex JSON that needs to be flattened multiple times, I'm trying to implement the same logic using a Mapping Data Flow instead of the Copy activity.

For some reason, the source settings in the Mapping Data Flow don't seem to process the JSON request body the same way - they throw a generic parse error (pasted below). Is there a different or particular format in which the request body needs to be provided in the Data Flow?

Spark job failed: { "text/plain": "{\"runId\":\"22e7ce53-e63d-45c4-a4f4-4db42a05c26a\",\"sessionId\":\"b17813c2-1869-41a4-9802-e34c7272cf99\",\"status\":\"Failed\",\"payload\":{\"statusCode\":400,\"shortMessage\":\"com.microsoft.dataflow.broker.InvalidOperationException: DSL compilation failed: DF-DSL-001 - DSL stream has parsing errors\\nLine 4 Position 11: body: '\\nmismatched input ''' expecting {DECIMAL_LITERAL, HEX_LITERAL, OCT_LITERAL, BINARY_LITERAL, MAX_INT, MIN_INT, MAX_LONG, MIN_LONG, POSITIVE_INF, NEGATIVE_INF, '-', '!', '$', '~', ':', '(', '#', '[', '@(', '[]', FLOAT_LITERAL, HEX_FLOAT_LITERAL, STRING_LITERAL, REGEX_LITERAL, 'parameters', 'functions', 'stores', 'as', 'input', 'output', 'constant', 'expression', 'integer', 'short', 'long', 'double', 'float', 'decimal', 'boolean', 'timestamp', 'date', 'byte', 'binary', 'integral', 'number', 'fractional', 'any', IDENTIFIER, ANY_IDENTIFIER, META_MATCH, '$$', '$$$', '$#', OPEN_INTERPOLATE}\",\"detailedMessage\":\"Failure 2023-10-24 23:15:23.674 failed DebugManager.processJob, run=22e7ce53-e63d-45c4-a4f4-4db42a05c26a, errorMessage=com.microsoft.dataflow.broker.InvalidOperationException: DSL compilation failed: DF-DSL-001 - DSL stream has parsing errors\\nLine 4 Position 11: body: '\\nmismatched input ''' expecting {DECIMAL_LITERAL, HEX_LITERAL, OCT_LITERAL, BINARY_LITERAL, MAX_INT, MIN_INT, MAX_LONG, MIN_LONG, POSITIVE_INF, NEGATIVE_INF, '-', '!', '$', '~', ':', '(', '#', '[', '@(', '[]', FLOAT_LITERAL, HEX_FLOAT_LITERAL, STRING_LITERAL, REGEX_LITERAL, 'parameters', 'functions', 'stores', 'as', 'input', 'output', 'constant', 'expression', 'integer', 'short', 'long', 'double', 'float', 'decimal', 'boolean', 'timestamp', 'date', 'byte', 'binary', 'integral', 'number', 'fractional', 'any', IDENTIFIER, ANY_IDENTIFIER, META_MATCH, '$$', '$$$', '$#', OPEN_INTERPOLATE}\"}}\n" } - RunId: 22e7ce53-e63d-45c4-a4f4-4db42a05c26a

Azure Data Factory
An Azure service for ingesting, preparing, and transforming data at scale.

1 answer

  1. AnnuKumari-MSFT 34,566 Reputation points Microsoft Employee Moderator
    2023-10-25T09:50:31.7366667+00:00

Hi Gautam Warrier,

    Welcome to Microsoft Q&A platform and thanks for posting your question here.

As per my understanding, you are getting the exception "DF-DSL-001 - DSL stream has parsing errors" while trying to process a JSON body payload. Please let me know if that is not the correct understanding.

It seems there is an issue with how the JSON is formatted when the request is sent.

Could you confirm whether you are passing the request body as a parameter in the data flow?

You probably need to escape single quotes by doubling them ('') so that they are passed through the data flow expression in the proper format.
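As a minimal sketch (assuming the request body is supplied through the source's expression builder), the JSON body can be wrapped in a single-quoted string literal. Because the payload itself uses double quotes, it nests cleanly; only single quotes inside the string would need doubling:

```
'{"filters": {"from": "2023-10-13T18:00:00.000000Z", "to": "2023-10-13T18:59:59.999999Z"}}'
```

If the payload contained a literal single quote (e.g. a value like it's), that character would be written as '' inside the expression, which is what the "mismatched input" error in the log typically points to.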

For more details, kindly check out the following video: Write dynamic expression for SQL with single quotes around parameter value in Mapping data flows

Hope it helps. Kindly accept the answer by clicking on the Accept answer button. Thank you.

