Parsing a JSON File into CSV

Rohit 226 Reputation points
2022-10-11T12:51:02.613+00:00

(attachment: 249384-samplejson.txt)

Hello All,
I have a JSON file in the below format:

(screenshot of the input JSON: 249349-image.png)

I need to convert this JSON file into a CSV file with the below format:

Name,Id,extraCredit
ABC,1,1
DEF,2,
xyz,3,5
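
The attached sample JSON is not reproduced inline; based on the desired CSV above and the results.mappings.attributes path used in the accepted answer, a hypothetical input of roughly this shape would produce that output:

{
  "results": [
    {
      "mappings": [
        {
          "attributes": [
            { "Name": "ABC", "Id": 1, "extraCredit": 1 },
            { "Name": "DEF", "Id": 2 },
            { "Name": "xyz", "Id": 3, "extraCredit": 5 }
          ]
        }
      ]
    }
  ]
}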

Can someone help with this?
@Subashri Vasudevan @AnnuKumari-MSFT

Azure Data Factory
An Azure service for ingesting, preparing, and transforming data at scale.

Accepted answer
  1. AnnuKumari-MSFT 32,161 Reputation points Microsoft Employee
    2022-10-11T17:47:12.65+00:00

    Hi @Rohit ,

    Thank you for posting your question on the Microsoft Q&A platform.

    Kindly perform the steps below to achieve this requirement:

    1. Point the source dataset to the input .json file and select 'Array of documents' as the Document form in the JSON settings.

    (screenshot: source JSON settings, 249491-image.png)

    2. Use a Flatten transformation: select results.mappings.attributes[] as the 'Unroll by' option and results.mappings as the 'Unroll root'. Provide the input column mappings as shown below:

    (screenshot: Flatten transformation settings, 249521-image.png)

    3. Use a Sink transformation and select CSV (DelimitedText) as the format for the dataset. In the sink settings, select 'Output to single file' as the file name option and provide the desired output file name. Also set 'Single partition' in the Optimize tab of the sink.

    (screenshot: sink settings, 249463-image.png)

    4. Call this data flow from an ADF pipeline using a Data Flow activity and execute it (a quick sanity check of the flattening logic is sketched below).
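
    Outside of ADF, the same flatten can be sanity-checked with a small Python script. This is only a minimal sketch: the nested results.mappings.attributes structure and the Name/Id/extraCredit fields are assumed from the question (the original sample file is attached to the post, not shown here), and pandas is used purely for illustration; it is not part of the ADF solution.

    import pandas as pd

    # Hypothetical input shaped like the structure unrolled in step 2:
    # results[] -> mappings[] -> attributes[] with Name/Id/extraCredit fields.
    sample = {
        "results": [
            {
                "mappings": [
                    {
                        "attributes": [
                            {"Name": "ABC", "Id": 1, "extraCredit": 1},
                            {"Name": "DEF", "Id": 2},
                            {"Name": "xyz", "Id": 3, "extraCredit": 5},
                        ]
                    }
                ]
            }
        ]
    }

    # Flatten results.mappings.attributes[] into one row per attribute,
    # mirroring the Flatten transformation's 'Unroll by' setting.
    rows = [
        attr
        for result in sample["results"]
        for mapping in result["mappings"]
        for attr in mapping["attributes"]
    ]

    df = pd.DataFrame(rows, columns=["Name", "Id", "extraCredit"])
    df["extraCredit"] = df["extraCredit"].astype("Int64")  # keep blank cells rather than 1.0/5.0
    df.to_csv("output.csv", index=False)  # Name,Id,extraCredit / ABC,1,1 / DEF,2, / xyz,3,5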

    Hope this helps. Please let us know if you have any further queries.


2 additional answers

  1. David Broggy 5,701 Reputation points MVP
    2022-10-11T13:01:08.997+00:00

    Hi R-5499,
    There are several examples of parsing JSON to CSV with ADF here in the Q&A forum, e.g.:
    adf-parsing-json-column-values-in-csv-file.html

    If you've looked around and can't find exactly what you want, then apologies; simply re-post your concerns here.

    Good luck.


  2. Rohit 226 Reputation points
    2022-10-13T13:17:25.623+00:00

    Hello @AnnuKumari-MSFT , I am able to hit the REST API via a Lookup activity, but querying the same API via a data flow throws the below error:

    Spark job failed: {
    "text/plain": "{\"runId\":\"\",\"sessionId\":\"\",\"status\":\"Failed\",\"payload\":{\"statusCode\":400,\"shortMessage\":\"com.microsoft.dataflow.broker.InvalidOperationException: DSL compilation failed: DF-DSL-001 - DSL stream has parsing errors\nLine 5 Position 11: mismatched input ''' expecting {DECIMAL_LITERAL, HEX_LITERAL, OCT_LITERAL, BINARY_LITERAL, MAX_INT, MIN_INT, MAX_LONG, MIN_LONG, POSITIVE_INF, NEGATIVE_INF, '-', '!', '$', '~', ':', '(', '#', '[', '@(', '[]', FLOAT_LITERAL, HEX_FLOAT_LITERAL, STRING_LITERAL, REGEX_LITERAL, 'parameters', 'functions', 'stores', 'as', 'input', 'output', 'constant', 'expression', 'integer', 'short', 'long', 'double', 'float', 'decimal', 'boolean', 'timestamp', 'date', 'byte', 'binary', 'integral', 'number', 'fractional', 'any', IDENTIFIER, ANY_IDENTIFIER, META_MATCH, '$$', '$$$', '$#', OPEN_INTERPOLATE}\nLine 11 Position 22: extraneous input ':' expecting {')', ','}\",\"detailedMessage\":\"Failure 2022-10-13 09:47:11.129 failed DebugManager.processJob, run=, errorMessage=com.microsoft.dataflow.broker.InvalidOperationException: DSL compilation failed: DF-DSL-001 - DSL stream has parsing errors\nLine 5 Position 11: mismatched input ''' expecting {DECIMAL_LITERAL, HEX_LITERAL, OCT_LITERAL, BINARY_LITERAL, MAX_INT, MIN_INT, MAX_LONG, MIN_LONG, POSITIVE_INF, NEGATIVE_INF, '-', '!', '$', '~', ':', '(', '#', '[', '@(', '[]', FLOAT_LITERAL, HEX_FLOAT_LITERAL, STRING_LITERAL, REGEX_LITERAL, 'parameters', 'functions', 'stores', 'as', 'input', 'output', 'constant', 'expression', 'integer', 'short', 'long', 'double', 'float', 'decimal', 'boolean', 'timestamp', 'date', 'byte', 'binary', 'integral', 'number', 'fractional', 'any', IDENTIFIER, ANY_IDENTIFIER, META_MATCH, '$$', '$$$', '$#', OPEN_INTERPOLATE}\nLine 11 Position 22: extraneous input ':' expecting {')', ','}\"}}\n"
    } - RunId:

    Do I have to configure anything special in the data flow to avoid this error?