Unable to import JSON array to SQL table in Azure Data Factory

Sampath S 81 Reputation points
2022-08-30T07:47:16.247+00:00

I have a JSON array similar to this.

[
    {
        "heading": {
            "row": [
                {
                    "@captain": "dhoni",
                    "@runs": "50"
                },
                {
                    "@captain": "Kohli",
                    "@runs": "150"
                }
            ]
        }
    }
]

But I am not sure how to write a SQL script to import this JSON into my SQL table. Can anyone help with this?

Azure SQL Database
Azure Data Factory
SQL Server Integration Services

1 answer

  1. PeterSh 176 Reputation points
    2022-08-30T08:49:42.27+00:00

    You don't say how far along you are in getting this done, so I'll need to make some assumptions, the first being that you're using a pipeline with a simple 'Copy data' activity to drop the contents of the file into SQL.

    So, start out by creating your source dataset pointing at your input file. I'm going to assume you know how to do this, so you should have something similar to the following when you import the schema.

    [Screenshot 235997-2022-08-30-18-25-30-window.png: source dataset with the imported JSON schema]
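
    If it helps to see it outside the UI, the dataset behind that screen is just JSON. A minimal sketch of a JSON-format source dataset might look like the following; the linked service name, container and file name are placeholders for whatever your setup actually uses, and your file could just as easily sit in another supported store.

        {
            "name": "SourceCaptainsJson",
            "properties": {
                "type": "Json",
                "linkedServiceName": {
                    "referenceName": "<yourBlobStorageLinkedService>",
                    "type": "LinkedServiceReference"
                },
                "typeProperties": {
                    "location": {
                        "type": "AzureBlobStorageLocation",
                        "container": "<yourContainer>",
                        "fileName": "captains.json"
                    }
                }
            }
        }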

    From there, create your sink dataset pointing at your SQL server and destination table. Don't forget to select 'Auto create table' if you want Data Factory to create the destination table for you.
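
    Again as a rough sketch (names below are assumed), the sink dataset for an Azure SQL table is little more than a schema and table name against your SQL linked service; the 'Auto create table' choice itself is saved on the copy activity's sink settings rather than on the dataset, if I remember correctly.

        {
            "name": "SinkCaptainsTable",
            "properties": {
                "type": "AzureSqlTable",
                "linkedServiceName": {
                    "referenceName": "<yourAzureSqlLinkedService>",
                    "type": "LinkedServiceReference"
                },
                "typeProperties": {
                    "schema": "dbo",
                    "table": "Captains"
                }
            }
        }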

    Click on the Mappings tab, and you should see something like this:

    [Screenshot 236081-2022-08-30-18-29-11-window.png: the copy activity's Mapping tab before importing schemas]

    Click 'Import Schemas', and you should get something like the following:

    [Screenshot 235987-2022-08-30-18-33-19-window.png: imported mappings showing the captain and runs columns]

    This is starting to look promising - we can see that it is going to expose 'captain' and 'runs' columns.

    But if you look at the Name definition, you'll see it is defined as "$['heading']['row'][0]['captain']". The [0] indicates that it will only extract the first row (row zero). We don't want that.

    If you turn on the Advanced editor and choose an array from the Collection reference dropdown (in this case 'row'), you are telling the mapping that you want to unroll the array and map each array element to a row.

    [Screenshot 235988-2022-08-30-18-37-40-window.png: Advanced editor with 'row' selected as the Collection reference]

    It should automatically translate those mappings for you, assuming you had the Advanced editor turned on. If you didn't, you can just choose these mappings manually.

    [Screenshot 236026-2022-08-30-18-39-00-window.png: mappings after the collection reference has been applied]

    You should now have mappings that look like the above, though I did set those Type entries manually.
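
    For reference, if you open the copy activity's Code view, the translator section of the mapping you've just built should come out roughly like the sketch below. The column names and types here are assumptions based on the sample file (note the '@' prefix on the source property names), so adjust them to whatever your schema import actually produced.

        {
            "type": "TabularTranslator",
            "collectionReference": "$['heading']['row']",
            "mappings": [
                {
                    "source": { "path": "['@captain']" },
                    "sink": { "name": "captain", "type": "String" }
                },
                {
                    "source": { "path": "['@runs']" },
                    "sink": { "name": "runs", "type": "Int32" }
                }
            ]
        }

    The collectionReference is what tells the copy activity to unroll the 'row' array and write one SQL row per array element, which is exactly what choosing it in the Advanced editor does for you.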

    From there, save and run the pipeline, either with Debug or via a trigger, and it should happily drop the data into the sink table for you.

    [Screenshot 236082-2022-08-30-18-47-50-window.png: data loaded into the sink SQL table after the pipeline run]

    I hope that answers the question for you.

    2 people found this answer helpful.