Hello
I have a blob storage container with JSON files coming from an API, loaded there once per day. The task is to parse the JSONs and import them into an Azure SQL DB on a daily basis. The JSONs contain about 70 attributes with values. The structure is not always the same: some files have slightly more attributes than others. I do not want to import them all, but instead have a fixed list of the attributes I need.
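To illustrate what I mean by a "fixed list", here is a minimal Python sketch (names and the attribute list are just examples, not my real pipeline): it keeps only an allow-list of attributes and tolerates files where some of them are missing.

```python
# Hypothetical sketch: keep only a fixed allow-list of attributes and
# tolerate files where some attributes are missing (they become None).

WANTED = ["id", "type", "report-date", "brand"]  # example fixed list

def filter_record(record, wanted=WANTED):
    """Return only the wanted attributes; missing ones become None."""
    # Merge the top-level fields with the nested "attributes" object.
    flat = {**record, **record.get("attributes", {})}
    return {key: flat.get(key) for key in wanted}

sample = {"id": "c5-xxxx", "type": "products",
          "attributes": {"report-date": "2020-08-01", "extra": "ignored"}}
print(filter_record(sample))
# {'id': 'c5-xxxx', 'type': 'products', 'report-date': '2020-08-01', 'brand': None}
```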
I am trying the approach used in this tutorial:
upsert-to-azure-sql-db-with-azure-data-factory
The problem is that my JSON files are nested and have a lot of records. Here is an example (only the first part, as it is too big to show in full):
{
"data": [
{
"id": "c5-jqrrb-dc5x6-yvkwc-8r30c-yyyy-xxxx",
"type": "products",
"attributes": {
"report-date": "2020-08-01",
"region": "DD",
"online-store": "Amazon",
"brand": "Gillette",
"category": "Shave Care",
"manufacturer": "ZZ",
"is-competitor": false,
"dimension1": null,
"dimension2": null,
"dimension3": "Male Shave Prep",
"dimension4": null,
"dimension5": null,
"dimension6": "YES",
"dimension7": null,
"dimension8": "UNKNOWN",
"trusted": {
"rpc": "B0XXX",
"upc": "03XXX",
"mpc": null,
"product-description": "Gillette Foamy Lemon Shaving Foam 200ml"
},
"harvested": {
"url": "https://www.xxx.com",
"product-image-url": "https://images-xxx_.jpg",
"video-count": null,
"gallery-video-count": null,
"image-count": null,
"duplicate-image-count": null
},
My question is: how do I create a custom data type, as shown in the tutorial, for nested JSONs?
Is it possible at all?
Let's say I want to import these 3 attributes from my sample JSON into the SQL DB:
[id]
[type]
[attributes][report-date]
Since [attributes][report-date] sits lower in the hierarchy, how do I reference it in the custom data type SQL code?
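To make the hierarchy concrete, this is how I would reach those three values in Python (a minimal sketch over a cut-down copy of my sample; I assume the SQL side would need an equivalent nested path, such as `$.attributes."report-date"`, but that mapping is exactly what I am unsure about):

```python
import json

# Cut-down copy of my sample, keeping only the relevant nesting.
doc = json.loads("""
{"data": [
  {"id": "c5-jqrrb-dc5x6-yvkwc-8r30c-yyyy-xxxx",
   "type": "products",
   "attributes": {"report-date": "2020-08-01", "region": "DD"}}
]}
""")

# One tuple per record in the "data" array, drilling into "attributes".
rows = [(item["id"], item["type"], item["attributes"]["report-date"])
        for item in doc["data"]]
print(rows)
# [('c5-jqrrb-dc5x6-yvkwc-8r30c-yyyy-xxxx', 'products', '2020-08-01')]
```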