Is there a limitation on how data types of nested JSON are shown in an ADF dataset?

Anuganti Suresh 200 Reputation points
2024-01-30T05:37:09.8533333+00:00
{
    "code": "SUCCESS",
    "dataflow": {
        "input": [
            {
                "persons": {
                    "Type": "person",
                    "id": "12345",
                    "user_id": "abcd",
                    "email": "hari@hotmail.com",
                    "type": "Internal",
                    "name": "hari",
                    "role": [
                        {
                            "role_sno": 11,
                            "role_name": "personal",
                            "role_type": "confidential"
                        }
                    ],
                    "is_access": true,
                    "display_id": "",
                    "delet": false,
                    "ability": false
                },
                "columns": [
                    {
                        "Type": "partition",
                        "group_id": "MANAGEMENT",
                        "type_id": "UPDATED",
                        "public_ip": "0.1.1.4",
                        "version": "05",
                        "platform": "net",
                        "browse": "google",
                        "model": "nxt",
                        "time": 1673999150117,
                        "modified": "2024-01-16T14:25:50Z",
                        "partition": {
                            "online": true
                        },
                        "partition_data": {
                            "message": "",
                            "status": false,
                            "start_time": 0,
                            "end_time": 0,
                            "online": true,
                            "in_meet": false
                        }
                    },
                    {
                        "Type": "base",
                        "group_id": "id",
                        "type_id": "LOGGED",
                        "public_ip": "199.166.14.76",
                        "version": "50",
                        "platform": "quera",
                        "browse": "explorer",
                        "model": "mac",
                        "time": 1673987691319,
                        "modified": "2023-01-17T20:34:51Z"
                    },
                    {
                        "Type": "total",
                        "group_id": "id",
                        "type_id": "LOGGED",
                        "public_ip": "1.16.11.12",
                        "version": "50",
                        "platform": "quera",
                        "browse": "explorer",
                        "model": "mac",
                        "time": 1673987691319,
                        "modified": "2023-08-17T20:34:51Z"
                    },
                    {
                        "Type": "totimestandard",
                        "group_id": "ids",
                        "type_id": "out",
                        "public_ip": "20.11.00.16",
                        "version": "55",
                        "platform": "zero",
                        "browse": "unix",
                        "model": "mac",
                        "time": 1673900091319,
                        "modified": "2023-11-17T20:34:51Z"
                    }
                ]
            }
        ]
    }
}


Dataset: is there any limitation on showing the data types of JSON?
Compared with the JSON above, the dataset reads only up to columns / partition / partition_data; the remaining items are not displayed.

Azure Data Factory

Accepted answer
  AnnuKumari-MSFT 33,241 Reputation points Microsoft Employee
    2024-01-30T08:39:44.2766667+00:00

    Hi Anuganti Suresh, thank you for using the Microsoft Q&A platform and for posting your query here. As I understand the issue, the full schema is not displayed when you import the schema for a JSON file in a mapping data flow in Azure Data Factory, so you want to know whether there is a limit on the number of JSON objects that can be imported. Please let me know if that is not the issue here.

    A data flow always treats the first object of an array as the schema. So inside the columns[] array, the first JSON object is taken as the schema. It expects all items of the array to have the same schema; if they differ, it still uses the first object's schema as the schema of the whole array.

    The issue here is not the number of JSON objects that show up after importing the schema; it is the mismatch among the schemas of the array items. To test this, you can modify the JSON so the items share the same shape (for example, by adding the extra fields to the first item as well); they will then show up after importing the schema. So the issue is not a limitation on JSON size, it is the schema mismatch, which is not expected. Hope it helps. Kindly accept the answer by clicking on the Accept answer button. Thank you.
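
    The first-object-wins behavior described above can be illustrated with a small sketch (this is not ADF's actual implementation, just a hypothetical inference routine that mimics sampling only the first array element):

    ```python
    import json

    def infer_schema(value):
        """Recursively infer a schema. For arrays, look only at the FIRST
        element, mimicking how the data flow samples array schemas."""
        if isinstance(value, dict):
            return {k: infer_schema(v) for k, v in value.items()}
        if isinstance(value, list):
            return [infer_schema(value[0])] if value else []
        return type(value).__name__

    doc = json.loads("""
    {
      "columns": [
        {"Type": "partition", "partition_data": {"online": true}},
        {"Type": "base", "group_id": "id"}
      ]
    }
    """)

    schema = infer_schema(doc)
    # Only the first item's keys survive: "group_id" from the second
    # item is absent from the inferred schema.
    print(json.dumps(schema, indent=2))
    ```

    Running this shows that fields present only in later array items (like group_id above, or the fields missing from the first columns[] item in your payload) never reach the inferred schema, which matches what you see after importing the schema.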

    1 person found this answer helpful.

0 additional answers
