Hello and welcome @minh tran. While your question asks "how do I do ____", I'm afraid the reality isn't that simple. Please allow me the opportunity to offer some explanation.
UPDATE: In my haste I thought this was about regular JSON data, not JSON Schema. I will need to do some self-learning. My response below does not apply to schemas.
You tagged this as azure-data-factory. Can you tell me how your question ties in to Data Factory, or is this just about JSON with no tie-in?
JSON is a way of organizing and writing data. JSON by itself has no means to enforce uniqueness; however, properly written JSON should not have duplicate keys within the same object (the same scope). Enforcement of uniqueness must be done by either the writer or the reader.
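For instance, a reader can enforce uniqueness at parse time. Here is a minimal sketch in Python (my choice of language for illustration; reject_duplicates is my own helper, not part of any standard), using the json module's object_pairs_hook:

import json

def reject_duplicates(pairs):
    # The parser hands us every (key, value) pair in an object,
    # duplicates included, before it builds the dict.
    seen = {}
    for key, value in pairs:
        if key in seen:
            raise ValueError(f"duplicate key: {key!r}")
        seen[key] = value
    return seen

# Raises ValueError because "a" appears twice in the same object.
json.loads('{"a": 1, "a": 2}', object_pairs_hook=reject_duplicates)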
To better highlight what I mean, let's use a JSON validator. For expediency, I am using https://jsonlint.com/
Here are the examples to be discussed:
invalid, duplicate key:
{
  "a": 1,
  "a": 2
}
valid, duplicate items in array:
{
  "items": [
    "a",
    "a"
  ]
}
valid, objects in array with duplicate "id" property intended as unique:
{
  "items": [
    {
      "id": 123,
      "value": "x"
    },
    {
      "id": 123,
      "value": "Q"
    }
  ]
}
The first of these is technically wrong, but nothing prevents you from writing it. This rule is the closest JSON comes to ensuring uniqueness: when there are duplicate property keys, asking for a key's value becomes ambiguous (which one should be returned?). Quality JSON writers should implement checks to prevent this from happening.
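To see that ambiguity in practice, here is what Python's standard json module does with the first example (other parsers may keep the first value or raise an error; the JSON spec leaves this to the implementation):

import json

# Python resolves the ambiguity by letting the last occurrence
# of "a" win; nothing warns about the duplicate key.
data = json.loads('{"a": 1, "a": 2}')
print(data)  # {'a': 2}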
The middle example demonstrates that arrays have no requirement that their contents be unique. This is because arrays have an order; you can ask for an item by its index.
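Because position, not value, identifies an element, duplicate entries are perfectly usable. A quick Python illustration of the middle example:

import json

data = json.loads('{"items": ["a", "a"]}')
# The two "a" strings are distinct elements at distinct indexes.
print(data["items"][0])    # a
print(data["items"][1])    # a
print(len(data["items"]))  # 2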
In the bottom example, the user intended the "id" property to make each object unique, but this is a fallacy. Keys need only be unique within the same scope, and here each object is a separate scope, so the JSON is still valid even if it doesn't do what the user intended.
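If uniqueness of "id" matters to you, you must enforce it yourself when reading; the format will not do it for you. A minimal sketch in Python (again my own illustration, not anything built into JSON):

import json

doc = json.loads('{"items": [{"id": 123, "value": "x"}, {"id": 123, "value": "Q"}]}')

seen_ids = set()
for item in doc["items"]:
    # The document parsed without complaint; uniqueness of "id" is
    # a business rule, so we have to check it ourselves afterwards.
    if item["id"] in seen_ids:
        raise ValueError(f"duplicate id: {item['id']}")
    seen_ids.add(item["id"])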