AVRO mapping
Applies to: ✅ Microsoft Fabric ✅ Azure Data Explorer
Use AVRO mapping to map incoming data to columns inside tables when your ingestion source file is in AVRO format.
Each element in the mapping list defines the mapping for a specific column. These elements are constructed from three properties: Column, DataType, and Properties. Learn more in the data mappings overview.
Each AVRO mapping element may contain any of the following optional properties:
| Property | Type | Description |
|---|---|---|
| Field | string | Name of the field in the AVRO record. |
| Path | string | If the value starts with $, it's interpreted as the path to the field in the AVRO document that will become the content of the column in the table. The path that denotes the entire AVRO record is $. If the value doesn't start with $, it's interpreted as a constant value. Paths that include special characters should be escaped as ['Property Name']. For more information, see JSONPath syntax. |
| ConstValue | string | The constant value to be used for a column instead of some value inside the AVRO file. |
| Transform | string | Transformation that should be applied on the content with mapping transformations. |
Note
Field and Path are mutually exclusive.
The following alternatives are equivalent:
```json
[
  {"Column": "event_name", "Properties": {"Path": "$.EventName"}}
]
```

```json
[
  {"Column": "event_name", "Properties": {"Field": "EventName"}}
]
```
Important
For queued ingestion:
- If the table referenced in the mapping doesn't exist in the database, it gets created automatically, given that valid data types are specified for all columns.
- If a column referenced in the mapping doesn't exist in the table, it's added automatically to the table as the last column the first time data is ingested for that column, given a valid data type is specified for the column. To add new columns to a mapping, use the .alter ingestion mapping command, as shown in the sketch after this list.
- Data is batched using Ingestion properties. The more distinct ingestion mapping properties used, such as different ConstValue values, the more fragmented the ingestion becomes, which can lead to performance degradation.
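The following is a minimal sketch of what that could look like, assuming a pre-created mapping named Mapping_Name on a table named Table123 (the same names used in the examples later in this article). The .alter command replaces the mapping's content, so the full mapping, including the new column, is supplied:

```kusto
// Replace the pre-created AVRO mapping "Mapping_Name" on Table123 with a
// version that also maps the new event_type column. Adjacent string
// literals are concatenated into a single JSON string.
.alter table Table123 ingestion avro mapping "Mapping_Name"
'['
'  {"Column": "event_name", "Properties": {"Field": "Name"}},'
'  {"Column": "event_type", "Properties": {"Field": "Type"}}'
']'
```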
Examples
```json
[
  {"Column": "event_timestamp", "Properties": {"Field": "Timestamp"}},
  {"Column": "event_name", "Properties": {"Field": "Name"}},
  {"Column": "event_type", "Properties": {"Field": "Type"}},
  {"Column": "event_time", "Properties": {"Field": "Timestamp", "Transform": "DateTimeFromUnixMilliseconds"}},
  {"Column": "ingestion_time", "Properties": {"ConstValue": "2021-01-01T10:32:00"}},
  {"Column": "full_record", "Properties": {"Path": "$"}}
]
```
The mapping above is serialized as a JSON string when it's provided as part of the .ingest
management command.
````kusto
.ingest into Table123 (@"source1", @"source2")
    with
    (
        format = "AVRO",
        ingestionMapping =
        ```
        [
          {"Column": "column_a", "Properties": {"Field": "Field1"}},
          {"Column": "column_b", "Properties": {"Path": "$.[\'Field name with space\']"}}
        ]
        ```
    )
````
Pre-created mapping
When the mapping is pre-created, reference the mapping by name in the .ingest
management command.
```kusto
.ingest into Table123 (@"source1", @"source2")
    with
    (
        format = "AVRO",
        ingestionMappingReference = "Mapping_Name"
    )
```
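A mapping such as Mapping_Name can be pre-created with the .create table ... ingestion avro mapping management command. The following is a minimal sketch, reusing the Table123 and Mapping_Name names from the example above; the optional DataType property is included for illustration:

```kusto
// Pre-create an AVRO mapping named "Mapping_Name" on Table123.
// The mapping is passed as a JSON string; adjacent string literals
// are concatenated into a single value.
.create table Table123 ingestion avro mapping "Mapping_Name"
'['
'  {"Column": "event_name", "DataType": "string", "Properties": {"Field": "Name"}},'
'  {"Column": "event_time", "DataType": "datetime", "Properties": {"Field": "Timestamp", "Transform": "DateTimeFromUnixMilliseconds"}}'
']'
```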
Identity mapping
Use AVRO mapping during ingestion without defining a mapping schema (see identity mapping).
```kusto
.ingest into Table123 (@"source1", @"source2")
    with
    (
        format = "AVRO"
    )
```