I am looking to ingest features from our internal ArcGIS REST endpoint into an Azure SQL table using a data pipeline. Everything seemingly went fine: I was able to import the schema and map the source fields to the appropriate sink fields.
A sample of the source is as follows:
{
  "objectIdFieldName" : "OBJECTID",
  "uniqueIdField" : {
    "name" : "OBJECTID",
    "isSystemMaintained" : true
  },
  "globalIdFieldName" : "",
  "geometryType" : "esriGeometryPoint",
  "spatialReference" : {
    "wkid" : 102100,
    "latestWkid" : 3857
  },
  "fields" : [
    {
      "name" : "COMMENTS",
      "type" : "esriFieldTypeString",
      "alias" : "Comments",
      "sqlType" : "sqlTypeOther",
      "length" : 1024,
      "domain" : null,
      "defaultValue" : null
    },
    {
      "name" : "COUNTY",
      "type" : "esriFieldTypeString",
      "alias" : "County",
      "sqlType" : "sqlTypeOther",
      "length" : 50,
      "domain" : null,
      "defaultValue" : null
    },
    {
      "name" : "LOCATION",
      "type" : "esriFieldTypeString",
      "alias" : "Location",
      "sqlType" : "sqlTypeOther",
      "length" : 125,
      "domain" : null,
      "defaultValue" : null
    },
    {
      "name" : "OBJECTID",
      "type" : "esriFieldTypeOID",
      "alias" : "OBJECTID",
      "sqlType" : "sqlTypeOther",
      "domain" : null,
      "defaultValue" : null
    }
  ],
  "exceededTransferLimit" : true,
  "features" : [
    {
      "attributes" : {
        "COMMENTS" : "a comment",
        "LOCATION" : "a location",
        "OBJECTID" : 1
      },
      "geometry" : {
        "x" : -11331586.0513,
        "y" : 3704997.8542
      }
    }
  ]
}
Now, I am only mapping feature.attributes to the sink columns, so I would expect the output in the SQL table to read:
COMMENTS | LOCATION | OBJECTID
"a comment"|"a location"|1
However, it seems to be iterating through each field and then each attribute, so the result in the table is:
COMMENTS | LOCATION | OBJECTID
"a comment"|"a location"|1
"a comment"|"a location"|1
"a comment"|"a location"|1
I.e., it has iterated through the features three times (once for each field). Is there a way to retrieve ONLY the values from the features array? Am I missing something?
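For clarity, the behavior I am seeing is as if the copy produces one output row per (field, feature) pair rather than one per feature. A Python sketch of the observed behavior (just to illustrate the pattern, not the actual pipeline logic):

```python
import json

# Trimmed version of the sample response: three fields, one feature
payload = """{
  "fields": [
    {"name": "COMMENTS", "type": "esriFieldTypeString"},
    {"name": "LOCATION", "type": "esriFieldTypeString"},
    {"name": "OBJECTID", "type": "esriFieldTypeOID"}
  ],
  "features": [
    {"attributes": {"COMMENTS": "a comment", "LOCATION": "a location", "OBJECTID": 1},
     "geometry": {"x": -11331586.0513, "y": 3704997.8542}}
  ]
}"""

resp = json.loads(payload)

# Observed: one output row per (field, feature) pair, so with three
# fields and one feature I get three identical rows in the sink table
observed = [f["attributes"] for _field in resp["fields"] for f in resp["features"]]
print(len(observed))  # 3
```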