Pipeline failed when parsing JSON due to null values

braxx 426 Reputation points
2020-09-04T15:41:17.593+00:00

First of all, I am new to the ADF world.

I recently got a task which I am still struggling with. I am trying to parse nested JSON stored in Blob Storage and copy the content to a table I built in Azure SQL DB.

I created a pipeline and a mapping, but when running it I get the following error:

Operation on target CopyJsonTo Sql failed: ErrorCode=UserErrorSchemaMappingCannotInferSinkColumnType,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=Data type of column 'dimension1' can't be inferred from 1st row of data, please specify its data type in mappings of copy activity or structure of DataSet.,Source=Microsoft.DataTransfer.Common,'  

I guess it is because some of my data contains null values. Here is a sample of the JSON file:
22762-capture4.png
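
Since the actual file is only visible in the screenshot, here is a simplified illustration of the kind of structure involved (the field names are guesses taken from the error message and the values are made up), with dimension1 being null in the first record:

```json
{
    "rows": [
        { "id": 1, "dimension1": null,  "dimension2": "small" },
        { "id": 2, "dimension1": "red", "dimension2": "large" }
    ]
}
```

With data like this, there is nothing in the first record from which the copy activity could infer the data type of dimension1.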

Here are also some screenshots of my pipeline configuration for source, sink and mapping:
22690-capture5.png

22781-capture6.png

22782-capture7.png

Thanks in advance. Any suggestion will be valuable to me.


Accepted answer
  1. MartinJaffer-MSFT 26,026 Reputation points
    2020-09-04T20:52:21.553+00:00

    Hello @BartoszWachocki-4076 and welcome to Microsoft Q&A and Data Factory. Thank you for your question.

    I was able to reproduce your issue... until I hooked up the SQL. However, I think I know where to fix this anyway.
    First, go to your sink dataset
    22699-image.png
    and import the schema.
    22768-image.png
    Then go back to the copy activity mapping and import schemas again (to ensure it is up to date). If you click Clear first, don't forget to check the collection reference.
    22744-image.png

    The idea behind this is to make the mapping explicit to Data Factory, so it doesn't try to guess the data type from the data.
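
    If you prefer editing the copy activity JSON directly rather than using the UI, the explicit mapping ends up looking roughly like the sketch below (a TabularTranslator with the sink types spelled out; the column names, the types and the $['rows'] collection reference are placeholders based on the error message, not your actual schema):

    ```json
    "translator": {
        "type": "TabularTranslator",
        "collectionReference": "$['rows']",
        "mappings": [
            {
                "source": { "path": "['id']" },
                "sink": { "name": "id", "type": "Int32" }
            },
            {
                "source": { "path": "['dimension1']" },
                "sink": { "name": "dimension1", "type": "String" }
            }
        ]
    }
    ```

    Once the sink types are spelled out like this, Data Factory no longer needs to infer them from the first row of data, so a null in that row doesn't matter.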
    Please let me know if this solves your issue. If not, we can try the same on the source side, and if that doesn't work, let me know and I'll figure something out.

    Welcome and Thank you
    Martin


0 additional answers
