Pipeline fails at data flow

Rose Kipgen 26 Reputation points Microsoft Employee
2020-07-30T08:51:55.913+00:00

I have created a Copy Data activity and a Data Flow activity and added them to my pipeline.
When I debug the pipeline, the Copy Data activity runs successfully; however, it fails at the Data Flow activity with this error:
{"message":"at Sink 'OutputStreamName': java.util.NoSuchElementException: None.get. Details:at Sink 'OutputStreamName': java.util.NoSuchElementException: None.get","failureType":"UserError","target":"DataFlowName","errorCode":"DFExecutorUserError"}
A few things I want to highlight: the copy operation is from Cosmos DB to SQL DB, and both datasets are schema-less.

Azure Data Factory

Accepted answer
  HarithaMaddi-MSFT 10,136 Reputation points
  2020-07-30T12:44:09.753+00:00

    Hi @Rose Kipgen,

    Welcome to Microsoft Q&A and thanks for reaching out.

    We can use "Insert Explicit Structure" option in "Derived Column" transformation in Mapping dataflow that separates the fields from array in JSON and then they can be mapped to SQL columns in Sink as shown in GIF below.

    Hope this helps! Do let us know if you have any queries.

    14876-cosmossqljsonload.gif

