I have created a Data Flow that reads data from a CSV file in Blob Storage and inserts it into a table in Dataverse. While debugging it, I get the following error:
Error: DFExecutorUserError
Job failed due to reason: None.get
Here is the detailed error:
Operation on target Data flow1 failed: {"StatusCode":"DFExecutorUserError","Message":"Job failed due to reason: None.get","Details":"java.util.NoSuchElementException: None.get\n\tat scala.None$.get(Option.scala:347)\n\tat scala.None$.get(Option.scala:345)\n\tat com.microsoft.datafactory.dataflow.MetricsUtility$$anonfun$getMetricsForSink$5.apply(MetricsUtility.scala:307)\n\tat com.microsoft.datafactory.dataflow.MetricsUtility$$anonfun$getMetricsForSink$5.apply(MetricsUtility.scala:296)\n\tat scala.collection.immutable.List.foreach(List.scala:392)\n\tat com.microsoft.datafactory.dataflow.MetricsUtility$.getMetricsForSink(MetricsUtility.scala:296)\n\tat com.microsoft.datafactory.dataflow.MetricsUtility$.metricsToPayload(MetricsUtility.scala:464)\n\tat com.microsoft.datafactory.dataflow.AdmsClient.getMonitoringPayloadInternal(AdmsClient.scala:485)\n\tat com.microsoft.datafactory.dataflow.AdmsEventListener$$anon$1$$anonfun$run$1$$anonfun$apply$mcV$sp$8$$anonfun$apply$1.apply(AdmsEventListener.scala:90)\n\tat com.microsoft.datafactory.dataflow.AdmsEventListener$$anon$1$$anonfun$run$1$$anonfun$apply$mcV$sp$8$$anonfun$apply$1.apply(AdmsEventListener.scala:90)\n\tat sca"}
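For context on the exception itself: the stack trace shows `java.util.NoSuchElementException: None.get`, which is what Scala throws when `.get` is called on an empty `Option` (`None`). This happens here inside Data Factory's own `MetricsUtility.getMetricsForSink`, not in user code, so it suggests an internal metrics lookup found no value for the sink. A minimal sketch of the same failure mode, using Java's `Optional` as a stand-in for Scala's `Option` (the class name and values are purely illustrative, not ADF code):

```java
import java.util.NoSuchElementException;
import java.util.Optional;

public class NoneGetDemo {
    public static void main(String[] args) {
        // Analogous to Scala's None.get: calling get() on an empty Optional
        // throws NoSuchElementException, the same exception class seen in
        // the Data Factory stack trace above.
        Optional<Integer> missing = Optional.empty();
        try {
            missing.get(); // throws NoSuchElementException
        } catch (NoSuchElementException e) {
            System.out.println("caught: " + e.getClass().getSimpleName());
        }
        // Defensive alternative: supply a fallback instead of calling get().
        System.out.println(missing.orElse(-1));
    }
}
```

Since the unguarded `.get` is inside Microsoft's library code, this points to a product-side issue (or an unexpected sink/metrics state) rather than something fixable in the data itself.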
I also tried exporting the data as CSV from the sink and importing it into Dataverse manually, which works fine. The same data cannot be inserted via the pipeline.
I am also unable to find any relevant information about this error.
Failed Pipeline RunId: 75ffbe2f-eb38-4920-94a1-1e9c26a70c6d
Azure Data Factory Region: West Europe