"DFExecutorUserError: Job failed due to reason: None.get" | ADF, Data Flow, Snowflake
I continue to get the error below from an ADF data flow that sinks to Snowflake, despite the troubleshooting steps listed after it:
```
Operation on target <DATAFLOW_NAME> failed: {"StatusCode":"DFExecutorUserError","Message":"Job failed due to reason: None.get","Details":"java.util.NoSuchElementException: None.get\n\tat scala.None$.get(Option.scala:347)\n\tat scala.None$.get(Option.scala:345)\n\tat com.microsoft.datafactory.dataflow.MetricsUtility$$anonfun$getMetricsForSink$5$$anonfun$apply$18.apply(MetricsUtility.scala:345)\n\tat com.microsoft.datafactory.dataflow.MetricsUtility$$anonfun$getMetricsForSink$5$$anonfun$apply$18.apply(MetricsUtility.scala:342)\n\tat scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:130)\n\tat scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:130)\n\tat scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:236)\n\tat scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:40)\n\tat scala.collection.mutable.HashMap.foreach(HashMap.scala:130)\n\tat com.microsoft.datafactory.dataflow.MetricsUtility$$anonfun$getMetricsForSink$5.apply(MetricsUtility.scala:342)\n\tat com.microsoft.datafactory.dataflow.MetricsUtility$$anonfun$getMetricsForSink$5.apply(MetricsUtility.scala:296)\n\tat scala.collection.immutable.List."}
```
So far, I have:
- Googled extensively
- Verified the data types are the same in the source data and the sink dataset (Snowflake)
- Re-ran the pipeline (both manually and via the scheduled trigger); the error persists
- Increased the compute size
- Verified the columns are mapping correctly; I'm using manual mapping rather than auto-mapping, because auto-mapping doesn't work either (see the sink script sketch after this list)
- Recreated the data flow from scratch in the same pipeline
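
For reference, this is roughly what the sink portion of my data flow script looks like with the manual mapping in place. The transformation, sink, and column names here are placeholders (the real ones differ), and the exact set of sink options on a Snowflake sink may not match this sketch exactly:

```
PreviousTransform sink(allowSchemaDrift: true,
	validateSchema: false,
	skipDuplicateMapInputs: true,
	skipDuplicateMapOutputs: true,
	mapColumn(
		COLUMN_A,
		COLUMN_B,
		COLUMN_C
	)) ~> SnowflakeSink
```

Even with an explicit mapColumn mapping like this, the run still fails with the None.get error above.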
I have multiple other pipelines that do not receive this error. Any suggestions before I open a ticket with Microsoft?