Posting Business Central JSON data from blob using ADF

Kayode Ogidan 90 Reputation points
2024-05-24T03:26:30.43+00:00

Job failed due to reason: at Sink 'RESTAPIOutput': Job aborted due to stage failure: Task 3 in stage 108.0 failed 1 times, most recent failure: Lost task 3.0 in stage 108.0 (TID 148) (vm-0cf09028 executor 1): java.lang.NullPointerException
    at org.apache.spark.sql.execution.datasources.rest.RestClient.executeSingleRowRequest(RestClient.scala:197)
    at org.apache.spark.sql.execution.datasources.rest.RestClient.$anonfun$savePartitionSingle$5(RestClient.scala:163)
    at org.apache.spark.sql.execution.datasources.rest.RestClient.$anonfun$savePartitionSingle$5$adapted(RestClient.scala:160)
    at scala.collection.Iterator.foreach(Iterator.scala:941)
    at scala.collection.Iterator.foreach$(Iterator.scala:941)
    at scala.collection.AbstractIterator.foreach(Iterator.scala:1429)
    at org.apache.spark.sql.execution.datasources.rest.RestClient.$anonfun$savePartitionSingle$1(RestClient.scala:160)
    at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
    at scala.util.Try$.apply(Try.scala:213)
    at org.apache.spar

@PRADEEPCHEEKATLA-MSFT @phemanth @Harishga Please help. I am posting data to Business Central using a copy activity with about 20 CSV files (331 input rows in total). The job fails on the sink with the error above. What could be wrong?
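The failure in `RestClient.executeSingleRowRequest` suggests the sink crashes while serializing an individual row, and null or empty fields in the source CSVs are a common trigger. As a diagnostic outside ADF, one can replicate the row-by-row JSON payloads the REST sink would build and drop null fields first. This is only a minimal sketch: the CSV shape and column names are invented for illustration, not taken from the actual pipeline.

```python
import csv
import io
import json

def rows_without_nulls(csv_text):
    """Parse CSV text and drop empty/null fields from each row,
    mirroring the per-row JSON body a REST sink would POST."""
    reader = csv.DictReader(io.StringIO(csv_text))
    cleaned = []
    for row in reader:
        # Keep only fields that have a real value; empty strings in a
        # CSV cell often surface as nulls downstream.
        cleaned.append({k: v for k, v in row.items() if v not in (None, "")})
    return cleaned

# Hypothetical sample resembling one of the 20 source CSV files.
sample = "name,amount,notes\nWidget,10,\nGadget,,urgent\n"
payloads = [json.dumps(r) for r in rows_without_nulls(sample)]
for p in payloads:
    print(p)
```

If the payloads built this way all look valid but the sink still fails, the problem is more likely in the sink configuration (authentication, request interval, or batch settings) than in the data itself.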

Azure Data Factory
An Azure service for ingesting, preparing, and transforming data at scale.
