Azure Pipeline Load Failure: Failed to execute dataflow with internal server error
I have been receiving this error for several days; re-running the pipeline manually yields the same result. Any guidance or insight on how to resolve it?
Operation on target rptSalesDailyLoad failed:
  Operation on target Fact Sales Load failed:
    Operation on target df_factSales failed:

StatusCode: DF-Executor-InternalServerError
Message: Job failed due to reason: at Sink 'FinalFactDel': Failed to execute dataflow with internal server error, please retry later. If issue persists, please contact Microsoft support for further assistance
Details: org.apache.spark.SparkException: Job aborted due to stage failure: Task 11 in stage 100.0 failed 1 times, most recent failure: Lost task 11.0 in stage 100.0 (TID 4626) (vm-14295733 executor 1): ExecutorLostFailure (executor 1 exited caused by one of the running tasks) Reason: Container from a bad node: container_1692899079555_0001_01_000002 on host: vm-14295733. Exit status: 143. Diagnostics:
[2023-08-24 18:40:18.981] Container killed on request. Exit code is 143
[2023-08-24 18:40:18.982] Container exited with a non-zero exit code 143.
[2023-08-24 18:40:18.997] Killed by external signal

Driver stacktrace:
    at com.microsoft.dataflow.FileStoreExceptionHandler$.extractRootCause(FileStoreExceptionHandler.scala:36)
    at com.microsoft.dataflow.transformers.DefaultFileWriter.$anonfun$write$16(FileStore.scala:1206)
    at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
    at scala.util.Try$.apply(Try.scala:213)
    at com.microsoft.dataflow.transformers.DefaultFileWriter.$anonfun$write$15(FileStore.scal
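From what I have read, exit code 143 on a data flow executor means the Spark container was killed, often under memory pressure, so one thing I am considering is running the data flow on a larger, memory-optimized cluster. For reference, this is a minimal sketch of where that setting lives in the pipeline JSON, assuming the standard Execute Data Flow activity schema (the compute values shown are what I would try next, not my current configuration):

    {
        "name": "df_factSales",
        "type": "ExecuteDataFlow",
        "typeProperties": {
            "dataFlow": {
                "referenceName": "df_factSales",
                "type": "DataFlowReference"
            },
            "compute": {
                "computeType": "MemoryOptimized",
                "coreCount": 16
            },
            "traceLevel": "Fine"
        }
    }

Setting traceLevel to "Fine" should also surface more detail in the activity output if it fails again. Has anyone seen larger compute resolve a recurring DF-Executor-InternalServerError like this, or is "Container from a bad node" more likely a transient infrastructure issue on Microsoft's side?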