Ray Chung, Greenwood, Justin - Administrator:
A record was created and included in incremental folder 1. The same record was later edited and included in incremental folder 2. Processing of incremental folder 1 failed, while processing of incremental folder 2 was successful. After a rerun, processing of incremental folder 2 was still successful. Which values of the record are current: those from incremental folder 1 or folder 2?
When the Orchestrator pipeline encounters an unexpected error and fails for a given folder (incremental folder 1), DataverseToSQLPipelineProcessingLog gets an entry of 0 (failure) for the failed folder. Subsequent folders (e.g., incremental folder 2) are then marked as 3 (skipped), not as successful (1). Because the previous folder is not successful (status code <> 1), all subsequent folders are skipped (status code = 3).
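The skip rule described above can be sketched as follows. This is a minimal illustration only, with assumed status codes (1 = success, 0 = failure, 3 = skipped); `next_status` is a hypothetical helper, not part of the actual template:

```python
# Assumed status codes used by DataverseToSQLPipelineProcessingLog.
SUCCESS, FAILURE, SKIPPED = 1, 0, 3

def next_status(previous_statuses, current_run_failed):
    """Return the status the orchestrator would log for the current folder."""
    # If any earlier folder is not successful, the current folder is skipped
    # before its copy even starts -- regardless of whether it would succeed.
    if any(s != SUCCESS for s in previous_statuses):
        return SKIPPED
    return FAILURE if current_run_failed else SUCCESS

# Folder 1 fails, so folder 2 is skipped even though its own run would succeed.
log = [next_status([], current_run_failed=False or True)]  # folder 1 -> 0
log.append(next_status(log, current_run_failed=False))     # folder 2 -> 3
print(log)  # [0, 3]
```

This is why a rerun of folder 2 alone cannot help: as long as folder 1's row is not 1, folder 2 keeps landing in the skipped state.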
To break out of this skipped-folder loop, please follow the resolution steps below:
- Identify and resolve the root cause of the pipeline failure.
- Manually run the DataverseToSQL pipeline for the failed folder.
- After the manual run completes successfully, update the corresponding row in DataverseToSQLPipelineProcessingLog to 1 (success).
- Chronologically, sequentially, and manually process the subsequent skipped (3) folders, and manually update the Status column of the corresponding DataverseToSQLPipelineProcessingLog rows to 1 (success).
- Once the failed folder and all skipped folders are marked as successful, the DataverseToSQL_Orchestrator pipeline will automatically process the next folder on the next trigger.
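For the manual status update step, here is a minimal sketch using an in-memory SQLite table as a stand-in for the real DataverseToSQLPipelineProcessingLog table. The FolderName/Status column names and the timestamp-style folder names are assumptions for illustration:

```python
import sqlite3

# Stand-in for the real DataverseToSQLPipelineProcessingLog table.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE DataverseToSQLPipelineProcessingLog (FolderName TEXT, Status INTEGER)"
)
conn.executemany(
    "INSERT INTO DataverseToSQLPipelineProcessingLog VALUES (?, ?)",
    [("2024-01-01T00.00.00", 0),   # failed folder
     ("2024-01-02T00.00.00", 3)],  # skipped folder
)

# After the manual rerun of the failed folder succeeds, flip its row to 1.
conn.execute(
    "UPDATE DataverseToSQLPipelineProcessingLog SET Status = 1 WHERE FolderName = ?",
    ("2024-01-01T00.00.00",),
)
print(conn.execute(
    "SELECT FolderName, Status FROM DataverseToSQLPipelineProcessingLog "
    "ORDER BY FolderName"
).fetchall())
# [('2024-01-01T00.00.00', 1), ('2024-01-02T00.00.00', 3)]
```

Against the real database you would run the equivalent UPDATE statement directly, one row per manually reprocessed folder.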
You may wonder: with several skipped folders, manually executing each one and then updating DataverseToSQLPipelineProcessingLog is time-consuming and error-prone. Is there a better approach?
Recommendation:
Yes. In that case, to avoid the manual effort, please consider building a new pipeline that:
- Retrieves the skipped folders chronologically.
- Executes the DataverseToSQL pipeline sequentially.
- Updates the Status column of the corresponding DataverseToSQLPipelineProcessingLog rows to 1 (success).
Below is a sample view of what such a pipeline for processing skipped folders looks like:
Greenwood, Justin - Administrator: From your earlier response I see that 594 folders have status <> 1. Since their status is <> 1, all subsequent folders will be skipped and the main Execute DataverseToSQLPipeline activity will not run. To overcome the issue, as described above, please identify why the initial folders are being skipped, reprocess them in chronological order, and update the log status to 1 for those folders. Once the backlog is cleared, subsequent folders will be copied automatically by the main Orchestrator pipeline.
"firstRow": {
"cnt": 594
},
Important note: If the pipeline is being skipped without any failure, it means prior pipeline runs for previous folders either failed (status = 0) or were skipped (status = 3) because pipeline execution times overlapped, as shown in the image below.
To avoid overlapping executions, as mentioned in my previous posts, set concurrency = 1 on your orchestrator pipeline. That way only one execution is in progress at a time, and subsequent pipeline runs are queued and executed in chronological order. Ensuring the Concurrency setting of the Orchestrator pipeline is 1 will prevent this scenario in the future. I will also give feedback to the document/template owner so that the template in the gallery sets concurrency to 1 by default.
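For reference, in the pipeline's JSON definition this corresponds to the top-level concurrency property under properties. The fragment below is a sketch only, with the activities elided:

```json
{
  "name": "DataverseToSQL_Orchestrator",
  "properties": {
    "concurrency": 1,
    "activities": []
  }
}
```

With concurrency set to 1, a trigger that fires while a run is still in progress queues the new run instead of starting it in parallel.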
I hope this explains the scenarios in which folders are skipped, how that impacts subsequent folder runs, and how to reprocess those skipped runs.
If this is still not clear and you are blocked, I recommend logging a support ticket so that a support engineer can schedule a call, go through your pipeline run history, and suggest next steps to fix the problem.
Please don't forget to Accept Answer and click Yes for "Was this answer helpful?" wherever the information provided helps you; this can benefit other community members.