Data flow sink shows that 892 rows are to be written but writes nothing to blob storage when triggered

Anxhela Merko 5 Reputation points
2023-04-04T11:44:04.24+00:00

I have noticed some peculiar behavior with ADF data flows.

When I preview the data to be written (with the debug row limit set accordingly), the sink preview shows that 892 rows are to be written to my blob storage:

[screenshot: sink data preview showing 892 rows]

But when I actually trigger the data flow in a pipeline, it writes nothing: the created files are empty. This is the consumption report after running the data flow:

[screenshot: consumption report for the data flow run]

The actual blob storage: [screenshot of the blob container with the empty files]

Also, depending on the data coming from the source system, it sometimes does write something, but there is always a row-count mismatch: the preview shows more rows than are actually written. (After comparing the source system with the target system, I know for a fact that this data still needs to be written.) I have tried sorting the data beforehand and using a single partition; nothing has helped so far.
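
For reference, this is roughly how I verify what actually lands in the container after a triggered run. It is a minimal sketch using the azure-storage-blob SDK; the connection string, container name, and sink folder are placeholders for my real values, and the row count assumes CSV output with one header line per part file.

```python
# Diagnostic sketch: count the rows the data flow actually wrote.
# Connection string, container, and folder prefix are placeholders.
from azure.storage.blob import BlobServiceClient

CONN_STR = "<storage-account-connection-string>"  # placeholder
CONTAINER = "<container-name>"                    # placeholder
PREFIX = "<sink-folder>/"                         # placeholder

service = BlobServiceClient.from_connection_string(CONN_STR)
container = service.get_container_client(CONTAINER)

total_rows = 0
for blob in container.list_blobs(name_starts_with=PREFIX):
    print(f"{blob.name}: {blob.size} bytes")
    if blob.size == 0:
        continue  # empty part files, as seen after the triggered run
    data = container.download_blob(blob.name).readall()
    # Assumes CSV output with a header line per part file.
    lines = data.decode("utf-8").splitlines()
    total_rows += max(len(lines) - 1, 0)

print(f"Total data rows written: {total_rows}")  # compare with the 892 in the preview
```

Comparing the total this prints against the 892 rows shown in the preview is how I confirm the mismatch.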


1 answer

  1. HimanshuSinha-msft 19,476 Reputation points Microsoft Employee
    2023-04-05T19:11:13.5533333+00:00

    Hello @Anxhela Merko, thanks for the question and for using the MS Q&A platform. That's kind of strange; the only thing that stands out to me is the Errors configuration on the sink. Can you please check on your end and see whether that's set? [screenshot: sink Errors settings]

    Himanshu. Please accept the answer if it was useful, so that others in the community looking for a remediation for similar issues can find it.
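
    If the sink's error row handling is set to continue on error and to log rejected rows to storage (an assumption about this setup, not something confirmed in the question), the missing rows should show up in the configured log location. A minimal sketch to peek at those files, with placeholder names throughout:

    ```python
    # Sketch: inspect rows rejected by the sink's error row handling.
    # Assumes error logging points at <container>/<error-folder>/ (placeholders).
    from azure.storage.blob import BlobServiceClient

    CONN_STR = "<storage-account-connection-string>"  # placeholder
    CONTAINER = "<container-name>"                    # placeholder
    ERROR_PREFIX = "<error-folder>/"                  # placeholder

    service = BlobServiceClient.from_connection_string(CONN_STR)
    container = service.get_container_client(CONTAINER)

    for blob in container.list_blobs(name_starts_with=ERROR_PREFIX):
        text = container.download_blob(blob.name).readall().decode("utf-8")
        print(f"--- {blob.name} ---")
        for line in text.splitlines()[:10]:  # first few rejected rows per file
            print(line)
    ```

    If the log folder is empty while the sink files are also empty, error row handling is probably not the culprit.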

