I have noticed peculiar behavior with Azure Data Factory (ADF) data flows.
While previewing the data to be written (I have set the debug row limit accordingly), the preview shows that 892 rows should be written to my blob storage:
But when I actually trigger the data flow from a pipeline, it writes nothing: the created files are empty.
The consumption after running the data flow:
The actual blob storage:
Also, depending on the data coming from the source system, the data flow sometimes does write something, but there is always a row-count mismatch: the preview shows more rows than are actually written. (I know for a fact, after comparing the source system with the target system, that the missing rows still need to be written.)
I have tried sorting the data beforehand and writing the sink as a single partition. Nothing has helped so far.
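For reference, this is roughly how I pull the per-sink row counts out of the data flow activity run to compare them against the 892 rows in the preview. It is only a sketch: the subscription, resource group, factory name, and pipeline run ID are placeholders for my environment, and the `runStatus.metrics.<sink>.rowsWritten` structure is what I see in the Monitor output, so treat it as an assumption rather than a documented contract.

```python
# Sketch: query the data flow activity run output and print rowsWritten per sink.
# SUBSCRIPTION_ID / RESOURCE_GROUP / FACTORY_NAME / PIPELINE_RUN_ID are placeholders.
from datetime import datetime, timedelta

from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import RunFilterParameters

SUBSCRIPTION_ID = "<subscription-id>"
RESOURCE_GROUP = "<resource-group>"
FACTORY_NAME = "<data-factory-name>"
PIPELINE_RUN_ID = "<pipeline-run-id>"  # taken from the Monitor tab

client = DataFactoryManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

# Query all activity runs belonging to this pipeline run (last 24 hours here).
filter_params = RunFilterParameters(
    last_updated_after=datetime.utcnow() - timedelta(days=1),
    last_updated_before=datetime.utcnow(),
)
activity_runs = client.activity_runs.query_by_pipeline_run(
    RESOURCE_GROUP, FACTORY_NAME, PIPELINE_RUN_ID, filter_params
)

for run in activity_runs.value:
    if run.activity_type == "ExecuteDataFlow":
        # Assumption: the data flow activity output exposes per-sink metrics
        # under runStatus.metrics, with a rowsWritten counter per sink.
        metrics = (run.output or {}).get("runStatus", {}).get("metrics", {})
        for sink_name, sink_metrics in metrics.items():
            print(sink_name, sink_metrics.get("rowsWritten"))
```

In my runs this reports 0 rows written for the sink even though the debug preview showed 892, which is the mismatch I am trying to understand.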