This is a known behaviour in Azure Data Factory (ADF) mapping data flows: every branch executes whether or not any rows reach it, so the Sink transformation still runs. For storage sinks, this produces a 0-byte blob.
Here are a couple of suggestions that might help:
- Use a Cache Sink: If you don't need to write the output to an external store, use a cache sink, which writes data to the Spark cache instead of a data store. Within the same mapping data flow, you can then reference that data multiple times via a cache lookup.
- Delete Empty Files Programmatically: If the files do need to be written, you can remove the zero-byte files after the data flow completes, either with ADF itself (for example, a Delete activity at the end of the pipeline) or with a short script.
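For the programmatic route, the core of the cleanup is just filtering the blob listing down to 0-byte entries. Below is a minimal sketch of that logic in Python; `BlobInfo` is a stand-in for the blob properties your storage SDK returns (for instance, with the `azure-storage-blob` package, `ContainerClient.list_blobs()` yields objects that also expose `name` and `size`, and `delete_blob(name)` removes one). The names and wiring here are illustrative, not a drop-in implementation.

```python
from dataclasses import dataclass
from typing import Iterable, List


@dataclass
class BlobInfo:
    """Stand-in for a blob listing entry (name + size in bytes)."""
    name: str
    size: int


def zero_byte_blob_names(blobs: Iterable[BlobInfo]) -> List[str]:
    """Return the names of blobs whose size is exactly 0 bytes."""
    return [b.name for b in blobs if b.size == 0]


# With the real SDK, the cleanup loop would look roughly like
# (assumption -- adapt container/credential setup to your environment):
#
#   for name in zero_byte_blob_names(container_client.list_blobs()):
#       container_client.delete_blob(name)

listing = [
    BlobInfo("output/part-00000.csv", 0),
    BlobInfo("output/part-00001.csv", 2048),
]
print(zero_byte_blob_names(listing))  # only the empty file is selected
```

Running this cleanup as the pipeline's last step keeps the data flow itself unchanged while ensuring no empty artifacts are left in the container.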
References:
https://stackoverflow.com/questions/68404511/zero-bytes-files-are-getting-created-by-adf-data-flow
Also, if my response helped resolve your issue, could you please mark this answer as accepted? Thank you!