
Azure data storage not ingesting files

Jeremy Hobbs-Awoyemi 0 Reputation points
2026-03-03T22:44:20.06+00:00

So guys, I'm a novice trying to learn Azure Data Factory and Azure Storage, and I'm trying to load some data (in CSV) into Azure Storage from Data Studio. I've created a data flow, done some transformations, and hooked it into some data flow sinks which point to a dataset sink. Whenever I run it, the pipeline completes successfully and seems to export the data, but when I go into Azure Storage I cannot find the files. I can't even admit how many hours I've spent going back and forth with Copilot, ChatGPT and Claude over this, so this post is my last hope before I admit defeat. I'll attach some screenshots below; any help is appreciated.

Screenshot 2026-03-03 at 22.40.00

Screenshot 2026-03-03 at 22.39.28

Screenshot 2026-03-03 at 22.39.26

Screenshot 2026-03-03 at 22.39.22

Screenshot 2026-03-03 at 22.39.16

Screenshot 2026-03-03 at 22.39.07

Azure Storage
Globally unique resources that provide access to data management services and serve as the parent namespace for the services.

1 answer

  1. Q&A Assist
    2026-03-03T22:44:38.9433333+00:00
    AI generated content. This question contains an answer created with AI from Q&A Assist.

    The pipeline and data flow are succeeding, so the issue is almost certainly a mismatch between where the sink datasets write in Azure Storage and where the files are being looked for.

    From the screenshots:

    • The sink datasets DS_processed_hospital_admissions_daily and DS_processed_hospital_admissions_weekly are DelimitedText datasets.
    • Each dataset has:
      • Linked service: an ADLS Gen2 account
      • File path split into folder and file parts, e.g.
        • Folder: hospitaladmissions
        • File: daily or weekly

    In mapping data flows, when a DelimitedText dataset is used as a sink, the file name in the dataset is not honored unless the sink's file name option is set to output to a single file. With the default settings, the data flow writes one or more CSV files into the dataset's folder with generated Spark-style names (for example part files containing GUIDs), not necessarily a single file named exactly daily.csv or weekly.csv.
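    As a rough illustration (all file names below are made up, not taken from the screenshots), matching the folder's CSV output with a glob pattern is more reliable than looking for one exact file name:

    ```python
    from fnmatch import fnmatch

    # Hypothetical listing of the hospitaladmissions folder after a data flow run.
    # Mapping data flows typically emit Spark-style part files with GUIDs in the name.
    blob_names = [
        "hospitaladmissions/part-00000-1a2b3c4d-5e6f-4a7b-8c9d-0e1f2a3b4c5d-c000.csv",
        "hospitaladmissions/part-00001-1a2b3c4d-5e6f-4a7b-8c9d-0e1f2a3b4c5d-c000.csv",
        "hospitaladmissions/_SUCCESS",  # marker file some runs leave behind
    ]

    # Match any CSV in the folder instead of searching for daily.csv / weekly.csv.
    csv_files = [n for n in blob_names if fnmatch(n, "hospitaladmissions/*.csv")]
    print(csv_files)  # the two part files; the _SUCCESS marker is excluded
    ```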

    To locate the output:

    1. Open the same storage account that the linked service uses.
    2. Browse to the container configured in the linked service.
    3. Inside that container, open the hospitaladmissions folder.
    4. Look for CSV files with generated names (for example names containing GUIDs, timestamps, or partition suffixes) rather than a single file named exactly daily.csv or weekly.csv.
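    To rule out browsing the wrong account or container entirely, it can help to assemble the expected ADLS Gen2 folder URL from the linked service and dataset values and compare it with what is open in the portal. A minimal sketch, with placeholder account and container names (not taken from the screenshots):

    ```python
    # Build the ADLS Gen2 folder URL from the pieces configured in ADF.
    account = "mystorageaccount"    # storage account from the linked service (placeholder)
    container = "mycontainer"      # file system / container from the linked service (placeholder)
    folder = "hospitaladmissions"  # folder part of the dataset file path

    url = f"https://{account}.dfs.core.windows.net/{container}/{folder}"
    print(url)
    ```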

    If nothing appears there, verify these points in the data flow sink settings:

    1. In the sink Settings tab, confirm the Output to path is not overridden by a dynamic expression; it should still reference the dataset path (hospitaladmissions/daily and hospitaladmissions/weekly).
    2. Ensure nothing removes the files after they are written, for example a sink option that clears the folder, or a subsequent Delete activity in the pipeline.
    3. Confirm the linked service points to the same storage account and container that is being checked in the portal.

    Once the correct container and folder are opened, the CSV files written by the sinks should be visible.


