How can I upload a csv from a local drive to Lakehouse programmatically, so I can schedule the pickup?

Conner, Kyle 20 Reputation points
2024-04-04T14:38:45.8433333+00:00

Hi,

We are storing CSVs on a local drive to support reporting and analysis. I would like to pull these CSVs into the Lakehouse on a scheduled basis, programmatically. All I can find so far is a manual upload. Can I code a Notebook, or leverage a Dataflow (Gen2), to pull the file into the Lakehouse, and have it scheduled, since these files are updated in the source every day?

The only solutions I have seen are a manual upload of CSVs and a Dataflow (Gen2) upload of Excel files; to my knowledge, neither will work for this scenario.

Thanks,


Accepted answer
  1. Vinodh247-1375 11,396 Reputation points
    2024-04-04T15:16:54.83+00:00

    Hi Conner, Kyle,

    Thanks for reaching out to Microsoft Q&A.

    Can I code a Notebook, or leverage a Dataflow(Gen2) to pull the file into the Lakehouse, and have it scheduled as these files are updated in the source every day?

    Yes, you can; both options will work. Build the ingestion in a Notebook or a Dataflow (Gen2), then schedule the pipeline to run daily at a specific time to keep your Lakehouse data up to date.
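
    Since Fabric notebooks run in the cloud and cannot see a local drive directly, one common pattern is to push the file from the local machine into the Lakehouse over OneLake's ADLS Gen2-compatible endpoint. Below is a minimal sketch of that approach; it assumes the `azure-storage-file-datalake` and `azure-identity` packages are installed, and the workspace/lakehouse names and file path are placeholders you would replace with your own.

    ```python
    # Sketch: push a local CSV into a Fabric Lakehouse via the OneLake
    # ADLS Gen2 endpoint. Run this on the machine where the drive is
    # mounted and schedule it (e.g., Windows Task Scheduler or cron).

    ONELAKE_URL = "https://onelake.dfs.fabric.microsoft.com"

    def onelake_file_path(lakehouse: str, filename: str) -> str:
        """Build the OneLake path to a file in the Lakehouse's Files area."""
        return f"{lakehouse}.Lakehouse/Files/{filename}"

    def upload_csv(workspace: str, lakehouse: str, local_path: str) -> None:
        """Upload one local CSV, overwriting any existing copy."""
        # Imports kept local so the path helper above has no dependencies.
        import os
        from azure.identity import DefaultAzureCredential
        from azure.storage.filedatalake import DataLakeServiceClient

        service = DataLakeServiceClient(
            ONELAKE_URL, credential=DefaultAzureCredential())
        # In OneLake, the workspace name plays the role of the container.
        fs = service.get_file_system_client(workspace)
        file_client = fs.get_file_client(
            onelake_file_path(lakehouse, os.path.basename(local_path)))
        with open(local_path, "rb") as f:
            file_client.upload_data(f, overwrite=True)
    ```

    A scheduled run of this script keeps the Lakehouse copy in sync; a Dataflow Gen2 with an on-premises data gateway is the no-code alternative if you prefer to keep the schedule inside Fabric.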

    Please 'Upvote' (Thumbs-up) and 'Accept' as an answer if the reply was helpful. This will benefit other community members who face the same issue.


0 additional answers
