How do I partition a file path in synapse pipeline?

倉島 茂之 40 Reputation points

Question: How do I partition the year and month in a file path? I tried specifying the sink path as sales_data/parquet/year = "yyyy"/month = "MM"/test.parquet, but it does not work. My situation is as follows:

  • Student Subscription
  • I use Azure Data Lake Storage Gen2.
  • I try to create a pipeline to convert CSV files to Parquet files.
  • The CSV files are named sales_data/csv/"yyyy"/"MM"/test.csv, where yyyy and MM are the actual year and month.
    • Example: sales_data/csv/2023/04/test.csv
  • I want to save the converted Parquet file in a path such as sales_data/parquet/"yyyy"/"MM"/test.parquet.
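For reference, the intended source-to-sink path mapping can be sketched as a plain string transformation (a minimal illustration of the desired result only; the pipeline itself would do this with dataset parameters or copy behavior, not Python):

```python
def to_parquet_path(csv_path: str) -> str:
    """Map a source CSV path to the desired Parquet sink path.

    Illustrative sketch only: assumes the fixed layout
    sales_data/csv/<yyyy>/<MM>/<name>.csv described above.
    """
    parts = csv_path.split("/")            # ["sales_data", "csv", yyyy, MM, filename]
    year, month = parts[2], parts[3]
    filename = parts[4].rsplit(".", 1)[0] + ".parquet"
    return f"sales_data/parquet/{year}/{month}/{filename}"

print(to_parquet_path("sales_data/csv/2023/04/test.csv"))
# -> sales_data/parquet/2023/04/test.parquet
```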

Azure Synapse Analytics

Accepted answer
  AnnuKumari-MSFT 31,716 Reputation points Microsoft Employee

    Hi 倉島 茂之, thank you for using the Microsoft Q&A platform and for posting your question here.

    As I understand your issue, you are trying to copy multiple .csv files from one ADLS location to another in .parquet format. Please let me know if that is not the correct understanding.

    The pipeline looks good except that the filename in the source dataset is hardcoded as MOCK_Data.csv, despite the requirement to copy multiple files. You could either:

      • Use a wildcard file path in the source to point to your container, specify the correct wildcard pattern to fetch all the files needed, and use the preserve hierarchy option in the sink; or
      • Use a Get Metadata activity to fetch all the filenames present in all folders, iterate through them with a ForEach activity, and use a Copy activity inside the ForEach.

    For similar scenarios, please check the following videos: Wildcard filters in copy activity of ADF, and How to copy all the files from one ADLS folder to another in Azure Data Factory.

    Hope it helps. Kindly accept the answer if it's helpful. Thanks.
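To make the first (wildcard) approach concrete, the relevant copy activity settings could look roughly like the JSON fragment below. This is a sketch, not the thread's actual pipeline definition: the exact store-settings types assume ADLS Gen2 (AzureBlobFS) datasets, and the wildcard pattern assumes the folder layout described in the question. With the source dataset rooted at sales_data/csv and PreserveHierarchy in the sink, the yyyy/MM folder structure is carried over to the Parquet output automatically.

```json
{
  "source": {
    "type": "DelimitedTextSource",
    "storeSettings": {
      "type": "AzureBlobFSReadSettings",
      "recursive": true,
      "wildcardFolderPath": "sales_data/csv/*/*",
      "wildcardFileName": "*.csv"
    }
  },
  "sink": {
    "type": "ParquetSink",
    "storeSettings": {
      "type": "AzureBlobFSWriteSettings",
      "copyBehavior": "PreserveHierarchy"
    }
  }
}
```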

