How to set dynamic output file name in data flow

Kakehi Shunya (筧 隼弥) 201 Reputation points
2022-12-23T08:52:43.907+00:00

Hello.
I would like to dynamically set the name of the file to be output to the sink in the data flow, but I am getting an error.

{"message":"Job failed due to reason: at Sink 'sink1': File names cannot have empty value(s) while file name option is set as per partition. Details:","failureType":"UserError","target":"Data flow1","errorCode":"DF-Executor-InvalidPartitionFileNames"}  

Sink settings are described below.

・Output as a single file.
・The name of the output file should be "emp_list_yyyyMMdd.parquet".
(screenshot: 273628-image.png, sink settings)

concat('emp_list_',toString(toDate($as_of_date),'yyyy'),toString(toDate($as_of_date),'MM'),toString(toDate($as_of_date),'dd'),'.parquet')  
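(Side note: I believe the three toString calls can be collapsed into a single format pattern, assuming toString accepts a date format string as its second argument. This should produce the same name:

concat('emp_list_', toString(toDate($as_of_date), 'yyyyMMdd'), '.parquet')
)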

 *'$as_of_date' here refers to the data flow parameter of the same name.
(screenshot: 273627-image.png, file name expression)
For example, if the parameter passed is "2022-01-07T00:00:00", I want to output only one file, 'emp_list_20220107.parquet'.

・The parameter '$as_of_date' is a timestamp type, passed from the pipeline with the following settings:
(screenshots: 273676-image.png, 273647-image.png)

toTimestamp(left('@{item()}', 23), 'yyyy-MM-dd\'T\'HH:mm:ss.SSS')  
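For example, assuming item() resolves to a string like '2022-01-07T00:00:00.0000000' (an illustrative value), the expression behaves roughly as follows:

left('2022-01-07T00:00:00.0000000', 23)  returns  '2022-01-07T00:00:00.000'
toTimestamp('2022-01-07T00:00:00.000', 'yyyy-MM-dd\'T\'HH:mm:ss.SSS')  returns the timestamp 2022-01-07 00:00:00.000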

1 answer

  1. MarkKromer-MSFT 5,226 Reputation points Microsoft Employee Moderator
    2022-12-24T01:13:50.843+00:00

    It sounds like the parameter value is sometimes empty or cannot be converted to a valid timestamp. In your ADF monitoring view, hover over one of the failed data flow activities and click the input icon. Check the value of the input parameter being passed to the data flow activity and confirm that it is valid.
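
    If the value does turn out to be empty, one way to make that visible inside the data flow is to guard the sink file name expression. A rough sketch, assuming a fallback name such as 'emp_list_unknown.parquet' (a made-up placeholder) is acceptable:

    iif(isNull($as_of_date), 'emp_list_unknown.parquet', concat('emp_list_', toString(toDate($as_of_date), 'yyyyMMdd'), '.parquet'))

    That way an empty parameter produces a recognizable file name instead of failing the whole data flow, which makes the bad iteration easy to spot.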

