HI @Sam-8373,
You can use the Union transformation in an Azure Data Factory Data Flow to combine data before copying it to your flat file. To learn more, please take a look at:
https://learn.microsoft.com/en-us/azure/data-factory/data-flow-union
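In data flow script, a union of two streams looks roughly like this (the stream names below are placeholders, not names from your pipeline):

```
// Combine rows from two source streams, matching columns by name
source1, source2 union(byName: true) ~> combinedStream
```

Set `byName: false` to union by column position instead of name.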
Azure Data Factory supports the following file formats. Refer to each article for format-based settings.
- Avro format
- Binary format
- Delimited text format
- JSON format
- ORC format
- Parquet format
Ref - https://learn.microsoft.com/en-us/azure/data-factory/connector-file-system
To output/copy to a flat file like the one in your example, you will have to create a sink Linked Service (equivalent to a connection) and a dataset on top of it (to keep track of the path, name, and other properties of the output CSV file). Depending on where you want to store your data, you would create the appropriate Linked Service. E.g., if you have an Azure Blob Storage account, you would create an Azure Blob Storage Linked Service; similarly, if your CSV output needs to go to a file system (a local computer or a remote shared path), you would create a File System Linked Service.
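As a rough sketch, the sink side could look like the JSON definitions below, assuming an Azure Blob Storage destination (the names, container, file name, and connection string are placeholders you would replace with your own):

```json
{
  "name": "SinkBlobLinkedService",
  "properties": {
    "type": "AzureBlobStorage",
    "typeProperties": {
      "connectionString": "DefaultEndpointsProtocol=https;AccountName=<yourAccount>;AccountKey=<yourKey>"
    }
  }
}
```

```json
{
  "name": "OutputCsvDataset",
  "properties": {
    "linkedServiceName": {
      "referenceName": "SinkBlobLinkedService",
      "type": "LinkedServiceReference"
    },
    "type": "DelimitedText",
    "typeProperties": {
      "location": {
        "type": "AzureBlobStorageLocation",
        "container": "output",
        "fileName": "result.csv"
      },
      "columnDelimiter": ",",
      "firstRowAsHeader": true
    }
  }
}
```

The `DelimitedText` dataset type is what ADF uses for CSV output; `firstRowAsHeader` controls whether column names are written as the first row.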
In a similar manner, for your source you would create a Dataset pointing at the JSON file. The dataset itself would be built on top of a Linked Service that connects to the JSON file's location. The JSON format is supported by the following connectors: Amazon S3, Azure Blob, Azure Data Lake Storage Gen1, Azure Data Lake Storage Gen2, Azure File Storage, File System, FTP, Google Cloud Storage, HDFS, HTTP, and SFTP.
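A source dataset for the JSON file could look like the sketch below, again assuming Azure Blob Storage and reusing a hypothetical source Linked Service (all names and paths are placeholders):

```json
{
  "name": "SourceJsonDataset",
  "properties": {
    "linkedServiceName": {
      "referenceName": "SourceBlobLinkedService",
      "type": "LinkedServiceReference"
    },
    "type": "Json",
    "typeProperties": {
      "location": {
        "type": "AzureBlobStorageLocation",
        "container": "input",
        "fileName": "data.json"
      }
    }
  }
}
```

You would then use this dataset as the source and the CSV dataset as the sink in your Data Flow or Copy activity.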
Hope this helps.