I have multiple CSV files in an ADLS path. How do I compress and decompress these files with dynamic file names and the current timestamp using Data Flows in ADF?

Sivaramakrishna Mulagalapati 20 Reputation points
2024-09-18T06:07:55.5433333+00:00
  1. I have multiple CSV files in an ADLS location.
  2. How do I compress those files with dynamic file names and the current timestamp?
  3. After the compression completes, how do I decompress those files, again with dynamic file names and the current timestamp?
Azure Data Factory
An Azure service for ingesting, preparing, and transforming data at scale.

Accepted answer
  1. Deepanshukatara-6769 9,355 Reputation points
    2024-09-18T06:33:35.8166667+00:00

    Hello, Welcome to MS Q&A

    To compress and decompress multiple CSV files in Azure Data Lake Storage (ADLS) with dynamic file names and current timestamps, you can use Azure Data Factory (ADF). Here’s a step-by-step guide:

    Compressing Files

    Create a pipeline in ADF:

    • Use the Copy Activity to copy the CSV files from the source to the destination.
    • In the Sink settings of the Copy Activity, specify the Compression Type (e.g., GZip or ZipDeflate).
    • Use dynamic content to generate the file names with the current timestamp, for example:

      @concat('compressed_', formatDateTime(utcNow(), 'yyyyMMddHHmmss'), '.zip')

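    The naming and compression behavior of the sink step can be illustrated locally with a short Python sketch. This is only an analogue of what the Copy Activity does, not ADF itself; the helper name `compress_with_timestamp` and the `.csv.gz` extension are illustrative choices, and the timestamp format mirrors the `yyyyMMddHHmmss` pattern in the dynamic-content expression above:

    ```python
    import gzip
    import shutil
    from datetime import datetime, timezone

    def compress_with_timestamp(src_path: str) -> str:
        """Gzip-compress src_path into a file whose name follows the same
        pattern as the ADF expression:
        @concat('compressed_', formatDateTime(utcNow(), 'yyyyMMddHHmmss'), ...)
        """
        # UTC timestamp formatted as yyyyMMddHHmmss, like formatDateTime(utcNow(), ...)
        stamp = datetime.now(timezone.utc).strftime("%Y%m%d%H%M%S")
        dst_path = f"compressed_{stamp}.csv.gz"
        # Stream the source file through gzip into the destination
        with open(src_path, "rb") as src, gzip.open(dst_path, "wb") as dst:
            shutil.copyfileobj(src, dst)
        return dst_path
    ```

    In the real pipeline this logic lives in the sink dataset settings (compression type) plus the dynamic-content expression for the file name; no custom code is required.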

    Decompressing Files

    Create another pipeline in ADF:

    • Use the Copy Activity again, this time setting the Compression Type (e.g., GZip or ZipDeflate) on the source dataset, so the files are decompressed as they are read.
    • Leave the compression type on the sink set to None.
    • Use dynamic content to generate the decompressed file names with the current timestamp, for example:

      @concat('decompressed_', formatDateTime(utcNow(), 'yyyyMMddHHmmss'), '.csv')
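    As with compression, the decompression step can be sketched locally in Python. Again this is only an analogue of the Copy Activity reading a compressed source; the helper name `decompress_with_timestamp` is illustrative, and the output name follows the same timestamped pattern:

    ```python
    import gzip
    import shutil
    from datetime import datetime, timezone

    def decompress_with_timestamp(src_gz_path: str) -> str:
        """Decompress a gzip file into a name following the pattern
        @concat('decompressed_', formatDateTime(utcNow(), 'yyyyMMddHHmmss'), '.csv')
        """
        stamp = datetime.now(timezone.utc).strftime("%Y%m%d%H%M%S")
        dst_path = f"decompressed_{stamp}.csv"
        # Stream the gzip source out to a plain CSV file
        with gzip.open(src_gz_path, "rb") as src, open(dst_path, "wb") as dst:
            shutil.copyfileobj(src, dst)
        return dst_path
    ```

    In ADF the decompression itself is handled declaratively by the compression setting on the source dataset; only the output file name needs a dynamic-content expression.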

    By following these steps, you can efficiently compress and decompress your CSV files in ADLS with dynamic file names and timestamps using Azure Data Factory.

    Please let me know if you have any further questions.

    Kindly accept the answer if it helps.

    Thanks

    Deepanshu


0 additional answers
