Hello Sourav,
Thanks for reaching out on Microsoft Q&A!
To transfer files from Azure Data Lake Storage (ADLS) to an AWS S3 bucket in batches for your SAS application, you have two main options:

- **Azure Data Factory (ADF) with a Copy Activity** – build a pipeline that orchestrates the transfer, point the Copy Activity at ADLS as the source and S3 as the sink, and schedule the pipeline to run per batch. Connection secrets (storage keys, AWS credentials) can be kept in Azure Key Vault and referenced from the linked services rather than stored in the pipeline itself.
- **Rclone with the S3 backend** – a command-line approach in which you configure a remote for ADLS and a remote for S3, then run batch copy commands, with encryption in transit (HTTPS) and at rest (S3 server-side encryption); see the sketch at the end of this answer.

Choose ADF if you want a managed, low-maintenance option, or Rclone if you need more granular control. In either case, add error handling (retries, logging, alerting on failed runs) and scope the AWS IAM policy so the credentials used for the transfer can only write to the target bucket.

For more details, refer to the ADF Copy Activity documentation, the Rclone installation guide and S3 backend configuration, the Azure Key Vault documentation, and the AWS IAM documentation.
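In case a concrete starting point helps, here is a minimal Rclone sketch. Everything in it is a placeholder or an assumption: the remote names (`adls`, `s3aws`), the ADLS file system and S3 bucket names, the credentials, and the choice of AES256 server-side encryption would all need to be adjusted to your environment. The first block is the relevant part of `rclone.conf`; the second is the copy command for one batch.

```ini
[adls]
type = azureblob
account = <your-storage-account-name>
key = <your-storage-account-key>

[s3aws]
type = s3
provider = AWS
access_key_id = <your-aws-access-key-id>
secret_access_key = <your-aws-secret-access-key>
region = <your-bucket-region>
```

```bash
# Copy one batch (here, a folder) from the ADLS file system to the S3 bucket.
# --transfers/--checkers control parallelism, --s3-server-side-encryption
# requests AES256 encryption at rest, and traffic goes over HTTPS by default.
rclone copy adls:my-filesystem/batch-01 s3aws:my-bucket/batch-01 \
  --transfers 16 --checkers 16 \
  --s3-server-side-encryption AES256 \
  --log-file transfer.log --log-level INFO
```

Each batch can then simply be a separate folder (or an `--include` filter), so the same command can be scheduled per batch; with ADF, the equivalent would be a schedule or tumbling-window trigger on the pipeline.

If you find this helpful, please accept this answer to close the thread. Thanks!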