@M R, Saravanan SOMWIPRO-SOMWIPRO , Thank you for using the Microsoft Q&A platform and for posting your question here. In addition to the suggestion from the community above, you can try the approach described in the post below and use the Python SDK to delete the files in ADLS: https://stackoverflow.com/questions/63475269/how-do-you-delete-a-file-from-an-azure-data-lake-using-the-python-sdk Kindly check and, if it helps, revert back by accepting the answer.
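For reference, a minimal sketch of that approach against an ADLS Gen1 store using the azure-datalake-store package; the tenant, application, store and folder names below are placeholders you would replace with your own values:

```python
# pip install azure-datalake-store
from azure.datalake.store import core, lib

# Authenticate with a service principal (placeholder values shown).
token = lib.auth(
    tenant_id="<tenant-id>",
    client_id="<app-client-id>",
    client_secret="<app-client-secret>",
)

# Connect to the ADLS Gen1 account (store_name is the account name only, without the domain).
adl = core.AzureDLFileSystem(token, store_name="<adls-gen1-account>")

# Delete a single file, or an entire folder with recursive=True.
adl.rm("/path/to/folder", recursive=True)
```

Note that deleting a very large folder this way can still take some time, since the SDK has to walk the directory tree.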
How to delete huge files in an ADLS Gen1 folder using Data Factory
I'm trying to delete files in an ADLS folder using the Data Factory Delete activity. The pipeline fails with the error below.
Failed to execute delete activity with data source 'AzureDataLakeStore' and error 'The request to 'Azure Data Lake Store' failed and the status code is 'BadRequest', request id is '858ea702-8d53-4e65-83ac-4cda95bca10b'. {"exception":"IllegalArgumentException","message":"Directory too large, Exceeded max enumerate entries 1000000. Please use pagination to list contents of this directory. [858ea702-8d53-4e65-83ac-4cda95bca10b][2023-03-31T00:14:28.5128250-07:00]","javaClassName":"java.lang.IllegalArgumentException"}} The remote server returned an error: (400) Bad Request.'. For details, please reference log file here:
Please recommend a solution for the issue.
2 answers
Subashri Vasudevan 11,221 Reputation points
Apr 1, 2023, 11:14 AM
Hi @M R, Saravanan SOMWIPRO-SOMWIPRO ,
You could try the AzCopy remove command, for example from the Cloud Shell in the Azure portal.
Please check the syntax and usage here:
https://learn.microsoft.com/en-us/azure/storage/common/storage-ref-azcopy-remove
If that works, you can invoke the same command from ADF, for example through a Custom activity (which runs on Azure Batch); a sketch of the command follows below.
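For reference, the general shape of the command from the linked doc; the account, container, path and SAS token below are placeholders, and the example uses a Gen2/Blob-style URL as in that reference, so please check the doc for the endpoint that applies to your store:

```
# Remove an entire directory recursively (placeholder URL and SAS token).
azcopy remove "https://<account>.blob.core.windows.net/<container>/<folder>?<SAS>" --recursive=true
```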
Kindly write back if that helped or if you need further clarification on this.
BR,
Suba