The NativeAzureFileSystem.delete function with recursive = true fails on a non-empty folder.
My configuration works for many other operations, including deleting an empty folder, so this is not a configuration issue.
I'm running the latest version, hadoop-azure-3.3.6.jar.
Any suggestions on how to delete a non-empty folder?
Implementing the recursive delete myself is not an option, as this delete method is also called internally by many Spark functions.
Here is a minimal example of the problem. The code successfully deletes an empty folder, but fails if the folder contains any sub-folder, with the error: This operation is not permitted on a non-empty directory
JavaSparkContext jsc = getSparkContext();
// <container-name>, <storage-account>, <folder-path> replaced with my values
Path src = new Path("wasbs://<container-name>@<storage-account>.blob.core.windows.net/<folder-path>");
FileSystem fs = src.getFileSystem(jsc.hadoopConfiguration()); // fs is an instance of NativeAzureFileSystem
fs.delete(src, true); // recursive = true
The call fails with:
org.apache.hadoop.fs.azure.AzureException: com.microsoft.azure.storage.StorageException: This operation is not permitted on a non-empty directory.
Many people have asked about the same error in different scenarios, with no answer.
I believe it may be a bug in the library: internally it calls NativeAzureFileSystem.deleteFile, which does not take the recursive parameter, so recursive deletion may not be implemented correctly in this library.
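For reference, this is the behavior I would expect recursive = true to provide. The sketch below is not hadoop-azure code; it is just the standard depth-first delete pattern demonstrated on the local filesystem with java.nio.file (class and method names are my own), to show that children must be removed before the directory itself, which is exactly what fails here:

```java
import java.io.IOException;
import java.nio.file.DirectoryStream;
import java.nio.file.Files;
import java.nio.file.Path;

public class RecursiveDeleteSketch {

    // Depth-first delete: remove all children first, then the directory itself.
    // A plain single-entry delete (like Files.delete here, or apparently
    // NativeAzureFileSystem.deleteFile) fails on a non-empty directory.
    static void deleteRecursively(Path path) throws IOException {
        if (Files.isDirectory(path)) {
            try (DirectoryStream<Path> children = Files.newDirectoryStream(path)) {
                for (Path child : children) {
                    deleteRecursively(child);
                }
            }
        }
        Files.delete(path);
    }

    public static void main(String[] args) throws IOException {
        // Build a small non-empty tree: root/sub/inner/file.txt
        Path root = Files.createTempDirectory("demo");
        Files.createDirectories(root.resolve("sub/inner"));
        Files.writeString(root.resolve("sub/inner/file.txt"), "data");

        deleteRecursively(root);
        System.out.println(Files.exists(root)); // prints false
    }
}
```

A correct FileSystem.delete(path, true) should do the equivalent of this traversal server-side or client-side; if the implementation skips the traversal and issues a single delete, the non-empty-directory error above is the expected outcome.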