Unable to list directory when creating a view over Delta
I have a single Spark dataframe written out as Delta using PySpark into two folders, folder A and folder B, in the same blob storage (ADLS Gen2) account.
Both folder A and folder B have an external data source created with the same credentials via a SQL script.
Creating a view over the Delta data in folder A succeeds. (Folder A is the old folder that has been in use.)
However, creating a view over the Delta data in folder B fails with a listing error:
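For reference, the failing step is of this general form (a sketch assuming Azure Synapse serverless SQL with OPENROWSET; `my_external_ds`, the view name, and the `folderB/` path are placeholders, not my real names):

```sql
-- Hypothetical names; the external data source already exists and uses
-- the same credential that works for folder A.
CREATE VIEW dbo.folder_b_view AS
SELECT *
FROM OPENROWSET(
    BULK 'folderB/',                -- relative path to the Delta folder in the container
    DATA_SOURCE = 'my_external_ds', -- external data source pointing at the container
    FORMAT = 'DELTA'                -- read the folder as a Delta table
) AS result;
```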
/_delta_log/.' cannot be listed.
Since the view can be created over the Delta data in folder A, the Delta files themselves should not be the issue. The view for folder B is created with a different name, so a conflicting view name should not be the problem either.
It looks like a permissions issue, but how could that happen when folders A and B are in the same blob storage account (same container), just under different paths?
Could it be the firewall? How should I proceed from here? Please advise.