We are using Data Lake Storage Gen1 as the data store for an HDInsight cluster, version 3.6. I wrote a simple Spark job that writes a file using PySpark's save command. The file is created in the Data Lake with a restrictive ACL mask set on it, and as a result no other user is able to delete it. I don't have owner access on the Data Lake account, but I do have the necessary read/write/execute access through the default file permission settings. However, since the mask is set by the Service Principal used by the cluster, I am unable to delete the file.
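For context, my understanding of why my rwx entry doesn't help (a toy sketch of POSIX-style ACL mask semantics, which ADLS Gen1 follows; this is illustrative Python, not ADLS API code):

```python
# In POSIX-style ACLs, a named user's *effective* permissions are the
# bitwise AND of their ACL entry and the file's mask. Bits: r=4, w=2, x=1.
R, W, X = 4, 2, 1

def effective(entry: int, mask: int) -> int:
    """Effective permission = granted ACL entry limited by the mask."""
    return entry & mask

my_entry = R | W | X   # rwx granted to me via the default ACL
file_mask = R | X      # hypothetical restrictive mask set when the
                       # Service Principal wrote the file (write bit off)

perms = effective(my_entry, file_mask)
print(bool(perms & W))  # False -> no effective write, hence no delete
```

So even though the default ACL grants me rwx, the mask written with the file caps my effective permissions, which matches the behavior I'm seeing.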
Is this expected behavior? Is there a workaround?