Kashif Ahmed: Can you check whether the user has the proper permissions on the storage account they are trying to access the files from?
Storage Blob Data Contributor
= Use to grant read/write/delete permissions to Blob storage resources.
Could you please assist in resolving the issue below?
We are trying to read a Delta table and export it as CSV using a Data Flow: the Delta table is read via an inline dataset from ADLS Gen2 and exported as CSV (CSV dataset) into Azure Blob Storage.
We get the following error at the sink while exporting the CSV.
ERROR :
Error code: DFExecutorUserError
Failure type: User configuration issue
Details : Job failed due to reason: at Sink 'SinkSource': Spark job failed in one of the cluster nodes while writing data in one of the partitions to sink, with following error message: This request is not authorized to perform this operation using this permission.
Note: The Data Flow fails, but CSV files are exported into a temporary folder - https://<storage-account>.blob.core.windows.net/containername/foldername/_temporary/0/_temporary/ - with names like part0000werwe.csv inside the _temporary location.
However, we expect the CSV to be exported into the foldername location itself, since the sink settings are: File name option - output to single file, file name - xyz.csv, and Optimize partition option - single.
The linked service for the sink (created using SAS) has container-level read/write/list (RWL) privileges on Azure Blob Storage.
Could you please help us resolve this issue?
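For context on why part files appear under _temporary: Spark's file output committer has tasks write part files into a `_temporary` directory first, and only at job commit does it rename them into the final folder and delete the temporary tree. A minimal local sketch of that flow (file and folder names are hypothetical, mirroring the paths in the error above):

```python
import os
import shutil
import tempfile

# Simulate Spark's FileOutputCommitter: tasks write into _temporary,
# then job commit moves the output into place and removes _temporary.
root = tempfile.mkdtemp()
final_dir = os.path.join(root, "foldername")
tmp_dir = os.path.join(final_dir, "_temporary", "0", "_temporary")
os.makedirs(tmp_dir)

# A task attempt writes a part file (name hypothetical).
part = os.path.join(tmp_dir, "part-0000.csv")
with open(part, "w") as f:
    f.write("col1,col2\n1,2\n")

# Job commit: rename the part file to the final name, then clean up.
# The commit step needs write AND delete rights on the sink; if it
# fails, the part files are left stranded under _temporary, which is
# the symptom described above.
shutil.move(part, os.path.join(final_dir, "xyz.csv"))
shutil.rmtree(os.path.join(final_dir, "_temporary"))

print(sorted(os.listdir(final_dir)))  # ['xyz.csv']
```

This is why output can appear in `_temporary` even though the job as a whole reports a failure: the task-level writes succeeded, but the commit (rename + cleanup) did not.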
@Vinodh247: Yes, we have access to read the data on the source side (we can preview the data); the issue occurs while writing to the sink Azure Blob Storage. The linked service for the sink is based on a SAS token with container-level read/write/list privileges.
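One thing worth checking, given the commit behavior: since the commit phase renames part files and removes the `_temporary` folder, a sink SAS with only read/write/list may be insufficient, and delete (`d`) is commonly needed as well. A small sketch that inspects the `sp=` (signed permissions) field of a SAS token; the token value here is a hypothetical example, not a real credential:

```python
from urllib.parse import parse_qs


def sas_permissions(sas_token: str) -> set:
    """Return the set of permission letters in a SAS token's sp= field."""
    return set(parse_qs(sas_token.lstrip("?"))["sp"][0])


# Hypothetical container-level SAS with read/write/list only.
sas = "sp=rwl&sv=2022-11-02&sr=c&sig=REDACTED"
perms = sas_permissions(sas)

# Spark's job commit renames part files and deletes _temporary, so
# checking for read/write/list/delete is a reasonable baseline here.
missing = set("rwld") - perms
print(missing)  # {'d'}
```

If `d` turns out to be missing, regenerating the SAS with delete included (and updating the linked service) would be the next thing to try.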
The error indicates a permission issue when the job tries to write the Spark partitions. Can you make sure the container path and folder name mentioned in the dataset exist?