Azure Data Flow Sink Error

Kashif Ahmed 40 Reputation points
2023-10-13T06:34:21.6966667+00:00

Could you please assist in resolving the issue below?

We are trying to read a Delta table and export it as CSV using a Data Flow. We read the Delta table via an inline dataset from ADLS Gen2 and export it as CSV (CSV dataset) into Azure Blob Storage.

We get the error below while writing the CSV at the sink.

ERROR:

Error code: DFExecutorUserError

Failure type: User configuration issue

Details: Job failed due to reason: at Sink 'SinkSource': Spark job failed in one of the cluster nodes while writing data in one of the partitions to sink, with following error message: This request is not authorized to perform this operation using this permission.

Note: the Data Flow fails, but CSV output has been written into temporary folders - https://<storage-account>.blob.core.windows.net/containername/foldername/_temporary/0/_temporary/ - under a different name such as part0000werwe.csv inside the _temporary location.

However, we expect the CSV to be exported into the foldername location, even though the sink is configured with File name option = "Output to single file", file name = xyz.csv, and Optimize partition option = "Single partition".

The linked service for the sink (Azure Blob Storage) was created using a SAS token with container-level Read/Write/List (RWL) privileges.

Could you please help us resolve this issue?

Tags: Azure Data Lake Storage · Azure Blob Storage · Azure Data Factory

Accepted answer
  1. Vinodh247 34,666 Reputation points MVP Volunteer Moderator
    2023-10-13T06:54:31.78+00:00

    Kashif Ahmed: Can you check whether the identity that is accessing the files has the proper permissions on the storage account?

    Storage Blob Data Contributor: use this role to grant read/write/delete permissions on Blob storage resources.
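    If the sink linked service were switched from SAS to a service principal or managed identity, this role could be assigned with the Azure CLI. A minimal sketch; the principal ID, subscription, resource group, and storage account names are placeholders, not values from this thread.

    ```shell
    # Grant the identity that Data Factory uses for the sink
    # read/write/delete access to blob data in the storage account.
    # All <...> values are placeholders for your own IDs.
    az role assignment create \
      --assignee "<principal-id>" \
      --role "Storage Blob Data Contributor" \
      --scope "/subscriptions/<subscription-id>/resourceGroups/<resource-group>/providers/Microsoft.Storage/storageAccounts/<storage-account>"
    ```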


2 additional answers

  1. Kashif Ahmed 40 Reputation points
    2023-10-13T07:05:54.4066667+00:00

    @Vinodh247: Yes, we have access to read the data on the source side (we can preview it); the issue occurs while writing to the sink, Azure Blob Storage. The linked service for the sink is based on a SAS token with container-level Read/Write/List privileges.
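    One likely cause, consistent with the files stranded under _temporary (an inference, not confirmed in this thread): Spark's file output committer first writes part files into _temporary and then moves them into the target folder at job commit. On Blob Storage that move involves deleting the staged copy, so a SAS limited to Read/Write/List can succeed at the initial write but fail at commit with exactly this authorization error. A sketch of regenerating the container SAS with Add/Create/Delete included; the account name, key, and expiry are placeholders.

    ```shell
    # Regenerate the container-level SAS including (a)dd, (c)reate and
    # (d)elete alongside (r)ead, (w)rite and (l)ist, so the job commit
    # can move files out of _temporary. Placeholders in <...>.
    az storage container generate-sas \
      --account-name "<storage-account>" \
      --account-key "<account-key>" \
      --name "containername" \
      --permissions racwdl \
      --expiry "<yyyy-mm-ddThh:mm:ssZ>" \
      --output tsv
    ```

    The returned token would then replace the SAS in the sink linked service.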


  2. Vinodh247 34,666 Reputation points MVP Volunteer Moderator
    2023-10-13T08:20:31.3766667+00:00

    The error states that there is a permission issue when Spark tries to write the partitions to the sink. Can you make sure the container path and folder name mentioned in the dataset exist?

