Error Setting Access Rights on ADL (Gen1) User Folders During HDInsight Cluster Creation

Christoph Kiefer 141 Reputation points
2020-09-22T07:21:55.527+00:00

Hello All

The issue occurs during cluster creation. One of the last operations Ambari performs is 'Post user creation hook for 1 users'. This fails with the attached errors.

Here is some more information:

  • It's an ESP cluster -- we have a group in AD called "UPC_CH_Data_Engineering" with users who are synced to the cluster.
  • The script fails while it tries to create user folders on ADL for these users.
  • We have an application that has access to the ADL service.
  • We have set the rights on the ADL root folder as shown in the attached picture.
  • I have attached stderr and stdout.

We don't know which changes we have to implement (either through the Portal or a script) to make this work. The goal is to have folders with the correct rights for all users on the ADL.

Any help is highly appreciated.

BR, Christoph

26393-ambari-d01upcchbisp1-post-user-creation-hook.png

26354-adl-rights-root-folder.png

26248-updea0-h8-hdi-application.png

26355-stderr.txt

26238-stdout.txt

Azure Data Lake Storage
Azure HDInsight

1 answer

  1. PRADEEPCHEEKATLA 90,246 Reputation points
    2020-09-24T09:21:38.297+00:00

    Hello @Christoph Kiefer ,

    You will receive the error message Exception occurred, Reason: Set Owner failed with error 0x83090aa2 (Forbidden. ACL verification failed. Either the resource does not exist or the user is not authorized to perform the requested operation.) when permissions of the service principal (SP) have been revoked on the files/folders.

    Steps to resolve this issue:

    Step 1: Check that the SP has 'x' (execute) permission on every folder along the path, which is required to traverse it. For more information, see Permissions. Sample dfs command to check access to files/folders in the Data Lake storage account:

    hdfs dfs -ls /<path to check access>  
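    Because traversal requires 'x' on every ancestor directory, it helps to inspect each level of the path, not just the final folder. The sketch below (the `TARGET` path is hypothetical; the `hdfs dfs -getfacl` call is commented out because it needs the live cluster) walks the ancestors of a target path so each one's ACL can be checked in turn:

    ```shell
    # Walk every ancestor of the target path; the SP's ACL entry must
    # include 'x' on each directory for traversal to succeed.
    TARGET="/clusters/myhdi/user"   # hypothetical path -- substitute your own
    P=""
    for SEG in $(echo "$TARGET" | tr '/' ' '); do
      P="$P/$SEG"
      echo "== $P =="
      # hdfs dfs -getfacl "$P"      # run on the cluster head node
    done
    ```

    Running `hdfs dfs -getfacl` at each printed level shows whether the SP's entry (or a group it belongs to) carries the 'x' bit all the way down.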
    

    Step 2: Set up the permissions required for the read/write operation being performed on the path. See here for the permissions required by the various file system operations.
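    As one way to apply Step 2 from the cluster head node, the ACLs can be granted with `hdfs dfs -setfacl`. This is a minimal sketch, not a verified fix for this cluster: `SP_OID` is a placeholder for the object ID of the service principal the cluster uses, the `/clusters` path is assumed, and the `setfacl` calls are commented out because they must run against the live storage account (the same entries can also be set in the Portal, as in the attached picture):

    ```shell
    # Placeholder -- substitute the object ID of the cluster's service principal.
    SP_OID="00000000-0000-0000-0000-000000000000"

    # Access entry gives the SP rwx on existing items; the default: entry makes
    # newly created user folders inherit the SP's access automatically.
    ACL_SPEC="user:${SP_OID}:rwx,default:user:${SP_OID}:rwx"

    # On the cluster head node (requires the live cluster, so commented out here):
    # hdfs dfs -setfacl -m "user:${SP_OID}:--x" /        # 'x' on root for traversal
    # hdfs dfs -setfacl -R -m "$ACL_SPEC" /clusters      # assumed target path
    echo "$ACL_SPEC"
    ```

    The `default:` entry matters here: without it, the folders the post-user-creation hook creates for each synced user would not automatically carry the SP's permissions.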

    For more details, refer to "Azure HDInsight – Using Azure Data Lake Storage Gen1" and "Troubleshoot Data Lake Store files".

    Hope this helps. Do let us know if you have any further queries.

    ----------------------------------------------------------------------------------------

    Do click on "Accept Answer" and Upvote on the post that helps you, this can be beneficial to other community members.

    1 person found this answer helpful.