Hello Michael Liver,
Greetings! Welcome to the Microsoft Q&A Platform.
Azure Data Lake Storage supports the following authorization mechanisms:
- Shared Key authorization
- Shared access signature (SAS) authorization
- Role-based access control (Azure RBAC)
- Attribute-based access control (Azure ABAC)
- Access control lists (ACL)
Azure ABAC builds on Azure RBAC by adding role assignment conditions based on attributes in the context of specific actions. A role assignment condition is an additional check that you can optionally add to your role assignment to provide more refined access control. You cannot explicitly deny access to specific resources using conditions.
For more information on using Azure ABAC to control access to Azure Storage, see Authorize access to Azure Blob Storage using Azure role assignment conditions.
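To make the condition idea concrete, here is a minimal Python sketch (using the azure-identity and azure-mgmt-authorization packages) that assigns the built-in Storage Blob Data Reader role with a condition limiting blob reads to a single container. The subscription, resource group, storage account, container name ("allowed-container"), and principal object ID are placeholders, and the exact parameter shape can vary between SDK/API versions:

```python
# Hedged sketch: assign "Storage Blob Data Reader" at account scope with an
# ABAC condition so the principal can only read blobs in one container.
# All angle-bracketed values are placeholders.
import uuid

from azure.identity import DefaultAzureCredential
from azure.mgmt.authorization import AuthorizationManagementClient

subscription_id = "<subscription-id>"
account_scope = (
    f"/subscriptions/{subscription_id}/resourceGroups/<resource-group>"
    "/providers/Microsoft.Storage/storageAccounts/<storage-account>"
)

# Built-in "Storage Blob Data Reader" role definition ID.
role_definition_id = (
    f"/subscriptions/{subscription_id}/providers/Microsoft.Authorization/"
    "roleDefinitions/2a2b9908-6ea1-4ae2-8e65-a410df84e7d1"
)

# ABAC condition: only allow blob read actions when the container name matches.
condition = (
    "((!(ActionMatches{'Microsoft.Storage/storageAccounts/blobServices/containers/blobs/read'}))"
    " OR "
    "(@Resource[Microsoft.Storage/storageAccounts/blobServices/containers:name]"
    " StringEquals 'allowed-container'))"
)

client = AuthorizationManagementClient(DefaultAzureCredential(), subscription_id)
client.role_assignments.create(
    scope=account_scope,
    role_assignment_name=str(uuid.uuid4()),
    parameters={
        "role_definition_id": role_definition_id,
        "principal_id": "<object-id-of-user-group-or-managed-identity>",
        "principal_type": "User",
        "condition": condition,
        "condition_version": "2.0",
    },
)
```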
You can manage access to containers, directories, and blobs by using the access control list (ACL) feature in Azure Data Lake Storage Gen2.
You can associate a security principal with an access level for files and directories. Each association is captured as an entry in an access control list (ACL). Every file and directory in your storage account has an ACL. When a security principal (user, group, service principal, or managed identity) attempts an operation on a file or directory, an ACL check determines whether it has the required permission level to perform that operation.
Refer to the following docs for more detailed steps: https://learn.microsoft.com/en-us/azure/storage/blobs/data-lake-storage-access-control and https://learn.microsoft.com/en-us/azure/storage/blobs/storage-auth-abac-portal.
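As an illustration, here is a minimal Python sketch (using azure-identity and azure-storage-file-datalake) that reads a directory's ACL and adds an entry for a specific user. The account name, file system, directory name, and object ID are placeholders:

```python
# Hedged sketch: read and extend the ACL of a directory in an ADLS Gen2
# (hierarchical-namespace) account. Placeholder names throughout.
from azure.identity import DefaultAzureCredential
from azure.storage.filedatalake import DataLakeServiceClient

service = DataLakeServiceClient(
    account_url="https://<storage-account>.dfs.core.windows.net",
    credential=DefaultAzureCredential(),
)
directory = service.get_file_system_client("<file-system>").get_directory_client("reports")

# Current owner, owning group, permissions, and ACL entries for the directory.
current = directory.get_access_control()
print(current["acl"])  # e.g. "user::rwx,group::r-x,other::---"

# Grant a specific user (by object ID) read and execute on this directory,
# keeping all existing entries.
new_acl = current["acl"] + ",user:<object-id>:r-x"
directory.set_access_control(acl=new_acl)
```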
I would like to highlight that users must have the Reader role on the storage account in order to see storage account resources/containers. That role does not grant the ability to read or modify data in Azure Storage.
I would suggest granting the Reader role at the storage account level and the Storage Blob Data Contributor role at the container level. This will enable users to access the specific container while maintaining restrictions on access to other containers within the storage account.
Assign an Azure role for access to blob data - Azure Storage | Microsoft Learn
Similarly, you can assign roles at the container level. When you open a container in the Azure portal, you will see the Access control (IAM) blade where you can grant Storage Blob Data Reader / Storage Blob Data Contributor to a user or group.
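For example, a quick way to verify this setup from code is a small Python sketch with azure-identity and azure-storage-blob; the account and container names below are placeholders, and the exact error you see for the restricted container depends on the roles in place (typically a 403):

```python
# Hedged sketch: with Reader on the account and Storage Blob Data Contributor on
# one container, the same Microsoft Entra identity can list and upload blobs in
# that container, while data-plane calls against other containers are refused.
from azure.core.exceptions import HttpResponseError
from azure.identity import DefaultAzureCredential
from azure.storage.blob import BlobServiceClient

service = BlobServiceClient(
    account_url="https://<storage-account>.blob.core.windows.net",
    credential=DefaultAzureCredential(),
)

allowed = service.get_container_client("team-container")
print([b.name for b in allowed.list_blobs()])            # works: data role assigned here
allowed.upload_blob("hello.txt", b"hello", overwrite=True)

other = service.get_container_client("restricted-container")
try:
    list(other.list_blobs())                              # no data role at this scope
except HttpResponseError as err:
    print("Access denied as expected:", err.status_code)
```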
You can also manage access at the container level by setting up stored access policies (these are used together with SAS tokens). Here's how (a code sketch follows these steps):
- Go to your storage account in the Azure portal.
- Find the container you want to restrict access to.
- Under the Settings blade, select "Access policy."
- Click "Add policy" and choose the permissions you want to grant for that specific container.
Keep in mind that public access levels (such as “Blob” or “Container”) are set at the container level, so you can control access there.
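If you prefer to script this instead of using the portal, the following Python sketch (azure-storage-blob) creates a stored access policy on a container and issues a SAS token tied to it. The account name, account key, container name, policy ID, permissions, and expiry are placeholders/assumptions you should adjust:

```python
# Hedged sketch: create a stored access policy on one container and generate a
# SAS token that references it. Placeholder account name/key and container name.
from datetime import datetime, timedelta, timezone

from azure.storage.blob import (
    AccessPolicy,
    BlobServiceClient,
    ContainerSasPermissions,
    generate_container_sas,
)

account_name = "<storage-account>"
account_key = "<account-key>"
container_name = "team-container"

service = BlobServiceClient(
    account_url=f"https://{account_name}.blob.core.windows.net",
    credential=account_key,
)
container = service.get_container_client(container_name)

# Add a read/list policy that expires in 7 days.
policy = AccessPolicy(
    permission=ContainerSasPermissions(read=True, list=True),
    start=datetime.now(timezone.utc),
    expiry=datetime.now(timezone.utc) + timedelta(days=7),
)
container.set_container_access_policy(signed_identifiers={"read-only-policy": policy})

# Issue a SAS that references the stored policy instead of inline permissions.
sas_token = generate_container_sas(
    account_name=account_name,
    container_name=container_name,
    account_key=account_key,
    policy_id="read-only-policy",
)
print(f"https://{account_name}.blob.core.windows.net/{container_name}?{sas_token}")
```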
Reference thread: https://learn.microsoft.com/en-us/answers/questions/606190/adls-container-level-access
To grant access to a container, you can assign an RBAC role at the container scope or above to a user, group, service principal, or managed identity. You may also choose to add one or more conditions to the role assignment. You can read about the assignment of roles at Assign Azure roles using the Azure portal.
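For completeness, this is what a container-scoped role assignment could look like in Python with azure-identity and azure-mgmt-authorization; all IDs and names are placeholders, and the GUID shown is the built-in Storage Blob Data Contributor role:

```python
# Hedged sketch: assign a built-in data role directly at container scope.
# Subscription, resource group, account, container, and principal are placeholders.
import uuid

from azure.identity import DefaultAzureCredential
from azure.mgmt.authorization import AuthorizationManagementClient

subscription_id = "<subscription-id>"
container_scope = (
    f"/subscriptions/{subscription_id}/resourceGroups/<resource-group>"
    "/providers/Microsoft.Storage/storageAccounts/<storage-account>"
    "/blobServices/default/containers/<container-name>"
)

client = AuthorizationManagementClient(DefaultAzureCredential(), subscription_id)
client.role_assignments.create(
    scope=container_scope,
    role_assignment_name=str(uuid.uuid4()),
    parameters={
        # Storage Blob Data Contributor built-in role.
        "role_definition_id": (
            f"/subscriptions/{subscription_id}/providers/Microsoft.Authorization/"
            "roleDefinitions/ba92f5b4-2d11-453d-a403-e96b0029c9fe"
        ),
        "principal_id": "<object-id-of-user-group-or-managed-identity>",
        "principal_type": "Group",
    },
)
```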
Also, folders in Azure Blob Storage are virtual. They look like folders, but they are not real folders like the ones on your local computer.
If you need to grant access at the folder level, you need to use Azure Data Lake Storage Gen2, i.e., an Azure Storage account with the hierarchical namespace setting enabled. For an existing storage account's blob container/folder, see Access control lists (ACLs) in Azure Data Lake Storage Gen2.
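As a sketch of folder-level access in a hierarchical-namespace account, the snippet below (azure-identity and azure-storage-file-datalake) adds an access ACL entry plus a default ACL entry for a group on an existing directory and applies the change recursively; the account, container, directory path, and group object ID are placeholders:

```python
# Hedged sketch: grant a group read/execute on an existing "folder" in an ADLS
# Gen2 account. The "default:" entry is inherited by items created inside the
# folder later; update_access_control_recursive also updates existing children.
from azure.identity import DefaultAzureCredential
from azure.storage.filedatalake import DataLakeServiceClient

service = DataLakeServiceClient(
    account_url="https://<storage-account>.dfs.core.windows.net",
    credential=DefaultAzureCredential(),
)
folder = service.get_file_system_client("<container>").get_directory_client("projects/finance")

acl_entries = "group:<group-object-id>:r-x,default:group:<group-object-id>:r-x"
folder.update_access_control_recursive(acl=acl_entries)
```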
Hope this answer helps! Please let us know if you have any further queries. I’m happy to assist you further.
Please "Accept the answer" and "up-vote" wherever the information provided helps you; this can be beneficial to other community members.