@Martin
I'm glad that you were able to resolve your issue, and thank you for posting your solution so that others experiencing the same thing can easily reference it! Since the Microsoft Q&A community has a policy that "The question author cannot accept their own answer. They can only accept answers by others," I'll repost your solution in case you'd like to accept the answer.
Ask: You decided to use the hive_metastore to create an external table by directly referring to the data lake URL. After setting up the Data Lake Key, you encountered errors when trying to observe the table in the Data Catalog. You suspect this is due to Table Access Control (TAC) not being enabled at the cluster level, despite enabling it in the Workspace and using a Shared cluster, which should have TAC enabled by default. You are seeking guidance on how to resolve these issues, particularly with enabling TAC at the cluster level and addressing the mounting and permission challenges.
Solution: The TAC issue was resolved by creating the table with a Shared cluster instead of a single-node cluster. However, the main problem was related to the Data Lake access key. It was fixed by defining the key in the Spark configuration as follows:
fs.azure.account.key.cdctestpymongo.dfs.core.windows.net {{secrets/<scope_name>/<your_secret_key_name>}}
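For reference, here is a minimal sketch of achieving the same thing at notebook scope rather than in the cluster's Spark config. The storage account name cdctestpymongo comes from your post; the scope name "my-scope" and secret name "adls-account-key" below are placeholders, so substitute your own secret scope and key:

```python
# Placeholder scope/secret names - replace with your own secret scope and key name.
# Retrieve the storage account key from a Databricks secret scope
# and pass it to Spark so ABFS paths on the account can be accessed.
account_key = dbutils.secrets.get(scope="my-scope", key="adls-account-key")

spark.conf.set(
    "fs.azure.account.key.cdctestpymongo.dfs.core.windows.net",
    account_key,
)

# The external table location can then be referenced directly, e.g.:
# abfss://<container>@cdctestpymongo.dfs.core.windows.net/<path>
```

Setting the key in the cluster's Spark configuration (as in your solution) makes it available to every notebook on that cluster, while the notebook-scoped approach above only applies to the current Spark session.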
If I missed anything please let me know and I'd be happy to add it to my answer, or feel free to comment below with any additional information.
If you have any other questions, please let me know. Thank you again for your time and patience throughout this issue.
Please don't forget to Accept Answer and select Yes for "was this answer helpful" wherever the information provided helps you, as this can be beneficial to other community members.