Hi, we are running spark-sql queries on our own Hadoop/Spark cluster.
My question is: how can we set the Azure storage credential for an external table, instead of storing the credential in core-site.xml?
For example:
CREATE EXTERNAL TABLE test_my_table (t1 string, t2 string)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ' '
STORED AS TEXTFILE
LOCATION 'wasbs://test-blob@storageAcccount.blob.core.windows.net/example/data'
This fails with the following error:
Error in query: org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Got exception: org.apache.hadoop.fs.azure.AzureException org.apache.hadoop.fs.azure.AzureException: No credentials found for account
in the configuration, and its container test-blob is not accessible using anonymous credentials. Please check if the container exists first. If it is not publicly available, you have to provide account credentials.);
So how do I provide the Azure account key in the CREATE TABLE statement, or in spark-sql?
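For reference, the setting we currently have in core-site.xml and would like to avoid looks like this (this is the standard per-account key property used by the hadoop-azure WASB driver; YOUR_ACCOUNT_KEY is a placeholder):

<property>
  <name>fs.azure.account.key.storageAcccount.blob.core.windows.net</name>
  <value>YOUR_ACCOUNT_KEY</value>
</property>

Ideally we'd like to supply this per table or per session rather than cluster-wide.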