Azure Databricks S3-SQS - Unable to load AWS credentials from environment variables
Hi Experts,
I am getting an error when trying to read event messages from an AWS S3 SQS queue in Azure Databricks.
Here is my code:
%python
schema_message = "message string"
df = spark.readStream \
    .format("s3-sqs") \
    .option("queueUrl", SQSQueue) \
    .option("region", awsRegion) \
    .option("fileFormat", "json") \
    .option("awsaccessKeyId ", awsAccessKey) \
    .option("awssecretKey", awsSecretKey) \
    .schema(schema_message) \
    .load()
Error message:
com.amazonaws.SdkClientException: Unable to load AWS credentials from any provider in the chain: [EnvironmentVariableCredentialsProvider: Unable to load AWS credentials from environment variables (AWS_ACCESS_KEY_ID (or AWS_ACCESS_KEY) and AWS_SECRET_KEY (or AWS_SECRET_ACCESS_KEY)), SystemPropertiesCredentialsProvider: Unable to load AWS credentials from Java system properties (aws.accessKeyId and aws.secretKey), com.amazonaws.auth.profile.ProfileCredentialsProvider: profile file cannot be null, WebIdentityTokenCredentialsProvider:
Note: The code works fine if I set the AWS_ACCESS_KEY and AWS_SECRET_ACCESS_KEY values in the environment variables of the Databricks cluster.
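For reference, this is roughly what the working workaround looks like when set on the cluster (under the cluster's Advanced Options > Spark > Environment variables; actual values redacted):
AWS_ACCESS_KEY=<my access key>
AWS_SECRET_ACCESS_KEY=<my secret key>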
Any idea why the above code is not working when the credentials are passed as options instead?