How to read data from blob storage using SAS authentication in Synapse Notebook

Jerry Pan 76 Reputation points
2022-12-27T17:47:51.183+00:00

My team needs to read data from blob storage using a SAS token in a Synapse notebook. In this case we cannot create a linked service using the storage account key directly, so we tried to create a linked service using SAS authentication as follows. The SAS token has the "Read" and "List" permissions set and has no "Allowed IP addresses" specified.
(Screenshot: linked service configured with SAS authentication.)
In the notebook we have our code like the following:

storage_container_name = 'test'

# storage_linked_service_name, storage_account_name and staging_schema are defined elsewhere in our notebook
blob_sas_token = mssparkutils.credentials.getConnectionStringOrCreds(storage_linked_service_name)

storage_url = 'wasbs://%s@%s.blob.core.windows.net/source/sample.csv' % (storage_container_name, storage_account_name)

# Configure Spark to use the SAS token for this container
spark.conf.set('fs.azure.sas.%s.%s.blob.core.windows.net' % (storage_container_name, storage_account_name), blob_sas_token)

df_test = (spark.read.schema(staging_schema).csv(path=storage_url, escape='"', quote='"', multiLine=True, header=True))
df_test.show()

However, we get the following error:

An error occurred while calling z:mssparkutils.credentials.getConnectionStringOrCreds.  
: java.lang.Exception: POST failed with 'Bad Request' (400) and message: {"result":"DependencyError","errorId":"BadRequest","errorMessage":"[Code=, Target=, Message=]. TraceId : 06af1132-63a2-4c41-80bd-71e9627976f7. Error Component : LSR"}  

We cannot find sample code for using SAS authentication in the official documentation, and we don't know how to fix this error. Could you please advise how we can read data from blob storage via SAS authentication?

Tags: Azure Blob Storage, Azure Synapse Analytics

1 answer

  1. ShaikMaheer-MSFT 38,546 Reputation points Microsoft Employee Moderator
    2023-01-12T08:19:55.01+00:00

    Hi @Jerry Pan ,

    Thank you for posting your query on the Microsoft Q&A platform.

    If you would like to use a SAS token to access storage, then store the SAS token in Azure Key Vault and read it from there.

    Store your SAS token in a Key Vault secret and retrieve that value in the notebook.

    Below is the code to get the secret value:

    mssparkutils.credentials.getSecret('<keyVaultName>', '<secretName>')
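
    Putting this together with the code from your question, a minimal end-to-end sketch could look like the following. The Key Vault name, secret name, storage account, container and file path are placeholders, so replace them with your own values:

    # Placeholders - replace with your own Key Vault, secret, storage account, container and path
    key_vault_name = '<keyVaultName>'
    secret_name = '<sasTokenSecretName>'
    storage_account_name = '<storageAccountName>'
    storage_container_name = 'test'

    # Read the SAS token from the Key Vault secret
    blob_sas_token = mssparkutils.credentials.getSecret(key_vault_name, secret_name)

    # Point Spark at the SAS token for this container
    spark.conf.set('fs.azure.sas.%s.%s.blob.core.windows.net' % (storage_container_name, storage_account_name), blob_sas_token)

    # Read the CSV over wasbs using the SAS token
    storage_url = 'wasbs://%s@%s.blob.core.windows.net/source/sample.csv' % (storage_container_name, storage_account_name)
    df_test = spark.read.csv(storage_url, header=True, multiLine=True, escape='"')
    df_test.show()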
    

    Make sure you configure access to Azure Key Vault for the Synapse notebook.

    Please note:

    • Synapse notebooks use Azure Active Directory (Azure AD) pass-through to access Azure Key Vault.
    • Synapse pipelines use the workspace managed identity (MSI) to access Azure Key Vault.
    • To make sure your code works both in a notebook and in a Synapse pipeline, grant secret access permissions to both your Azure AD account and the workspace identity (see the sketch after this list).
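
    As a quick sketch of that distinction in code (the vault, secret and linked service names are placeholders; the three-argument form assumes you have also created an Azure Key Vault linked service in the workspace):

    # Uses your own Azure AD identity (pass-through) - suitable for interactive notebook runs
    sas_token = mssparkutils.credentials.getSecret('<keyVaultName>', '<secretName>')

    # Uses the workspace managed identity via a Key Vault linked service - suitable when the
    # notebook is run from a Synapse pipeline
    sas_token = mssparkutils.credentials.getSecret('<keyVaultName>', '<secretName>', '<keyVaultLinkedServiceName>')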

    Please check the video below, which has a complete demo of the above:
    Configure access to Azure Key Vault for Synapse Notebook in Azure Synapse Analytics

    Hope this helps. Please let me know how it goes.


    Please consider hitting the Accept Answer button. Accepted answers help the community as well.

