How to mount ADLSgen2 to Synapse notebook via linked service?

Peter Michalik 20 Reputation points
2024-02-08T20:32:46.7433333+00:00

I have created a linked service for ADLS Gen2 and chose the authentication type SAS URI. When I click Test connection, everything is fine. However, when I use that linked service in a notebook, it fails with this error:

"An error occurred while calling z:mssparkutils.fs.mount. : com.microsoft.spark.notebook.msutils.InvalidCredentialsException: fetch Token from linkedService failed with POST failed with 'Bad Request' (400) and message: {"result":"DependencyError","errorId":"BadRequest","errorMessage":"[Code=400, Target=ADLS_personal, Message=Missing required property 'url' on ADLS_personal]. TraceId : ff2af9cc-4f01-45af-bd3b-78f312347a94 | client-request-id : 48ef6f9c-3309-4614-9680-cc1766c73cfe. Error Component : LSR"}, no any user credential info available for authorization"

This is my code which does not work (when I use the same code and paste the SAS token from the linked service directly, it works):

from notebookutils import mssparkutils

mssparkutils.fs.mount(
    "abfss://******@adlsasa.dfs.core.windows.net/",  # ADLS Gen2 path
    "/testisko",                                      # mount point name
    {"linkedService": "ADLS_personal"}
)
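
For reference, this is roughly the direct-SAS-token variant that does work for me (a minimal sketch; the container name and token are placeholders):

from notebookutils import mssparkutils

# Same mount, but passing the SAS token directly instead of a linked service.
# "<container>" and "<sas-token>" are placeholders, not real values.
mssparkutils.fs.mount(
    "abfss://<container>@adlsasa.dfs.core.windows.net/",  # ADLS Gen2 path
    "/testisko",                                           # mount point name
    {"sasToken": "<sas-token>"}
)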

Accepted answer
    Smaran Thoomu 22,505 Reputation points Microsoft External Staff
    2024-02-12T09:01:43.4+00:00

    Hi @Peter Michalik

    Yes, even when using a linked service, you need to provide the SAS URI in the mount command. The linked service only provides the authentication details, but the mount command still requires the complete URI to access the storage account.

    The difference between using a linked service and pasting the SAS token directly is that a linked service lets you manage the authentication details in one central place: update them in the linked service, and every pipeline and notebook that uses it automatically picks up the change.
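
    For example, once the mount succeeds, files under the mount point are read through the synfs scheme. A minimal sketch, assuming the mount point /testisko from your snippet:

    from notebookutils import mssparkutils

    # Mounted paths are addressed as synfs:/{jobId}/{mountPoint}/{path}.
    job_id = mssparkutils.env.getJobId()
    files = mssparkutils.fs.ls(f"synfs:/{job_id}/testisko/")
    print(files)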

    If you want to keep your SAS tokens in Key Vault, you can use Azure Key Vault-backed secrets to store the SAS token, and then reference the secret in the linked service. This way, you can keep your SAS tokens secure and still use them in your pipelines and notebooks.

    Here's an example of how to use a Key Vault-backed secret in a linked service:

    1. Create a secret in Key Vault with the SAS token value.
    2. Create an Azure Key Vault linked service in your Synapse workspace (or Azure Data Factory).
    3. In the ADLS Gen2 linked service, choose Azure Key Vault as the source of the SAS URI/token, then select the Key Vault linked service and the secret name (and optionally a secret version) instead of pasting the token.
    4. Use the linked service in your pipelines and notebooks, as sketched in the example after this list.
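
    If you want to resolve the secret in the notebook itself instead, here is a minimal sketch using mssparkutils.credentials.getSecret (the Key Vault name, secret name, and Key Vault linked service name below are placeholders, not your actual values):

    from notebookutils import mssparkutils

    # Fetch the SAS token from Key Vault via the Key Vault linked service.
    # "myKeyVault", "adls-sas-token" and "AKV_linked_service" are placeholder names.
    sas_token = mssparkutils.credentials.getSecret(
        "myKeyVault",           # Key Vault name
        "adls-sas-token",       # secret name
        "AKV_linked_service"    # Key Vault linked service name
    )

    # Mount with the retrieved token instead of embedding it in the notebook.
    mssparkutils.fs.mount(
        "abfss://<container>@adlsasa.dfs.core.windows.net/",  # ADLS Gen2 path
        "/testisko",
        {"sasToken": sas_token}
    )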

    I hope this helps! Let me know if you have any further questions.
