Issue accessing parameterized linked service within synapse notebook

DivyaK-4075 11 Reputation points
2023-05-05T16:05:12.9666667+00:00

Hello,

I have a parameterized linked service for storage, and I am trying to access it within a Synapse notebook as suggested in the following link: https://learn.microsoft.com/en-us/azure/synapse-analytics/spark/apache-spark-secure-credentials-with-tokenlibrary?pivots=programming-language-python#adls-gen2-storage-with-linked-services

%%pyspark
source_full_storage_account_name = "teststorage.dfs.core.windows.net"
spark.conf.set(f"spark.storage.synapse.{source_full_storage_account_name}.linkedServiceName", "<LINKED SERVICE NAME>")
spark.conf.set(f"fs.azure.account.oauth.provider.type.{source_full_storage_account_name}", "com.microsoft.azure.synapse.tokenlibrary.LinkedServiceBasedTokenProvider")
df = spark.read.csv('abfss://<CONTAINER>@<ACCOUNT>.dfs.core.windows.net/<DIRECTORY PATH>')
df.show()

I get the error below:

Py4JJavaError: An error occurred while calling o3897.csv. :
HTTP Error -1 CustomTokenProvider getAccessToken threw java.io.IOException : POST failed with 'Bad Request' (400) and message:
{"result":"DependencyError","errorId":"BadRequest","errorMessage":"[Code=, Target=, Message=]. TraceId : 9c989aa9-6bb3-4aea-9dbc-42814da06672 | client-request-id : 1990387d-78d0-4615-8c8a-7af2179ef5fe. Error Component : LSR"}
org.apache.hadoop.fs.azurebfs.oauth2.AzureADAuthenticator$HttpException: HTTP Error -1 CustomTokenProvider getAccessToken threw java.io.IOException : POST failed with 'Bad Request' (400) and message:
{"result":"DependencyError","errorId":"BadRequest","errorMessage":"[Code=, Target=, Message=]. TraceId : 9c989aa9-6bb3-4aea-9dbc-42814da06672 | client-request-id : 1990387d-78d0-4615-8c8a-7af2179ef5fe. Error Component : LSR"}

However, it works fine when I hardcode the storage URL within the linked service (i.e. without parameterization).

Could you please help with this?

Thanks,

Divya


3 answers

  1. BhargavaGunnam-MSFT 28,271 Reputation points Microsoft Employee
    2023-05-08T20:52:10.17+00:00

    Hello DivyaK-4075,

    Welcome to the MS Q&A platform.

    The Synapse product team has confirmed that the Token Library does not support parameterized linked services.

    So the error message you are seeing is by design.

    I have submitted the feedback item below, which is open for the user community to upvote and comment on. This allows our product teams to prioritize your request against our existing feature backlog and gives insight into the potential impact of implementing the suggested feature.

    https://feedback.azure.com/d365community/idea/53eef0e8-e1ed-ed11-a81c-000d3a7a9cdb

    I hope this helps. Please let me know if you have any further questions.


  2. DivyaK-4075 11 Reputation points
    2023-05-13T03:31:30.1166667+00:00

    Thanks @BhargavaGunnam-MSFT

    Is there any alternative for accessing a parameterized linked service from a Synapse notebook? Our solution involves multiple storage accounts, and we want to use a parameterized linked service; otherwise we would end up creating one linked service per storage account.

    Thanks,

    Divya
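
    Since the Token Library only works with non-parameterized linked services, one possible workaround (a sketch, not an answer from this thread; the account and linked-service names below are hypothetical placeholders) is to create one dedicated linked service per storage account and set the two Spark properties for each account in a loop, so the per-account boilerplate at least stays in one place:

    ```python
    # Sketch: map each storage account to its own (non-parameterized) linked service.
    # Account and linked-service names are hypothetical placeholders.
    PROVIDER = "com.microsoft.azure.synapse.tokenlibrary.LinkedServiceBasedTokenProvider"

    def linked_service_confs(account_to_ls):
        """Build the (key, value) Spark conf pairs for each storage account."""
        confs = []
        for account, ls_name in account_to_ls.items():
            host = f"{account}.dfs.core.windows.net"
            confs.append((f"spark.storage.synapse.{host}.linkedServiceName", ls_name))
            confs.append((f"fs.azure.account.oauth.provider.type.{host}", PROVIDER))
        return confs

    accounts = {
        "teststorage": "LS_TestStorage",
        "otherstorage": "LS_OtherStorage",
    }

    for key, value in linked_service_confs(accounts):
        # In a Synapse notebook you would apply each pair with:
        #     spark.conf.set(key, value)
        print(key, "=", value)
    ```

    After the confs are applied, `abfss://<container>@<account>.dfs.core.windows.net/...` reads against any of the configured accounts should authenticate through the matching linked service.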


  3. Zhenya Gornov 0 Reputation points Microsoft Employee
    2023-05-15T23:35:19.4166667+00:00

    Hey folks,

    Can you help me debug a very similar issue?

    I'm getting this exception even for a non-parameterized linked service.

    Here is my code:

    source_full_storage_account_name = "evgornovtest3.dfs.core.windows.net"
    spark.conf.set(f"spark.storage.synapse.{source_full_storage_account_name}.linkedServiceName", "AzureDataLakeStorage1")
    spark.conf.set(f"fs.azure.account.oauth.provider.type.{source_full_storage_account_name}", "com.microsoft.azure.synapse.tokenlibrary.LinkedServiceBasedTokenProvider")
    
    df = spark.read.format("csv").load("abfss://testtest@evgornovtest3.dfs.core.windows.net/")