Azure Synapse - Publishing Error because of special characters

Sandro Falter 31 Reputation points
2021-11-05T04:01:57.303+00:00

I am currently receiving a publishing error when I try to publish my changes in Azure Synapse:

Error code: OK
Inner error code: BadRequest
Message: The document create or update failed because of invalid json, please ensure that special characters (like $) are not present in payload JSON properties.

It seems to be an error related to special characters. However, I do not know where to start searching, as the problem could be anywhere in my workspace.

Do you have any suggestions on how to locate the problem?

Where are special characters allowed, and where are they forbidden?
For example, the SQL scripts in the workspace contain many '$' characters for accessing data in JSON documents, and similarly the Spark notebooks occasionally need special characters for regex functions (the sketch below illustrates the kind of usage I mean).
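
Concretely, here is a minimal PySpark sketch (made-up data and column names, not code from my actual workspace) where '$' appears both as a JSON-path selector and as a regex anchor:

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()

# Hypothetical example: a single column holding a JSON document as a string.
df = spark.createDataFrame(
    [('{"order": {"id": 42, "total": "19.99"}}',)], ["payload"]
)

df.select(
    # '$' is the JSON-path root selector here; it lives in the script text,
    # not in a property name of the Synapse artifact itself.
    F.get_json_object("payload", "$.order.id").alias("order_id"),
    # '$' used as a regex end-of-string anchor in a Spark SQL function.
    F.regexp_extract("payload", r'([0-9.]+)"}}$', 1).alias("total"),
).show()
```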

Since Synapse stores the configuration of these resources in Git and later creates the ARM template from those configuration files, I am confused about where special characters like '$' are allowed and where they are not. What exactly are the JSON payload properties of the Synapse workspace? Only the content of the ARM template, or all of the Synapse configuration files that are synced with Git?

Thank you very much for your help.

Azure Synapse Analytics
An Azure analytics service that brings together data integration, enterprise data warehousing, and big data analytics. Previously known as Azure SQL Data Warehouse.

2 answers

  1. ShaikMaheer-MSFT 38,551 Reputation points Microsoft Employee Moderator
    2021-11-05T06:47:16.86+00:00

    Hi @Sandro Falter ,

    Thank you for posting query in Microsoft Q&A Platform.

    I tried to reproduce your scenario by including the special character ($) in a SQL script, a notebook, a pipeline activity, and a dataset, but I was not able to reproduce your error. For me, everything publishes fine even with the special character ($) present.

    Could you please check whether you have used special characters anywhere apart from the components mentioned above?
    You can also search the repo to find all the files that contain the special character ($).

    Kindly try this and share your findings so we can better understand and reproduce the issue.

    1 person found this answer helpful.

  2. Sandro Falter 31 Reputation points
    2021-11-10T01:26:50.273+00:00

    Hello, thank you for your answer.

    I have solved the issue by deleting one pipeline after another until publishing was possible again. I assume the problem was rooted in one of the pipelines having a '$' somewhere (so no code in a Spark notebook was responsible for the error).

    Do you have any advice on how to approach this problem in general? In a production environment I might not be able to delete my pipelines one by one until the error is gone.

    So is it generally good advice to just search the repo for special characters (e.g. along the lines of the sketch below)?
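
    In case it helps others hitting the same error: since the message complains about special characters in the payload JSON properties (the property names, not the values that hold script code), a lower-risk alternative to deleting pipelines would be to scan the Git-synced artifact JSON files and flag any keys containing unexpected characters such as '$'. Here is a rough Python sketch of that idea; the path argument and the "safe characters" whitelist are my own guesses, not documented Synapse validation rules:

    ```python
    import json
    import re
    import sys
    from pathlib import Path

    # Assumption: keys made of letters, digits, '_', '.', '-' and spaces are
    # harmless; anything else (e.g. '$') is worth a closer look. This is a
    # heuristic, not the service's documented validation rule.
    SAFE_KEY = re.compile(r"^[A-Za-z0-9_.\- ]+$")

    def suspicious_keys(node, path=""):
        """Recursively yield (json_path, key) for keys with unexpected characters."""
        if isinstance(node, dict):
            for key, value in node.items():
                here = f"{path}.{key}" if path else key
                if not SAFE_KEY.match(key):
                    yield here, key
                yield from suspicious_keys(value, here)
        elif isinstance(node, list):
            for i, item in enumerate(node):
                yield from suspicious_keys(item, f"{path}[{i}]")

    def scan_repo(repo_root):
        for json_file in Path(repo_root).rglob("*.json"):
            try:
                doc = json.loads(json_file.read_text(encoding="utf-8"))
            except (json.JSONDecodeError, UnicodeDecodeError):
                print(f"SKIPPED (not valid JSON): {json_file}")
                continue
            for json_path, key in suspicious_keys(doc):
                print(f"{json_file}: property '{key}' at {json_path}")

    if __name__ == "__main__":
        # Usage: python scan_keys.py /path/to/synapse-workspace-repo
        scan_repo(sys.argv[1])
    ```

    A plain text search across the repo would also surface every '$', but filtering down to property names cuts out the noise from legitimate '$' occurrences inside query and notebook text stored as values.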

