Avoid backslashes around the quotes while reading a parameter in PySpark

Kumar, Arun 236 Reputation points

I am reading a parameter value from a SQL Server table into a PySpark notebook. How can I prevent the backslashes from showing up in the parameter value? Because of them the API response is failing.

The original value from SQL Server is

dict(deleteddate="null", firstproddate = f"ge({InitialWatermark})", pagesize=100000)

What is displayed for the parameter is below:

"dict(deleteddate=\"null\", firstproddate = f\"ge({InitialWatermark})\", pagesize=10000)"

Tags: Azure Synapse Analytics, Azure Data Factory

Accepted answer
  1. ShaikMaheer-MSFT 35,796 Reputation points Microsoft Employee

    Hi Kumar, Arun,

    Thank you for posting query in Microsoft Q&A Platform.

    Are you using a Synapse notebook here? If yes, could you please share the code and steps you are running so I can understand the issue better. Also, kindly share where exactly you are seeing the backslashes.

    To me it looks like the backslashes you are seeing are only there for UI purposes, to escape the double quotes. They are not part of the value. If needed, you can store the data in a variable and replace the backslashes with an empty string.

    Hope this helps. Please let me know how it goes.
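
    A minimal sketch of the point above (using a hard-coded sample string rather than an actual SQL Server read): the escaped form with `\"` is what a JSON-style display produces, while the underlying Python string contains no backslashes at all. The final line shows the suggested cleanup in case a value genuinely does contain literal backslashes.

    ```python
    import json

    # Sample value as it would be stored in SQL Server
    # (hard-coded here; in the notebook it would come from a table read).
    param = 'dict(deleteddate="null", firstproddate = f"ge({InitialWatermark})", pagesize=100000)'

    # A JSON-escaped rendering, like the UI shows, adds backslashes
    # before the inner double quotes:
    print(json.dumps(param))

    # The actual value has no backslashes in it:
    print(param)
    assert "\\" not in param

    # If a value really does contain literal backslashes, strip them
    # before building the API request:
    cleaned = param.replace("\\", "")
    ```

    If `assert "\\" not in param` passes, the backslashes were never part of the value, and no cleanup is needed before calling the API.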

0 additional answers
