Avoid backslashes in quotes when reading a parameter in PySpark

Kumar, Arun 336 Reputation points
2023-11-04T19:51:11.9566667+00:00

I am reading a parameter value from a SQL Server table into a PySpark notebook. How can I prevent the backslashes from showing up in the parameter value? The API call is failing because of them.

The original value from SQL Server is

dict(deleteddate="null", firstproddate = f"ge({InitialWatermark})", pagesize=100000)

What is displayed for the parameter is below:

"dict(deleteddate=\"null\", firstproddate = f\"ge({InitialWatermark})\", pagesize=10000)"
				

Azure Synapse Analytics
Azure Data Factory

Accepted answer
  1. ShaikMaheer-MSFT 38,451 Reputation points Microsoft Employee
    2023-11-06T06:16:19.3366667+00:00

    Hi Kumar, Arun,

    Thank you for posting your query on the Microsoft Q&A platform.

    Are you using a Synapse notebook here? If yes, could you please share the code and the steps you are running so the issue can be understood better. Also, kindly share where exactly you are seeing the backslashes.

    To me it looks like the backslashes you are seeing are only added for UI purposes to escape the double quotes; they are not part of the value. You can store the data in a variable and, if the backslashes really are present, replace them with an empty string.
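
    A minimal sketch of that idea in Python, assuming the value arrives in a notebook variable (here called raw_param, a hypothetical name) and genuinely contains literal backslashes and outer quotes:

        import json

        # Hypothetical example: the parameter as it might arrive from SQL Server,
        # written so the string really does contain backslash-escaped quotes.
        raw_param = '"dict(deleteddate=\\"null\\", firstproddate = f\\"ge({InitialWatermark})\\", pagesize=100000)"'

        # Printing the value shows what it actually contains; the notebook UI and
        # repr() add escape backslashes for display only.
        print(raw_param)

        # Strip the outer quotes and the backslash escapes before using the value.
        clean_param = raw_param.strip('"').replace('\\"', '"')
        print(clean_param)
        # dict(deleteddate="null", firstproddate = f"ge({InitialWatermark})", pagesize=100000)

        # Equivalent, if the value was JSON-encoded upstream:
        clean_param_json = json.loads(raw_param)

    If print() already shows the value without backslashes, they are only display escaping and no replacement is needed.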

    Hope this helps. Please let me know how it goes.


0 additional answers
