Hello,
Thanks for the question and also for using this forum.
I tried to repro the issue on my side. Here is what I did: I created a source table and inserted the dummy data you suggested, then created a pipeline with a copy activity that writes to a CSV sink. I chained a second copy activity whose job was to copy the records from the sink file into a SQL destination table. I was expecting it to fail as you mentioned, but I was unable to repro the problem and the pipeline succeeded just fine.
-- Recreate the source table
DROP TABLE IF EXISTS TEST10052020;
CREATE TABLE TEST10052020
(
    COL VARCHAR(MAX), COL1 VARCHAR(MAX), COL2 VARCHAR(MAX)
);

-- Recreate the destination table
DROP TABLE IF EXISTS TESTDestination;
CREATE TABLE TESTDestination
(
    COL VARCHAR(MAX), COL1 VARCHAR(MAX), COL2 VARCHAR(MAX)
);

-- Dummy row containing quote and backslash escape characters
INSERT INTO TEST10052020 VALUES ('"this is my test\"', 'sometest', '"\"');

-- Verify the data on both sides after the pipeline run
SELECT COL, COL1, COL2 FROM TEST10052020;
SELECT COL, COL1, COL2 FROM TESTDestination;
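To double-check that the round trip preserved the data exactly, you can also compare the two tables directly after the pipeline run (just a quick sketch; EXCEPT returns rows present in one table but missing from the other):

-- Rows in the source that did not arrive intact in the destination
SELECT COL, COL1, COL2 FROM TEST10052020
EXCEPT
SELECT COL, COL1, COL2 FROM TESTDestination;

-- And the reverse direction; both queries should return zero rows
-- if the copy was lossless
SELECT COL, COL1, COL2 FROM TESTDestination
EXCEPT
SELECT COL, COL1, COL2 FROM TEST10052020;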
When you say the data "can't be interpreted by other systems to load", what kind of system are you referring to? At least with this test, I think Data Factory is handling this pretty well.
One more suggestion: you could explore the power of T-SQL (I am not aware of your data size and other parameters, so this may not work out for you). When you configure the source, you can opt for a T-SQL query and clean the data there, something like the query below, which replaces the backslash escape characters before the copy:
SELECT REPLACE(COL, '\', ' ') AS COL
     , COL1
     , REPLACE(COL2, '\', ' ') AS COL2
FROM TEST10052020;
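If you would rather keep the quote characters and remove only the escape sequences, a small variation could also work (just a sketch against the same test table; adjust the pattern to whatever escaping your real data uses):

-- Hypothetical variant: turn the escaped-quote sequence \" back into a plain quote
SELECT REPLACE(COL, '\"', '"') AS COL
     , COL1
     , REPLACE(COL2, '\"', '"') AS COL2
FROM TEST10052020;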
Thanks & stay safe
Himanshu
Please consider clicking "Accept Answer" and "Up-vote" on the post that helped you, as it can be beneficial to other community members.