question

Sameer-3740 asked SandeepNidumukkala-8906 answered

Delimiter error in copy activity of ADF

I am getting the following error while moving data through ADF:

ERROR [22000] Found character '>' instead of field delimiter ',' File

  1. I tried to update the escape character in the connection settings, but there is no option to update the escape character.

  2. I don't want to use Replace by manually adding it to each string column.

Is there any option to tackle this by setting the escape character somehow, or setting it dynamically? Thanks for your help.



azure-data-factory

SathyamoorthyVijayakumar-MSFT ·

Hello @Sameer-3740,


Welcome to the Microsoft Q&A platform.

Is this error occurring when you are attempting to copy data from a source to Snowflake with staging enabled? If this is not your case, can you please add additional context about the source and sink of the copy activity?


Sameer-3740 · SathyamoorthyVijayakumar-MSFT

Yes, that's correct - the error is occurring when copying data from a source to Snowflake with staging enabled. The source for the data is a SQL Server.


Thanks Vijay.

It seems the data is moved from SQL Server to Azure Blob successfully, but I am facing an issue loading the data from Azure Blob into Snowflake.

What seems to be the issue is that ADF adds a backslash (\) as an escape character. In my string data there are two characters back to back that need to be escaped. For instance, my data is -> "xyz\". So Azure adds escape characters as --> \"xyz\\\". Due to this I am getting an error loading it into Snowflake. Any idea how I can address this?


@Sameer-3740 - Yes, that's right. So basically, what I meant in the answer below is that the data that has been successfully written to the Azure Blob has now become invalid for the Snowflake DB. Did you happen to try the workaround below, which should ideally handle the above-mentioned scenario?

Sameer-3740 · SathyamoorthyVijayakumar-MSFT

Yes, @SathyamoorthyVijayakumar-MSFT. That works. Thank you for the resolution.


SathyamoorthyVijayakumar-MSFT answered SathyamoorthyVijayakumar-MSFT edited

Hello @Sameer-3740,

The issue could be because the first stage (source -> staging blob) uses the CSV format, and the format serializer failed to escape the escape character in the data, eventually causing the data to be invalid.
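To see why that produces this particular error, here is a toy illustration (not ADF's or Snowflake's actual code, and the field values are made up) of reading one quoted CSV field where the quote character is a double quote and the escape character is a backslash:

  # Toy reader for ONE quoted CSV field with quote='"' and escape='\'.
  # Illustration only -- not Snowflake's or ADF's implementation.
  def read_quoted_field(line: str):
      assert line[0] == '"', "field must start with the quote character"
      value, i = [], 1
      while i < len(line):
          ch = line[i]
          if ch == "\\":            # escape character: take the next character literally
              value.append(line[i + 1])
              i += 2
          elif ch == '"':           # an unescaped quote closes the field
              return "".join(value), line[i + 1:]
          else:
              value.append(ch)
              i += 1
      raise ValueError("no closing quote found")

  # Two staged rows for a value ending in a backslash (xyz\), followed by a second column.
  good_row = r'"xyz\\","abc"'   # the backslash in the data is itself escaped
  bad_row = r'"xyz\","abc"'     # the backslash is NOT escaped, so \" looks like an escaped quote

  for row in (good_row, bad_row):
      value, rest = read_quoted_field(row)
      if rest[:1] == ",":
          print(f"OK: field={value!r}, field delimiter found as expected")
      else:
          # Mirrors the error in this thread:
          # ERROR [22000] Found character '<x>' instead of field delimiter ','
          print(f"ERROR: found character {rest[:1]!r} instead of field delimiter ','")

In the bad row the reader swallows the real closing quote and the comma as data, so the next character it checks is no longer the field delimiter - which is the shape of the error reported above.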

You could try the below workaround at your end.

Manually edit the JSON payload of your pipeline: go to properties -> activities -> {your copy activity} -> "typeProperties" and add a flag "escapeQuoteEscaping": true.
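For reference, the edited "typeProperties" block would end up looking roughly like the sketch below. Only "escapeQuoteEscaping" is the flag from this workaround; the source, sink, and staging names are placeholders, and the dict is written in Python purely so the JSON can be printed and sanity-checked:

  import json

  # Trimmed sketch of a staged copy activity's "typeProperties" (real pipelines will
  # have more settings here); only the last key is the flag added by this workaround.
  type_properties = {
      "source": {"type": "SqlServerSource"},
      "sink": {"type": "SnowflakeSink"},
      "enableStaging": True,
      "stagingSettings": {
          "linkedServiceName": {"referenceName": "AzureBlobStagingLS",
                                "type": "LinkedServiceReference"},
          "path": "staging-container/adf",
      },
      "escapeQuoteEscaping": True,
  }
  print(json.dumps(type_properties, indent=2))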

134155-image.png (screenshot of the edited copy activity JSON showing the "escapeQuoteEscaping" flag)

Please note: if using a SHIR, please make sure the SHIR version is >= 5.5.7762.1 to get this fix.

Hope this will help. Please let us know if you have any further queries.



  • Please don't forget to click on "Accept Answer" or upvote whenever the information provided helps you. Original posters help the community find answers faster by identifying the correct answer. Here is how

  • Want a reminder to come back and check responses? Here is how to subscribe to a notification

  • If you are interested in joining the VM program and helping shape the future of Q&A: Here is how you can be part of Q&A Volunteer Moderators




SinghAnurag-7616 answered

I tried adding "escapeQuoteEscaping": true in my pipeline for copying data from Blob to Snowflake, but it's not working for me. Even after defining some other escape character such as '^' or '$', it still uses '\' as the escape character.
Quite a weird issue from Blob to Snowflake.
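For context, "defining some other escape character" would normally mean setting it on the delimited-text source dataset, roughly as sketched below (the property names are the DelimitedText format settings; the dataset, linked service, container, and file names are placeholders, and the dict is Python only so the JSON can be printed):

  import json

  # Sketch of a DelimitedText (CSV-on-Blob) dataset with an explicit quote/escape character.
  dataset = {
      "name": "BlobCsvSource",
      "properties": {
          "type": "DelimitedText",
          "linkedServiceName": {"referenceName": "AzureBlobStorageLS",
                                "type": "LinkedServiceReference"},
          "typeProperties": {
              "location": {"type": "AzureBlobStorageLocation",
                           "container": "staging", "fileName": "data.csv"},
              "columnDelimiter": ",",
              "quoteChar": "\"",
              "escapeChar": "^",      # the alternative escape character being tried
              "firstRowAsHeader": True,
          },
      },
  }
  print(json.dumps(dataset, indent=2))

Per this comment, though, the copy still behaved as if '\' were the escape character.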


JyothsnaNagarajugari-4641 answered

Hi Team,

We are trying to move data from SFTP to Snowflake and are facing this error:

ErrorCode=UserErrorOdbcOperationFailed,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=ERROR [22000] Found character '1' instead of field delimiter ','

I have tried the workaround of adding "escapeQuoteEscaping": true, but the issue is still not resolved. Please suggest.

Note: we are using Auto IR here.


AnilDangol-4469 answered

I am having the same issue. Somehow ADF doesn't respect the ESCAPE property of the file format.
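For anyone reading along, the "ESCAPE property of the file format" presumably refers to a Snowflake CSV file format option. A minimal sketch of defining one with the snowflake-connector-python package (the account, credentials, and format name are placeholders, not values from this thread):

  import snowflake.connector

  # Define a CSV file format whose ESCAPE character is a backslash.
  # Whether ADF's generated COPY INTO honors such a format is exactly
  # what this comment is questioning.
  conn = snowflake.connector.connect(account="my_account", user="my_user", password="***")
  conn.cursor().execute(
      "CREATE OR REPLACE FILE FORMAT my_csv_format "
      "TYPE = CSV "
      "FIELD_DELIMITER = ',' "
      "FIELD_OPTIONALLY_ENCLOSED_BY = '\"' "
      "ESCAPE = '\\\\'"
  )
  conn.close()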


SandeepNidumukkala-8906 answered

I am having the same issue. I tried the answer, but it didn't solve the error. How do I clear the error? Please help.
