Azure Data Factory - Copy Data Task - Upsert stopped working - was working for weeks

Alessandro 87 Reputation points
2022-06-13T15:18:50.857+00:00


ErrorCode=SqlOperationFailed,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=A database operation failed with the following error: 'Violation of PRIMARY KEY constraint 'PK_XXXX'. Cannot insert duplicate key in object 'dbo.XXXX'. The duplicate key value is (5574449003).
The statement has been terminated.',Source=,''Type=System.Data.SqlClient.SqlException,Message=Violation of PRIMARY KEY constraint 'PK_XXXX'. Cannot insert duplicate key in object 'dbo.XXXX'. The duplicate key value is (5574449003).
The statement has been terminated.,Source=.Net SqlClient Data Provider,SqlErrorNumber=2627,Class=14,ErrorCode=-2146232060,State=1,Errors=[{Class=14,Number=2627,State=1,Message=Violation of PRIMARY KEY constraint 'PK_SparePart'. Cannot insert duplicate key in object 'dbo.XXXX'. The duplicate key value is (5574449003).,},{Class=0,Number=3621,State=0,Message=The statement has been terminated.,},],'


Azure Data Factory
An Azure service for ingesting, preparing, and transforming data at scale.

Accepted answer
    KranthiPakala-MSFT 46,487 Reputation points · Microsoft Employee
    2022-06-14T23:40:52.84+00:00

    Hello @Anonymous ,

    Thanks for the question and using MS Q&A platform.

    From the error message, this looks like a data issue: the copy activity is attempting to insert a primary key value that already exists in the sink data store, which is why the primary key violation is thrown.

    You may want to validate your source and sink data for the key value 5574449003 and remove or correct the conflicting record in the sink; that should avoid the insertion failure. Since this is a data-related issue, another way to capture the bad records is to enable fault tolerance with logging on the copy activity so that incompatible rows are skipped and logged. That lets you validate the skipped rows against the error message and also keeps the pipeline from failing. A couple of quick checks you can run against the databases are sketched below.
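    A minimal sketch of those checks, assuming a SQL source and sink; dbo.SourceTable, dbo.SinkTable and KeyColumn are placeholder names standing in for the masked table and key column from your pipeline, so substitute the real ones:

    ```sql
    -- 1. Does the failing key already exist in the sink table?
    SELECT *
    FROM dbo.SinkTable
    WHERE KeyColumn = 5574449003;

    -- 2. Does the source return the same key more than once?
    --    Duplicate key values arriving from the source in the same run are a
    --    common cause of this kind of intermittent failure.
    SELECT KeyColumn, COUNT(*) AS occurrences
    FROM dbo.SourceTable
    GROUP BY KeyColumn
    HAVING COUNT(*) > 1;
    ```

    If the second query returns rows, de-duplicating the source data (or reviewing the key columns configured for the upsert) is usually where to start.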

    Hope this helps. Please let us know if you have any further queries, or if you think the issue is not data related and something unusual is going on; we will be happy to dig deeper.

