Data Factory Copy - SQL Azure Timeout

Mike McGuinness 26 Reputation points
2022-02-01T18:09:56.543+00:00

I have a Copy Data task in a pipeline that keeps timing out. The task is supposed to copy 399,113 rows from a CSV file to an Azure SQL table. The same load works fine using SSIS.

The ADF sink fails every time at about 2 minutes 12 or 13 seconds into the "Writing to sink" step.

I increased the Azure SQL Database DTUs from 20 to 50, but it had zero effect.

[Screenshot: 170288-adf-copy-timeout.png]

Operation on target Import AVS SEC Pricing from CSV failed: ErrorCode=SqlOperationFailed,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=A database operation failed. Please search error to get more details.,Source=Microsoft.DataTransfer.ClientLibrary,''Type=System.Data.SqlClient.SqlException,Message=Execution Timeout Expired. The timeout period elapsed prior to completion of the operation or the server is not responding. The statement has been terminated.,Source=.Net SqlClient Data Provider,SqlErrorNumber=-2,Class=11,ErrorCode=-2146232060,State=0,Errors=[{Class=11,Number=-2,State=0,Message=Execution Timeout Expired. The timeout period elapsed prior to completion of the operation or the server is not responding.,},{Class=0,Number=3621,State=0,Message=The statement has been terminated.,},],''Type=System.ComponentModel.Win32Exception,Message=The wait operation timed out,Source=,'

Azure SQL Database
Azure Data Factory

12 answers

  1. Darling, Alex 11 Reputation points
    2022-02-10T10:00:25.167+00:00

    I am also getting the same issue: Insert works, Upsert does not. The only thing I've noticed is that the key column type looks wrong. My dataset source AND sink columns are both integer, yet the key column list shows them as abc (string). Dynamic content is also disabled, so I can't wrap the key with toInteger(ID).
    I've tried several different approaches, but no joy in getting this to work.

    My only solution is to use a Data Flow with source > Alter Row > sink. (This does work.)
    [Screenshot: 173085-image.png]

    Please can this be looked into?

    2 people found this answer helpful.

  2. Jason Bell 16 Reputation points
    2022-03-24T02:45:00.187+00:00

    I also had the same issue. I found that mine was related to my Azure SQL sizing. I was running a cheap 10 DTU Azure SQL database and it kept timing out. I bumped it to 100 DTU and the import worked just fine. I will be trying different DTU sizes to see if I can find the lowest that works for my use case.

    [Screenshot: 186298-image.png]
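    For reference, the Standard tier sizes map to S0 = 10 DTU, S1 = 20 DTU, S2 = 50 DTU and S3 = 100 DTU, so this is a jump from S0 to S3. If you prefer to script the scaling rather than use the portal, a minimal T-SQL sketch (the database name is a placeholder) looks like this:

        -- Scale an Azure SQL Database to Standard S3 (100 DTU).
        -- The change is asynchronous and the database stays online while it completes.
        ALTER DATABASE [MyDatabase] MODIFY (SERVICE_OBJECTIVE = 'S3');

        -- Confirm the current service objective afterwards.
        SELECT DATABASEPROPERTYEX('MyDatabase', 'ServiceObjective') AS CurrentServiceObjective;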

    2 people found this answer helpful.

  3. MartinJaffer-MSFT 26,236 Reputation points
    2022-02-02T19:39:07.71+00:00

    Hello again @Mike McGuinness. Thanks for making this a separate thread.

    Reiterating from my parting comments in the previous question thread: this error occurs when a write batch takes too long to write.
    However, looking at your copy details, I am more concerned that NO data was written. As a test to determine whether this is due to the write batch or something else, I'd like you to try setting the write batch size to 1. If it can't write a single row, then something else is deeply wrong.

    Also, just to rule out the unlikely, how big is a single row of data?

    I see there are "interim rows" in your copy activity details. This suggests PolyBase or a staged copy. Can you tell me what sort of settings you have on that?
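    If the small-batch test still times out, it can also help to watch the active requests on the Azure SQL side while the copy is in the "Writing to sink" step. This is a generic T-SQL diagnostic sketch, not something from the original reply:

        -- Run against the target database while the copy activity is writing,
        -- to see what the ADF session is executing and what it is waiting on.
        SELECT r.session_id,
               r.status,
               r.command,
               r.wait_type,
               r.wait_time,
               r.blocking_session_id,
               t.text AS running_sql
        FROM sys.dm_exec_requests AS r
        CROSS APPLY sys.dm_exec_sql_text(r.sql_handle) AS t
        WHERE r.session_id <> @@SPID;

    A request whose wait_time keeps growing (or that shows a non-zero blocking_session_id) points at the database doing the work slowly or being blocked, rather than at the Data Factory side.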

    1 person found this answer helpful.

  4. Oliver Grant Adams 6 Reputation points
    2022-04-07T17:41:05.403+00:00

    This is probably not the answer you are hoping for, but I now set these pipelines up as Oracle to Parquet on blob storage, then truncate the temp table in Azure SQL myself with a pre-copy script, followed by Parquet to Azure SQL. When the copy finishes, I run a stored procedure that merges the temp and prod tables.

    This has worked perfectly every time for the last few weeks.
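    As a rough illustration of that pattern (the table, column, and procedure names below are made up, not taken from the post), the pre-copy script is a simple TRUNCATE and the post-copy stored procedure is a set-based MERGE:

        -- Pre-copy script on the copy activity: clear the staging table.
        TRUNCATE TABLE dbo.Pricing_Temp;

        -- Stored procedure run after the copy completes: merge staging into prod.
        CREATE OR ALTER PROCEDURE dbo.MergePricing
        AS
        BEGIN
            SET NOCOUNT ON;

            MERGE dbo.Pricing AS target
            USING dbo.Pricing_Temp AS source
                ON target.PricingId = source.PricingId
            WHEN MATCHED THEN
                UPDATE SET target.Price     = source.Price,
                           target.PriceDate = source.PriceDate
            WHEN NOT MATCHED BY TARGET THEN
                INSERT (PricingId, Price, PriceDate)
                VALUES (source.PricingId, source.Price, source.PriceDate);
        END;

    The copy activity then only ever does plain inserts into the staging table, and the upsert logic runs as one server-side MERGE instead of per-batch upserts from the sink.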

    1 person found this answer helpful.

  5. Cloud_Knight 1 Reputation point
    2022-02-11T12:12:39.223+00:00

    Any luck here? I'm also getting the same issue when I try to upsert data using the copy activity.

    ErrorCode=SqlOperationFailed,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=A database operation failed. Please search error to get more details.,Source=Microsoft.DataTransfer.ClientLibrary,''Type=System.Data.SqlClient.SqlException,Message=Execution Timeout Expired. The timeout period elapsed prior to completion of the operation or the server is not responding.
    The statement has been terminated.,Source=.Net SqlClient Data Provider,SqlErrorNumber=-2,Class=11,ErrorCode=-2146232060,State=0,Errors=[{Class=11,Number=-2,State=0,Message=Execution Timeout Expired. The timeout period elapsed prior to completion of the operation or the server is not responding.,},{Class=0,Number=3621,State=0,Message=The statement has been terminated.,},],''Type=System.ComponentModel.Win32Exception,Message=The wait operation timed out,Source=,'

    I tried the options below:

    • Unchecking the "Use TempDB" option and selecting the user DB schema (dbo) instead.
    • Setting "Write batch size" to 10000 rows.
    • Setting "Write batch timeout" to 00:30:00 (30 minutes).

    None of the options worked. :-(

    Also, I'm not able to add dynamic content to the Key columns textbox. I tried that so I could pass the keys dynamically from parameters, but unfortunately no parameters show up in the dynamic content for Key Columns.

    Can you suggest anything?

