Synapse Analytics Copy activity with dedicated SQL pool sink - pipeline fails when fault tolerance "Enable logging" is checked

RIZA ALI 0 Reputation points
2024-02-06T08:17:35.9066667+00:00

Hi All, I am getting the error below when trying to copy data from Azure Blob Storage to a dedicated SQL pool with a Synapse Analytics pipeline, using a Copy activity with Copy Command on the sink side. The error occurs when I enable the Logging option under "Enable logging" in the Copy activity's Fault tolerance settings, but the pipeline runs successfully when I disable it.

"errorCode": "2200", "message": "ErrorCode=UserErrorSqlDWCopyCommandError,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=SQL DW Copy Command operation failed with error 'Referenced external data source "(null)" not found.',Source=Microsoft.DataTransfer.ClientLibrary,''Type=System.Data.SqlClient.SqlException,Message=Referenced external data source "(null)" not found.,Source=.Net SqlClient Data Provider,SqlErrorNumber=12703,Class=16,ErrorCode=-2146232060,State=2,Errors=[{Class=16,Number=12703,State=2,Message=Referenced external data source "(null)" not found.,},],'",

Azure Synapse Analytics
An Azure analytics service that brings together data integration, enterprise data warehousing, and big data analytics. Previously known as Azure SQL Data Warehouse.

1 answer

  1. Bhargava-MSFT 29,266 Reputation points Microsoft Employee
    2024-02-06T18:52:27.48+00:00

    Hello RIZA ALI, the error message generally indicates that the external data source referenced by the copy operation in your dedicated SQL pool was not found. This can happen if the external data source is not configured correctly.
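
    To rule out a missing or misnamed external data source on your side, you can list the ones defined in the dedicated SQL pool. A minimal check using the standard catalog view (error number 12703 is the one shown in your message):

    ```sql
    -- List the external data sources defined in the dedicated SQL pool.
    -- If a COPY/PolyBase operation references a source that is not in
    -- this list, it fails with SqlErrorNumber 12703
    -- ("Referenced external data source ... not found").
    SELECT name, type_desc, location
    FROM sys.external_data_sources;
    ```

    In your case the source name reported is "(null)", which suggests the reference is being generated by the service when logging is enabled rather than by an external data source you created yourself, so this query mainly serves to confirm that nothing on your side is misconfigured.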

    Also, can you please check the supported scenarios for fault tolerance below:

    https://learn.microsoft.com/en-us/azure/data-factory/copy-activity-fault-tolerance#supported-scenarios

    Copy activity supports three scenarios for detecting, skipping, and logging incompatible tabular data:

    Incompatibility between the source data type and the sink native type.

    For example: Copy data from a CSV file in Blob storage to a SQL database whose schema defines three INT columns. The CSV file rows that contain numeric data, such as 123,456,789, are copied successfully to the sink store. However, the rows that contain non-numeric values, such as 123,456,abc, are detected as incompatible and are skipped.

    Mismatch in the number of columns between the source and the sink.

    For example: Copy data from a CSV file in Blob storage to a SQL database with a schema definition that contains six columns. The CSV file rows that contain six columns are copied successfully to the sink store. The CSV file rows that contain more than six columns are detected as incompatible and are skipped.

    Primary key violation when writing to SQL Server/Azure SQL Database/Azure Cosmos DB.

    For example: Copy data from a SQL server to a SQL database. A primary key is defined in the sink SQL database, but no such primary key is defined in the source SQL server. The duplicated rows that exist in the source cannot be copied to the sink. Copy activity copies only the first row of the source data into the sink. The subsequent source rows that contain the duplicated primary key value are detected as incompatible and are skipped.
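
    When skipping incompatible rows with logging enabled, the copy activity writes the skipped rows to a storage account you specify under "Enable logging". A minimal sketch of the relevant settings in the copy activity's pipeline JSON is below; the linked service and folder names are placeholders, not values from your pipeline:

    ```json
    "typeProperties": {
        "enableSkipIncompatibleRow": true,
        "logSettings": {
            "enableCopyActivityLog": true,
            "copyActivityLogSettings": {
                "logLevel": "Warning",
                "enableReliableLogging": false
            },
            "logLocationSettings": {
                "linkedServiceName": {
                    "referenceName": "LogStorageLinkedService",
                    "type": "LinkedServiceReference"
                },
                "path": "copyactivitylogs"
            }
        }
    }
    ```

    It may be worth checking which storage linked service your log location points to and how it authenticates, since the failure only appears once this logging path is enabled alongside the Copy Command sink.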
