Copy Activity - Azure Sql to Oracle

Vivek, Anand 20 Reputation points
2023-09-19T07:16:44.6433333+00:00

Hi,

I am using "Copy Activity" to load the data into ORACLE from Azure SQL.

In Azure SQL datatype is NVARCHAR(4000) and in ORACLE datatype is VARCHAR2(4000 CHAR).

In this case, the Copy Activity fails with the following error. Any suggestions on how to resolve this?

Failure happened on 'Sink' side. ErrorCode=UserErrorOdbcOperationFailed,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=ERROR [HY000] [Microsoft][ODBC Oracle Wire Protocol driver]Data type for parameter 10 has changed since first SQLExecute call.,Source=Microsoft.DataTransfer.ClientLibrary.Odbc.OdbcConnector,''Type=Microsoft.DataTransfer.ClientLibrary.Odbc.Exceptions.OdbcException,Message=ERROR [HY000] [Microsoft][ODBC Oracle Wire Protocol driver]Data type for parameter 10 has changed since first SQLExecute call.,Source=msora28.dll,'

Note: I tried changing the datatype in Oracle from VARCHAR2(4000 CHAR) to NVARCHAR2(2000), but now I get length errors instead, because some source values are longer than 2000 characters.

Azure Synapse Analytics
Azure Data Factory

Accepted answer
  1. Justin Hyland 80 Reputation points
    2023-11-01T18:17:00.35+00:00

    I came across this thread while researching this exact problem. I believe it is an issue with the underlying ODBC Oracle Wire Protocol driver, which is used by other platforms as well. Here is a reference to the same issue someone hit with WebSphere MQ: the first SQLExecute call succeeds (similar to your case, where a large batch size means only one call is needed), but the second call fails (when ADF tries to insert the second batch).

    http://www.mqseries.net/phpBB/viewtopic.php?t=54590&sid=b637c28a5d1dbae97bdc89576749ad9e

    Adding the following to the additional connection properties resolved my issue (I don't know whether the property name is case sensitive):

    Property Name: WorkArounds

    Value: 536870912

    Here is a reference to the WorkArounds values:

    https://docs.progress.com/bundle/datadirect-hybrid-data-pipeline-46/page/WorkArounds-and-WorkArounds2-options.html
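For reference, this is roughly where the property ends up in the Oracle linked service definition. This is a hypothetical sketch, not my exact JSON: host, service name, user, and integration runtime names are placeholders, and additional connection properties set in the ADF UI are serialized into the connection string like this:

```json
{
  "name": "OracleSink",
  "properties": {
    "type": "Oracle",
    "typeProperties": {
      "connectionString": "host=myhost;port=1521;serviceName=myservice;user id=myuser;WorkArounds=536870912",
      "password": {
        "type": "AzureKeyVaultSecret",
        "store": { "referenceName": "MyKeyVault", "type": "LinkedServiceReference" },
        "secretName": "oracle-password"
      }
    },
    "connectVia": {
      "referenceName": "MySelfHostedIR",
      "type": "IntegrationRuntimeReference"
    }
  }
}
```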


1 additional answer

  1. Amira Bedhiafi 23,486 Reputation points
    2023-09-19T16:54:44.9866667+00:00

    I think your issue is related to data type mismatches or inconsistencies during the data transfer. The copy can fail when the data contains a character that cannot be represented in the target VARCHAR2 column.

    Ensure that the Oracle database's character set supports the characters present in your Azure SQL data. Unicode characters might not be correctly represented in databases with non-Unicode character sets. For instance, AL32UTF8 is a common Unicode character set for Oracle databases.
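You can verify the database and national character sets with a query against the standard NLS dictionary view (requires a user with access to it):

```sql
-- Check the Oracle database character set and national character set
SELECT parameter, value
  FROM nls_database_parameters
 WHERE parameter IN ('NLS_CHARACTERSET', 'NLS_NCHAR_CHARACTERSET');
```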

    Also, check whether any rows in your Azure SQL data have NVARCHAR(4000) values that exceed 4,000 bytes once encoded in Oracle's database character set. A VARCHAR2(4000 CHAR) column is still subject to a 4,000-byte hard limit in standard mode (unless MAX_STRING_SIZE is set to EXTENDED). If so, you'll need to trim, split, or otherwise manage those values.
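The character-versus-byte distinction is the crux here: NVARCHAR(4000) counts characters, while the VARCHAR2 cap is in bytes. An illustrative Python sketch (not part of any pipeline) of how a string that fits the 4,000-character limit can overflow 4,000 bytes once encoded as AL32UTF8 (UTF-8):

```python
# Illustrative only: why a value that fits NVARCHAR(4000) in Azure SQL
# can overflow a VARCHAR2(4000 CHAR) column with a 4000-byte hard cap.

ascii_value = "a" * 4000      # 4000 characters, 1 byte each in UTF-8
cjk_value = "\u6f22" * 4000   # 4000 characters of U+6F22, 3 bytes each in UTF-8

print(len(ascii_value), len(ascii_value.encode("utf-8")))  # 4000 4000  -> fits
print(len(cjk_value), len(cjk_value.encode("utf-8")))      # 4000 12000 -> exceeds 4000 bytes
```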

    Instead of directly copying data from Azure SQL to Oracle, consider copying the data into an intermediate format like Parquet or CSV. From there, you can load the data into Oracle. This might give you more flexibility in terms of data transformation and cleansing.
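A minimal, standard-library-only sketch of that staging idea, using CSV as the intermediate format. In ADF you would normally implement this with two copy activities through Blob or ADLS storage; the column names and in-memory buffer here are purely illustrative:

```python
import csv
import io

# Rows as they might come from the Azure SQL source; "comment" stands in for
# a hypothetical NVARCHAR(4000) column holding Unicode text.
rows = [
    {"id": "1", "comment": "plain ascii text"},
    {"id": "2", "comment": "\u6f22\u5b57" * 10},  # CJK text
]

# Stage to CSV (in ADF this would be a file in Blob/ADLS, not a StringIO buffer).
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["id", "comment"])
writer.writeheader()
writer.writerows(rows)

# Read the staged data back; this is the point where you could trim, split,
# or otherwise clean over-long values before loading into Oracle.
buf.seek(0)
staged = list(csv.DictReader(buf))
print(staged == rows)  # round trip preserves the data
```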

    In the ADF copy activity, you can define an explicit data type mapping: map the source NVARCHAR column to the Oracle VARCHAR2(4000 CHAR) column, and make sure the data does not contain characters that the Oracle column's character set cannot represent.
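As a sketch, an explicit mapping lives in the copy activity's translator section. The column names below are hypothetical; the shape follows ADF's TabularTranslator:

```json
{
  "type": "Copy",
  "typeProperties": {
    "translator": {
      "type": "TabularTranslator",
      "mappings": [
        {
          "source": { "name": "Comment", "type": "String" },
          "sink": { "name": "COMMENT" }
        }
      ]
    }
  }
}
```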

