Issue while copying data from Oracle to Azure SQL using ADF Copy Activity

man 0 Reputation points
2025-11-13T07:30:59.1533333+00:00

Hello ADF Support Team,

We are facing an issue while copying data from Oracle to Azure SQL using ADF Copy Activity. The pipeline fails or produces incorrect numeric values due to high-precision decimals in Oracle.

Details:

Source (Oracle): TABLE_FIN

Target (Azure SQL): TABLE_FIN_TEST

Oracle column types: NUMBER (in the ADF mapping this is converted to decimal with precision 38, scale 18)

Azure column types: NUMERIC(38,18)

Error encountered:

Operation on target TABLE_FIN failed:

ErrorCode=TypeConversionFailure,

Exception occurred when converting value '37.33333333333333333333333333333333333333' for column name 'RWEEK' from type 'Decimal' (precision:256, scale:130) to type 'Decimal' (precision:256, scale:130).

Additional info: Specified cast is not valid.
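To see why this particular value fails, here is a small Python sketch using the standard decimal module (illustrative only, not part of the pipeline): the value quoted in the error message carries far more fractional digits than a scale of 18 allows, so it cannot land in NUMERIC(38, 18) without being explicitly rounded first.

```python
from decimal import Decimal, ROUND_HALF_UP

# The exact value quoted in the ADF error message.
value = Decimal('37.33333333333333333333333333333333333333')

# Its number of fractional digits is well above the target scale of 18.
print(-value.as_tuple().exponent)

# Quantizing to 18 decimal places yields a value that fits NUMERIC(38, 18).
fitted = value.quantize(Decimal('1e-18'), rounding=ROUND_HALF_UP)
print(fitted)  # 37.333333333333333333
```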

What we tried:

Created a new Azure SQL table NNTABLE_FIN_TEST with NUMERIC(38,18) for the numeric columns.

Used ROUND(column, 18) in the Oracle source query to limit the decimal places so the copy would succeed:

SELECT
    EMPID,
    EMPNAME,
    DESIGNATION,
    LOCATION,
    ROUND(HOURSPERWEEK, 18) AS HOURSPERWEEK,
    ROUND(TOTALHOURS, 18) AS TOTALHOURS,
    ROUND(RATE, 18) AS RATE,
    ROUND(COST, 18) AS COST,
    ROUND(WEEK, 18) AS WEEK,
    MANAGER,
    DEPARTMENT
FROM TABLE_FIN;

Result: the pipeline runs successfully, but some numeric values differ from the Oracle source. For example:

Oracle value                   Azure result
3.4444444555666666666668888    3.000000000000000000
40.22222222222222222222        40.000000000000000000

Row counts are also slightly different in some cases.
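For reference, a quick check with exact decimal arithmetic (a Python sketch run outside ADF) shows what rounding the mismatched sample values above to 18 places should produce; the observed Azure results (3.000... and 40.000...) differ from this, which suggests the precision loss happens during the copy rather than in ROUND itself.

```python
from decimal import Decimal, ROUND_HALF_UP

SCALE_18 = Decimal('1e-18')  # 18 decimal places, matching NUMERIC(38, 18)

# Source values from the mismatch examples reported above.
for raw in ('3.4444444555666666666668888', '40.22222222222222222222'):
    rounded = Decimal(raw).quantize(SCALE_18, rounding=ROUND_HALF_UP)
    print(raw, '->', rounded)
# 3.4444444555666666666668888 -> 3.444444455566666667
# 40.22222222222222222222 -> 40.222222222222222222
```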

Request for support:

Confirm whether there is any supported way to preserve all Oracle decimal precision when writing to Azure NUMERIC(38,18) via ADF.

Advise on best practices to avoid data mismatches while copying high-precision numeric values.

Confirm whether any ADF connector or mapping configuration can help prevent rounding/truncation.

We are in production, and preserving numeric accuracy is critical. We applied ROUND(..., 18) in the source query to make the copy safe, but we still see mismatched values. Please advise the safest approach.

Additional Info:

source table: TABLE_FIN

target table: TABLE_FIN_TEST

pipeline: ADF Copy Activity

Azure SQL version: [Your Azure SQL version]

Oracle version: [Your Oracle version]

Please provide the correct queries and method we need to fix this.
One more note: we recently upgraded the linked service from version 1.0 to 2.0; apart from this pipeline, all others are running fine.

Azure Data Factory

2 answers

  1. Pratyush Vashistha 5,130 Reputation points Microsoft External Staff Moderator
    2025-11-19T08:39:10.2+00:00

    Hello man,

    Thank you for reaching out on the Microsoft Q&A portal and for providing such a detailed description of the issue. I understand that you're facing a critical data precision issue when copying high-precision numeric data from Oracle to Azure SQL. It's certainly a common challenge, but we can definitely work through it.

    The error TypeConversionFailure and the subsequent data mismatch, even after using ROUND, point to an issue where the data type is being misinterpreted at some point during the data movement process. The Oracle NUMBER type can have a floating precision that doesn't always map perfectly to the fixed NUMERIC(38, 18) in Azure SQL, especially as it passes through the Azure Data Factory (ADF) integration runtime.

    When ADF reads the data, its internal engine infers a very high-precision data type (as seen in your error message), which then fails to convert to the target's fixed precision and scale. Your ROUND function was a good idea, but the incorrect results suggest that a loss of precision is happening after the query is executed, likely due to an implicit cast to a less precise type like a float or double within the ADF pipeline before it reaches the sink.

    To ensure a successful and accurate data copy, we need to be more explicit about the data type before it even leaves the Oracle source.

    Explicit Casting in the Source Query

    Instead of relying on ADF to correctly interpret the rounded number, you should explicitly cast the column to the exact precision and scale that matches your Azure SQL target table.

    Modify your source query in the ADF Copy Activity to use the CAST function. This forces Oracle to format the data correctly before sending it to ADF.

    1. Update the Source Query in ADF
      • Go to your ADF Copy Activity settings.
      • In the "Source" tab, update your query to cast each numeric column:

         SELECT
             EMPID,
             EMPNAME,
             DESIGNATION,
             LOCATION,
             CAST(HOURSPERWEEK AS NUMBER(38, 18)) AS HOURSPERWEEK,
             CAST(TOTALHOURS AS NUMBER(38, 18)) AS TOTALHOURS,
             CAST(RATE AS NUMBER(38, 18)) AS RATE,
             CAST(COST AS NUMBER(38, 18)) AS COST,
             CAST(WEEK AS NUMBER(38, 18)) AS WEEK,
             MANAGER,
             DEPARTMENT
         FROM TABLE_FIN;

      By using CAST(... AS NUMBER(38, 18)), you are instructing the Oracle database itself to handle the rounding and truncation. The data is delivered to ADF in the precise NUMBER(38, 18) format, which maps cleanly to Azure SQL's NUMERIC(38, 18), eliminating any ambiguity or risk of incorrect type inference by the ADF runtime.

    Here are a few other things to verify to ensure your pipeline is robust.

    • In the Mapping tab of your Copy Activity, ensure that the data types are correctly inferred. After changing the source query with the CAST function, you can click "Import schemas" again. The source type should now be recognized with the correct precision, and it should map directly to the Decimal type with a precision of 38 and a scale of 18 on the sink side.
    • If you are using a staging account or PolyBase for this copy operation, be aware that these intermediate steps have their own data type conversion rules that can sometimes cause issues. The explicit CAST in the source query is the best way to mitigate this, as it prepares the data correctly before it ever hits the staging environment. For a quick test, you could try running the copy without staging to see if the behavior changes, which would confirm if the issue lies with the staging process.

    I am confident that implementing the CAST function in your source query will resolve both the pipeline failure and the data mismatch issues you are observing.

    Thank you again for asking your question. Please let us know if this solution works for you or if you have any further questions!


    Please "Accept as Answer" or click 'Thumbs Up YES' if the answer provided is useful, so that you can help others in the community looking for remediation for similar issues.

    Thanks

    Pratyush



  2. Manoj Kumar Boyini 1,810 Reputation points Microsoft External Staff Moderator
    2025-11-22T10:34:48.8633333+00:00

    Hi man,

    Thanks so much for explaining the situation in detail, and for your patience while working through this! Pratyush Vashistha's advice about using the CAST function in your Oracle query is spot-on; it's a great way to make sure your numbers are formatted correctly before you move them with Azure Data Factory.

    Just to help you cover all the bases and avoid any surprises, here are a few more tips:

    1. After updating your Oracle query, check the Mapping tab in ADF. Make sure your columns are lined up as NUMBER(38, 18) on the source side and NUMERIC(38, 18) on the destination.
    2. Before copying all your data, use the Data Preview in ADF to quickly spot any precision issues. This lets you catch problems before the full load.
    3. If you are using staging tables in your pipeline, those can sometimes change column types. Try running without staging as a test; if things work better, you've found where the issue is.
    4. Debug mode is useful here: testing with a small sample helps catch any unexpected changes, especially with tricky numeric types.
    5. Keep using the CAST in your Oracle queries so the numbers are rounded and formatted just the way Azure SQL likes.
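To make the verification in step 2 concrete, one lightweight approach is to compare row counts and exact column sums between the source and target after a test copy. A minimal Python sketch (purely illustrative; in practice the two lists would come from queries against TABLE_FIN and TABLE_FIN_TEST):

```python
from decimal import Decimal

def reconcile(source_values, target_values):
    """Compare row counts and exact sums of one numeric column
    pulled from the source and target tables."""
    src_sum = sum(source_values, Decimal(0))
    tgt_sum = sum(target_values, Decimal(0))
    return {
        'rows_match': len(source_values) == len(target_values),
        'sums_match': src_sum == tgt_sum,
        'difference': src_sum - tgt_sum,
    }

# Example using the mismatched values reported in the question:
source = [Decimal('3.4444444555666666666668888'), Decimal('40.22222222222222222222')]
target = [Decimal('3.000000000000000000'), Decimal('40.000000000000000000')]
print(reconcile(source, target))  # sums_match is False, flagging the precision loss
```

Exact Decimal sums (rather than float sums) matter here, since the mismatch lives entirely in the low-order digits that float arithmetic would discard.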

    Try out these steps and reach out if you have any other questions. Happy to help make sure your numbers come across exactly right!

