Truncated digits for numerical values in Azure Data Factory pipelines

Stanislav Slovik - Administrator 20 Reputation points
2025-05-27T08:00:14.6866667+00:00

Hello,
When I receive numerical values in a JSON response from an API endpoint, Azure Data Factory converts them to scientific notation. At the same time, it also truncates the last digits; see the following example:
Value exposed by the API endpoint as part of the JSON response:
0.000016268687991399651977414

But ADF interprets it as:
1.626868799139965E-5

So, we are able to format it back into a full decimal number, but the last digits (1977414) are truncated, because the value 1.626868799139965E-5 no longer contains them.
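For context, the truncation is consistent with the value being parsed into an IEEE-754 double, which holds only about 15-17 significant decimal digits; the original literal above has 23. A minimal Python sketch (outside of ADF, just to illustrate where the digits are lost) shows this:

```python
import json
from decimal import Decimal

s = "0.000016268687991399651977414"  # the sample value from the API response

# The default JSON parse routes numbers through a 64-bit double,
# which cannot hold all 23 significant digits of the literal.
as_float = json.loads('{"v": ' + s + '}')["v"]
assert Decimal(repr(as_float)) != Decimal(s)  # trailing digits were lost

# Parsing the same literal into an arbitrary-precision Decimal keeps every digit,
# because parse_float receives the original text of the number.
as_decimal = json.loads('{"v": ' + s + '}', parse_float=Decimal)["v"]
assert as_decimal == Decimal(s)
```

This is why the fix has to happen before the value is ever stored as a binary float.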

I've read a couple of articles on how to deal with this kind of issue:

  1. Changing the API endpoint to return those values as strings (unfortunately, not possible for us)
  2. Using Data Flows in ADF pipelines (might be an option, but it adds unnecessary complexity to the whole pipeline)

Scientific notation is actually fine for us, as long as ADF keeps all the digits. Is there a setting in ADF for the precision of numerical values, so that instead of
1.626868799139965E-5
we get
1.6268687991399651977414E-5

Thanks in advance for your suggestions.

Azure Data Factory
An Azure service for ingesting, preparing, and transforming data at scale.

Accepted answer
  1. Dileep Raj Narayan Thumula 255 Reputation points Microsoft External Staff Moderator
    2025-06-03T08:46:03.5333333+00:00

    Hello @Stanislav Slovik - Administrator
    My observations are:
    This behavior is a limitation of the Copy activity. To preserve the full precision of numeric values from the REST API, use a Mapping Data Flow instead; it retains the original value from the source without expanding or truncating the decimal.

    Reference: Value mismatch using copy data activity in ADF


    Based on this, if the decimal values have a precision greater than 28 digits (BigDecimal), convert the column's data type to a string using the toString(<ColumnName>) expression in a Derived Column transformation.

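    Outside of ADF, the same idea behind the toString() cast — capturing the textual digits before the value ever becomes a binary float — can be sketched in Python with the json module's parse_float hook (the "rate" field name is made up for illustration):

```python
import json

# Hypothetical payload using the sample value from the question.
payload = '{"rate": 0.000016268687991399651977414}'

# Route every JSON number containing a fractional part through str
# instead of float, so the exact textual digits survive the parse.
doc = json.loads(payload, parse_float=str)
assert doc["rate"] == "0.000016268687991399651977414"
```

    The value can then be stored or forwarded as a string, exactly as the Derived Column cast does inside the Data Flow.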

    1 person found this answer helpful.

1 additional answer

  1. Stanislav Slovik - Administrator 20 Reputation points
    2025-06-04T07:25:56.87+00:00

    Hi @Dileep Raj Narayan Thumula, thanks for your suggested solution.
    So, Data Flow is currently the only activity type in Azure Data Factory where the numerical precision coming from an API endpoint can be preserved, once the column type is converted to a string.
    It's also important to mention that Data Flow is supported only on the Azure integration runtime, not on the Self-Hosted Integration Runtime.
    That means that if you have internal resources (API endpoints) you would like to access through a Data Flow, you need to use the Azure runtime (Managed Virtual Network subtype) and configure a VNet, subnet, and Private Endpoint.
    Please confirm, thanks.

