How to handle INT64 values from Parquet when copying to a DateTime column in Azure Synapse using Copy Activity?

HARISH LOGANATHAN 20 Reputation points
2026-02-28T05:57:04.1166667+00:00

I have a Parquet file stored in ADLS. One of the columns in the Parquet file is stored as INT64.

When I try to copy this data from ADLS into an Azure Synapse table using Azure Synapse Analytics Copy Activity, I get the following error:

"errorCode": "2200",
"message": "ErrorCode=TypeConversionNotSupported Data Types,
Exception occurred when converting value '1768842423000000'
for column name 'UP_DATE' from type 'Int64' (precision:-1, scale:-1)
to type 'DateTime' (precision:255, scale:7). Additional info:",
"failureType": "UserError",
"target": "COPY CDZ to DDZ",
"details": []


The target Synapse table column is of type DateTime, but the Parquet file contains the value as Int64.

Is there any configuration in Copy Activity that can automatically convert this value into a proper DateTime format, or any recommended workaround?

Azure Synapse Analytics

Answer accepted by question author
  1. Smaran Thoomu 33,920 Reputation points Microsoft External Staff Moderator
    2026-03-02T09:44:08.3933333+00:00

    Hi @HARISH LOGANATHAN
    The error happens because Synapse Copy Activity cannot automatically convert an INT64 value into a DateTime column.

    The value 1768842423000000 looks like a Unix timestamp stored in microseconds. Copy Activity does not automatically interpret that as a date.
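As a quick sanity check (a minimal Python sketch, separate from Copy Activity), dividing the value by 1,000,000 yields epoch seconds that decode to a plausible recent date:

```python
from datetime import datetime, timezone

raw = 1768842423000000          # INT64 value from the error message (epoch microseconds)
seconds = raw // 1_000_000      # microseconds -> seconds since 1970-01-01 UTC
dt = datetime.fromtimestamp(seconds, tz=timezone.utc)
print(dt.isoformat())           # 2026-01-19T17:07:03+00:00
```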

    There is no direct setting in Copy Activity to convert INT64 → DateTime automatically.

    Here are the recommended options:

    Option 1 – Use Mapping Data Flow

    Instead of Copy Activity, use a Mapping Data Flow and add a derived column with conversion logic like:

    • The value is in microseconds since the Unix epoch → divide by 1,000 to get milliseconds
    • toTimestamp() interprets a long argument as milliseconds since the epoch

    Example logic:

    toTimestamp(UP_DATE / 1000l)
    

    This allows proper conversion before writing to Synapse.

    Option 2 – Stage as BIGINT first

    1. Create the target column in Synapse as BIGINT.
    2. Load the data as-is.
    3. Use a SQL statement to populate a proper DateTime column from the Unix epoch (the table and column names below are placeholders):

    Example:

    UPDATE dbo.YourTable
    SET UP_DATE_DT = DATEADD(SECOND, CAST(UP_DATE / 1000000 AS INT), '1970-01-01');

    Then drop or rename the columns as needed.
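The DATEADD arithmetic can be mirrored in Python to confirm the expected result (a sketch; the UP_DATE value is taken from the error message, and integer division truncates any sub-second remainder):

```python
from datetime import datetime, timedelta

up_date = 1768842423000000  # BIGINT value staged as-is (epoch microseconds)
# Equivalent of DATEADD(SECOND, UP_DATE / 1000000, '1970-01-01')
converted = datetime(1970, 1, 1) + timedelta(seconds=up_date // 1_000_000)
print(converted)  # 2026-01-19 17:07:03
```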

    Option 3 – Convert before landing

    If possible, convert the column to proper timestamp format before writing the Parquet file.

    Hope this helps. Let me know if you have any questions.

