HdfsBridge::recordReaderFillBuffer - Unexpected error encountered filling record reader buffer: HadoopSqlException: -- Arithmetic overflow error converting expression to data type NVARCHAR.

Sreejith 30 Reputation points
2023-06-29T04:41:47.65+00:00

Hi Team,

I am trying to load data from an Oracle table that has multiple CLOB columns into a Synapse dedicated SQL pool table with nvarchar(max) columns, and I am hitting the error in the title. The issue occurs when reading an external table that points to a Parquet file.

Interestingly, the load works fine with the Bulk insert and Copy command options of the Copy activity into a stage table. Is this a known limitation of Hadoop external tables? I haven't tried native external tables, as they are not GA yet.
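
For context, the external table involved is defined roughly like the sketch below; all object names, the data source, and the file format are placeholders rather than the actual objects:

```sql
-- Hadoop external table over the staged Parquet files (names are hypothetical).
-- The CLOB-derived columns are declared as nvarchar(max), and the error is raised
-- while PolyBase fills the record reader buffer during a read of this table.
CREATE EXTERNAL TABLE dbo.ext_OracleClobData
(
    Id          int            NOT NULL,
    ClobColumn1 nvarchar(max)  NULL,
    ClobColumn2 nvarchar(max)  NULL
)
WITH
(
    LOCATION    = '/staging/oracle_clob_data/',
    DATA_SOURCE = HadoopStorage,       -- hypothetical external data source
    FILE_FORMAT = ParquetFileFormat    -- hypothetical Parquet file format
);

-- Reading from it (for example into a stage table) is what fails:
-- INSERT INTO dbo.StageTable SELECT * FROM dbo.ext_OracleClobData;
```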

Azure Synapse Analytics
Azure Data Factory

1 answer

  1. Vinodh247 34,661 Reputation points MVP Volunteer Moderator
    2023-06-29T05:46:39.6866667+00:00

    Hi,

    Thanks for reaching out to Microsoft Q&A.

    This looks like a data type mapping error. Can you go through the following link once:

    https://learn.microsoft.com/en-us/azure/synapse-analytics/sql-data-warehouse/design-elt-data-loading#define-the-tables
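
    As an illustration of the table-definition guidance in that link, one thing worth trying is to declare the string columns on the external table with an explicit bounded length instead of nvarchar(max); the object names below are placeholders, not your actual objects:

    ```sql
    -- Sketch only: the same external table, but with the string columns sized to the
    -- actual maximum length of the CLOB data, per the linked data-loading guidance.
    CREATE EXTERNAL TABLE dbo.ext_OracleClobData
    (
        Id          int             NOT NULL,
        ClobColumn1 nvarchar(4000)  NULL,  -- size to the real maximum data length
        ClobColumn2 nvarchar(4000)  NULL
    )
    WITH
    (
        LOCATION    = '/staging/oracle_clob_data/',
        DATA_SOURCE = HadoopStorage,       -- hypothetical
        FILE_FORMAT = ParquetFileFormat    -- hypothetical
    );
    ```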

    Please upvote and accept as answer if the reply was helpful; this will benefit other community members who run into the same issue.

