Getting error: Unexpected error encountered filling record reader buffer: HadoopSqlException: String or binary data would be truncated in Azure Synapse

Anonymous
2022-06-29T23:43:46.883+00:00

Hi all, I need assistance here. I am loading data from ADLS to a Dedicated SQL Pool. The data in ADLS is spread across multiple JSON files, which I read using a wildcard in the data flow activity of my Azure Synapse pipeline. ![216402-image.png][1] The job runs every time but ends with the error below at the sink: "Job failed due to reason: at Sink 'sinkJob': shaded.msdataflow.com.microsoft.sqlserver.jdbc.SQLServerException: HdfsBridge::recordReaderFillBuffer - Unexpected error encountered filling record reader buffer: HadoopSqlException: String or binary data would be truncated." Any suggestions? Thanks in advance. [1]: /api/attachments/216402-image.png?platform=QnA


Accepted answer
  1. HimanshuSinha-msft 19,476 Reputation points Microsoft Employee
    2022-07-01T22:41:45.347+00:00

    Hello anonymous user,
    Thanks for the question and for using the MS Q&A platform.
    As we understand it, the ask here is to get rid of the error; please let us know if that is not accurate.
    Looking at the mapping data flow above, it seems you are reading some files and, on the sink side, inserting them into the dedicated SQL pool. The error you shared means that one of the target columns has a width smaller than the incoming data. Had this been a Copy activity, you could have used the fault tolerance option to get past this. In a data flow, focus on the varchar columns and either increase the column width (e.g. varchar(100) to varchar(150)) or use the substring function in a derived column transformation to trim the incoming data to the column size.

    Please do let me know if you have any queries.
    Thanks
    Himanshu


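Before widening columns or adding substring() calls as the answer above suggests, it can help to know which incoming fields actually exceed the target column widths. A minimal Python sketch that profiles local copies of the source JSON files for the longest string value per top-level field (the folder path, file pattern, and flat record layout are assumptions; adjust them to your own data):

```python
import json
from pathlib import Path

def max_field_lengths(folder: str, pattern: str = "*.json") -> dict:
    """Return the longest string value observed for each top-level key
    across all JSON files in `folder` matching `pattern`.

    Each file is expected to contain either a single JSON object or a
    list of objects; nested values and non-string types are ignored.
    """
    lengths: dict = {}
    for path in Path(folder).glob(pattern):
        with open(path, encoding="utf-8") as f:
            records = json.load(f)
        if isinstance(records, dict):  # single-object file
            records = [records]
        for rec in records:
            for key, value in rec.items():
                if isinstance(value, str):
                    lengths[key] = max(lengths.get(key, 0), len(value))
    return lengths
```

Comparing the returned maximums against the sink table's varchar sizes shows exactly which columns need widening (or which fields need trimming in a derived column).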

1 additional answer

Sort by: Most helpful
  1. Mutthuluru Yashwanth Sai 0 Reputation points
    2023-01-24T03:27:57.3833333+00:00

    Hi,

    I was getting the same error too.

    PolybaseOperationFailed,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=Error happened when loading data into SQL Data Warehouse. Operation: 'Polybase operation'.,Source=Microsoft.DataTransfer.ClientLibrary,''Type=System.Data.SqlClient.SqlException,Message=HdfsBridge::recordReaderFillBuffer - Unexpected error encountered filling record reader buffer: HadoopSqlException: String or binary data would be truncated.,Source=.Net SqlClient Data Provider,SqlErrorNumber=107090,Class=16,ErrorCode=-2146232060,State=1,Errors=[{Class=16,Number=107090,State=1,Message=HdfsBridge::recordReaderFillBuffer - Unexpected error encountered filling record reader buffer: HadoopSqlException: String or binary data would be truncated.,},],

    Using Bulk Insert I am able to load the data, but it fails with PolyBase. Can someone help?

