Unable to write data into a DWH table from Databricks

MadhuVamsi-2459 32 Reputation points
2022-08-29T07:18:05.17+00:00

Hi team,

I have created an internal table with the same data types as the external table. For the external table, I am able to load the data without any issue, but when I declare the internal table with the same lengths and types and push the data into the DWH internal table from Databricks, I get the following error:

com.databricks.spark.sqldw.SqlDWSideException: Azure Synapse Analytics failed to execute the JDBC query produced by the connector.

Underlying SQLException(s):   - com.microsoft.sqlserver.jdbc.SQLServerException: HdfsBridge::recordReaderFillBuffer - Unexpected error encountered filling record reader buffer: HadoopSqlException: String or binary data would be truncated. [ErrorCode = 107090] [SQLState = S0001]
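For context, the write goes through the Azure Synapse connector named in the stack trace (com.databricks.spark.sqldw). A minimal sketch of that write path, with placeholder server, storage, and table names, looks like this:

```python
# Sketch of the write into the DWH internal table via the Synapse (sqldw) connector.
# All connection details and names below are placeholders, not the actual values.
(df.write
    .format("com.databricks.spark.sqldw")
    .option("url", "jdbc:sqlserver://<server>.database.windows.net:1433;database=<dwh>")
    .option("forwardSparkAzureStorageCredentials", "true")
    .option("dbTable", "dbo.<internal_table>")
    .option("tempDir", "abfss://<container>@<storage>.dfs.core.windows.net/tmp")
    .mode("append")
    .save())
```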

Please do let me know.

Thanks


1 answer

  1. Suba Balaji 11,186 Reputation points
    2022-08-29T08:57:49.287+00:00

    Hi @Anonymous,

    Thanks for posting your query on the Microsoft Q&A platform.

    Looking at the error, it seems the source is sending more characters than the allowed column length in the sink.

    Unexpected error encountered filling record reader buffer: HadoopSqlException: String or binary data would be truncated. [ErrorCode = 107090] [SQLState = S0001]

    Did you check whether all the data got loaded into your external table, or whether some rows were excluded because of the truncation issue? Basically, did you compare the record counts? A quick check from the Databricks side is sketched below.

    Please let us know.
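    To confirm from the Databricks side, one option is to compute the longest value in each string column of the source DataFrame and compare it with the NVARCHAR/VARCHAR lengths declared on the internal table; a rough sketch, assuming the DataFrame being written is called df:

    ```python
    from pyspark.sql import functions as F

    # Longest value per string column in the source DataFrame, to compare with
    # the column lengths declared on the internal table in the DWH.
    string_cols = [f.name for f in df.schema.fields
                   if f.dataType.simpleString() == "string"]
    df.select([F.max(F.length(F.col(c))).alias(c) for c in string_cols]) \
      .show(truncate=False)

    # Source-side record count, to compare with what landed in the external table.
    print(df.count())
    ```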