Integration services package fails with error "Unicode data is odd byte size for column ... Should be an even byte size"

Greg Booth 1,296 Reputation points
2023-05-10T22:27:24.4066667+00:00

My SSIS package uses an Import Column transformation to read from files containing JSON (Unicode) and store the JSON in a column.

The Import Column transformation has one output column, called "filecontents", of type Unicode text stream [DT_NTEXT].

The end of the data flow is an OLE DB destination, with the "filecontents" column mapped to a destination column of type NTEXT in a table in a SQL Server 2016 database.

When I run the package, it fails with the following error:

[OLE DB Destination [12]] Error: SSIS Error Code DTS_E_OLEDBERROR. An OLE DB error has occurred. Error code: 0x80004005.

An OLE DB record is available. Source: "Microsoft SQL Server Native Client 11.0" Hresult: 0x80004005 Description: "Unicode data is odd byte size for column 12. Should be even byte size.".

How do I resolve this error?

The JSON in each file is more than 8000 bytes.

The source JSON files are created by a PowerShell script that grabs the JSON from an API and writes it to a file using Out-File without specifying an encoding.
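A likely root cause, sketched below in Python (the sample JSON string is my own illustration, not from the original files): a DT_NTEXT stream is interpreted as UTF-16LE, where every character occupies a whole number of 2-byte units, so the file's byte count must be even. A file written as UTF-8 (the Out-File default in PowerShell 7+, whereas Windows PowerShell 5.1 defaults to UTF-16LE) can easily have an odd byte count, which produces exactly this "odd byte size" rejection:

```python
# Why DT_NTEXT (UTF-16LE) rejects a UTF-8 file with an odd byte count.
json_text = '{"name": "café"}'  # the "é" is 2 bytes in UTF-8

utf8_bytes = json_text.encode("utf-8")
print(len(utf8_bytes))          # 17 bytes -- odd

# UTF-16LE uses 2-byte code units, so the byte count is always even.
utf16_bytes = json_text.encode("utf-16-le")
print(len(utf16_bytes))         # 32 bytes -- even

# Reading the UTF-8 bytes as UTF-16LE fails on the dangling odd byte.
try:
    utf8_bytes.decode("utf-16-le")
except UnicodeDecodeError as e:
    print("decode failed:", e.reason)
```

If this is the cause, re-writing the source files with an explicit UTF-16LE encoding (`Out-File -Encoding Unicode` in PowerShell) should produce a stream the DT_NTEXT column accepts.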

SQL Server Integration Services

2 answers

  1. Vahid Ghafarpour 20,030 Reputation points
    2023-07-27T21:48:53.8066667+00:00

    Did you try adding a Data Conversion Transformation to your SSIS package after the Import Column Transformation? It ensures the data is converted to the appropriate data type before it is loaded into the OLE DB destination.


  2. Yitzhak Khabinsky 25,481 Reputation points
    2023-07-27T21:59:42.05+00:00

    Hi @Greg Booth,

    The NTEXT data type is deprecated. I would suggest using the NVARCHAR(MAX) data type to store JSON instead.
