Data conversion failed

Nicky 116 Reputation points
2020-10-22T08:35:25.407+00:00

Hi

I am exporting data from SQL Server to CSV. My table contains special characters that I need to keep, such as 'ồ' (and many more, in different columns), but the SSIS task throws an error while exporting the data.

The data type in SQL Server for this column is nvarchar(50), and in the flat file I am using Unicode string [DT_WSTR]. If I check Unicode on the General tab, the data gets exported, but I get a comma-delimited file. When I do not check the Unicode checkbox, my CSV gives perfect data but fails on these special characters.

I am using code page 1252 (ANSI - Latin 1).
Error: [Flat File Destination 1 [34]] Error: Data conversion failed. The data conversion for column "Column10" returned status value 4 and status text "Text was truncated or one or more characters had no match in the target code page.".

Thanks !


Accepted answer
  1. Nicky 116 Reputation points
    2020-10-23T15:30:57.693+00:00

    anonymous user: If I do that, I get the error 'column and text can't be the same', as you suggested the text be ''.


9 additional answers

  1. Stefan Hoffmann 621 Reputation points
    2020-10-22T08:48:59.863+00:00

    Export to UTF-8 instead; this avoids a lot of trouble. Also consider using XML.

    As for the error: it is unavoidable, because you are trying to export data that has no representation in the destination code page, so any way of forcing the export to run will lead to data loss.
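
    For illustration, here is a small stand-alone C# snippet (separate from SSIS; the sample text is arbitrary) that shows why code page 1252 drops this character while UTF-8 keeps it:

        using System;
        using System.Text;

        class CodePageDemo
        {
            static void Main()
            {
                // On .NET Core / .NET 5+ this needs the System.Text.Encoding.CodePages package;
                // on the .NET Framework, code page 1252 is available out of the box.
                Encoding.RegisterProvider(CodePagesEncodingProvider.Instance);

                string value = "Hồ Chí Minh";   // arbitrary sample text containing 'ồ'

                // Code page 1252 (ANSI - Latin 1): 'ồ' has no mapping, so it is replaced with '?'.
                Encoding cp1252 = Encoding.GetEncoding(1252,
                    new EncoderReplacementFallback("?"),
                    new DecoderReplacementFallback("?"));
                string roundTrip = cp1252.GetString(cp1252.GetBytes(value));
                Console.WriteLine(roundTrip);                        // "H? Chí Minh" -> data loss

                // UTF-8 (code page 65001): every character survives the round trip.
                byte[] utf8 = Encoding.UTF8.GetBytes(value);
                Console.WriteLine(Encoding.UTF8.GetString(utf8));    // "Hồ Chí Minh"
            }
        }

    This silent replacement is the same loss that SSIS reports as "no match in the target code page" when the destination uses code page 1252.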


  2. Monalv-MSFT 5,901 Reputation points
    2020-10-22T09:48:49.223+00:00

    Hi @InderjeetKaur-3272 ,

    If I check Unicode on the General tab, the data gets exported, but I get a comma-delimited file.

    When we use unicode string [DT_WSTR], we should check Unicode in the flat file connection manager.

    Please refer to Unicode And Non-Unicode.

    Best Regards,
    Mona



  3. Nicky 116 Reputation points
    2020-10-22T10:08:43.363+00:00

    My main motive is to export this data to SharePoint without using a third-party tool. I was able to achieve this with an xls file, but xls converts all numeric data to text.

    So I considered using a csv file, which fails on the special characters. Is there a way I can show that data in separate columns (not comma delimited) when I check Unicode?

    I export the data to a csv on my local machine and wrote some C# code to upload the csv to SharePoint.
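
    A minimal sketch of that upload step, using the SharePoint CSOM client libraries and assuming SharePoint Online with classic username/password authentication (not necessarily the exact code used here; the site URL, library name, account and file path are placeholders):

        using System.Security;
        using Microsoft.SharePoint.Client;   // CSOM: Microsoft.SharePoint.Client + Microsoft.SharePoint.Client.Runtime

        class CsvUploader
        {
            static void Main()
            {
                // Placeholders: replace with your own site, library, file and account.
                string siteUrl = "https://contoso.sharepoint.com/sites/Reports";
                string libraryTitle = "Documents";
                string localCsvPath = @"C:\Export\data.csv";

                var password = new SecureString();
                foreach (char c in "password-goes-here") password.AppendChar(c);

                using (var ctx = new ClientContext(siteUrl))
                {
                    ctx.Credentials = new SharePointOnlineCredentials("user@contoso.com", password);

                    var library = ctx.Web.Lists.GetByTitle(libraryTitle);
                    var fileInfo = new FileCreationInformation
                    {
                        // System.IO is fully qualified to avoid clashing with Microsoft.SharePoint.Client.File.
                        Content = System.IO.File.ReadAllBytes(localCsvPath),
                        Url = System.IO.Path.GetFileName(localCsvPath),
                        Overwrite = true
                    };

                    library.RootFolder.Files.Add(fileInfo);
                    ctx.ExecuteQuery();   // sends the upload to SharePoint
                }
            }
        }

    Depending on the tenant's authentication settings, SharePointOnlineCredentials (legacy auth) may need to be replaced with app-only or interactive authentication.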

    Thanks !


  4. Ben Miller (DBAduck) 956 Reputation points
    2020-10-22T14:31:14.66+00:00

    In order to get Unicode characters out of NVARCHAR columns into a CSV, you need to do one of two things:

    1. Check the Unicode box and use DT_WSTR types for your columns in the Flat File Connection Manager, or
    2. Use the 65001 code page without checking the Unicode box, and use DT_WSTR for the columns in the Flat File Connection Manager.

    This will allow the characters to be written to the CSV file. The output will be a Unicode file, and if you open it in a text editor that can read Unicode (e.g. Notepad++), you will see the special characters.
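
    If it helps, the same result (a code page 65001 / UTF-8 CSV that keeps the NVARCHAR characters intact) can also be produced outside SSIS with a few lines of C#. This is only a rough sketch; the connection string, query, column names and output path are placeholders:

        using System.Data.SqlClient;
        using System.IO;
        using System.Text;

        class NvarcharToUtf8Csv
        {
            static void Main()
            {
                // Placeholders: point these at your own server, database, query and output path.
                string connectionString = "Server=.;Database=MyDb;Integrated Security=true;";
                string query = "SELECT Column1, Column2, Column10 FROM dbo.MyTable;";
                string outputPath = @"C:\Export\data.csv";

                using (var connection = new SqlConnection(connectionString))
                using (var command = new SqlCommand(query, connection))
                // UTF-8 is code page 65001, so characters such as 'ồ' are written unchanged.
                using (var writer = new StreamWriter(outputPath, false, new UTF8Encoding(false)))
                {
                    connection.Open();
                    using (SqlDataReader reader = command.ExecuteReader())
                    {
                        while (reader.Read())
                        {
                            var fields = new string[reader.FieldCount];
                            for (int i = 0; i < reader.FieldCount; i++)
                            {
                                string value = reader.IsDBNull(i) ? "" : reader.GetValue(i).ToString();
                                // Quote each field and double embedded quotes so commas inside
                                // the data do not shift the column layout.
                                fields[i] = "\"" + value.Replace("\"", "\"\"") + "\"";
                            }
                            writer.WriteLine(string.Join(",", fields));
                        }
                    }
                }
            }
        }

    The explicit quoting is only there so commas inside the data do not break the columns; if Excel needs to auto-detect the encoding when the file is double-clicked, writing a UTF-8 byte order mark (new UTF8Encoding(true)) helps.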

