Bug in Flat File export with large datasets.

Dimitris Vayenas 21 Reputation points

Hi, I need to export the data from my SQL Server in JSON format (so that it can be imported into a graph database).

When there is little data, this is no problem; I can even run the query and copy-paste the result. But that only works when the output file is roughly less than 4 MB.

When I have large dumps of data - in some instances up to 100 MB - I use the export technique from SQL Server.

You can find the documentation in the answer here: https://stackoverflow.com/questions/952247/sql-server-truncation-and-8192-limitation. For JSON output you just have to tick the Unicode box.


Then you add your query:
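The query is a FOR JSON statement along these lines (the table name here is only an illustration, not my actual table):

```sql
-- Illustrative wizard source query; dbo.MyTable is a placeholder
SELECT * FROM dbo.MyTable
FOR JSON AUTO
```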


And finally you choose your delimiters:


The problem is that when the size of the dump is too large, the dump is corrupted. You get odd empty lines, etc. See:


or this:


At times, in very large files (over 30 MB), splits/newlines appear after the opening quotes of properties/attributes, etc.

Has anyone here come across this issue?

How can we get a quick resolution? Will posting here solve the issue in a reasonable amount of time, or do I need to open a case? Is there any other approach to exporting JSON data? Thank you!



Accepted answer
  1. Monalv-MSFT 5,896 Reputation points

    Hi @Dimitris Vayenas ,

    SQL Server Integration Services (SSIS) is widely used as an ETL tool. You can also use SSIS to export data to files such as CSV, tab-delimited, Excel, or XML. The tip linked below exports data in JSON format using SQL Server Integration Services, in two ways:

    1. In this method, we create an SSIS package with a data flow. In the data flow, we extract data using the above-mentioned T-SQL query, and the resulting JSON content is sent to a flat file destination.
    2. This method uses a script task to export data in JSON format. The script task executes the SQL query, the output of the T-SQL query is read using C#, and the result set is written to a flat file.
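    For method 1, the data flow's source query can be sketched like this (a minimal sketch; dbo.MyTable is a placeholder for your own table or query):

```sql
-- Source query for the data flow (method 1).
-- Casting to nvarchar(max) returns the JSON as a single value instead of
-- the chunked stream that FOR JSON normally produces, which helps avoid
-- truncation in the client.
SELECT CAST((SELECT * FROM dbo.MyTable FOR JSON AUTO) AS nvarchar(max)) AS JsonData;
```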

    Please refer to Using SQL Server Integration Services to Export Data in JSON Format.

    Best regards,


    If the answer is helpful, please click "Accept Answer" and upvote it.

    Note: Please follow the steps in our documentation to enable e-mail notifications if you want to receive the related email notification for this thread.

3 additional answers

  1. Erland Sommarskog 106.8K Reputation points

    Try this:

    DECLARE @json nvarchar(MAX)
    SELECT @json = (SELECT * FROM tbl FOR JSON AUTO)  -- tbl: your table/query
    SELECT @json

    I have not tested this, but there are other contexts where this workaround is needed.


  2. Dimitris Vayenas 21 Reputation points

    Thank you but...

    1. Correct the AUITO to AUTO in your post.
    2. You cannot use a declaration like that in the export package's query editor the way you do in a regular query window. You get this error:


    It would be great if you tested your advice before speculating.


  3. Erland Sommarskog 106.8K Reputation points

    OK, so try this instead:

    SELECT CAST((SELECT * FROM sys.objects FOR JSON AUTO) AS nvarchar(MAX))

    And, no, I have not tested this in the Import/Export wizard, because I never use it myself, and hardly know how to use it.

    But I know that there are issues in SqlClient that cause data returned with FOR JSON to be truncated, and you can work around that bug by converting the result to a regular nvarchar. I am not sure that you are hitting that bug, but it is worth trying.
