SQL Server Management Studio OutOfMemory on import flat file

Camille Schopmeijer 0 Reputation points
2024-11-12T16:06:44.2333333+00:00

Hello,

Over the years I have worked with large datasets (CSVs) in SSMS against a local SQL Server database (Developer edition). A week ago I worked on a particular CSV (6 GB) on my old and trusty laptop. What I always do is use the 'Import Flat File' functionality, which is what I did with that CSV. Everything was fine.

A week later, and a fancy new laptop later, I reinstalled SSMS (newest version) and set up a local SQL Server database (Developer edition). Next I tried to import the same CSV, so I chose 'Import Flat File'. When I click 'Preview data', I get 'an error occurred while running learn on the input file'. If I go back a couple of SSMS versions (the closest one to the version on my old laptop), I get an out-of-memory exception during the import.

I know SSMS is 32-bit, but this always used to work on my old laptop. When I run the same task there, SSMS (in Task Manager) uses around 100-150 MB of RAM consistently, as if it breaks the CSV into parts. When I run the same task on the same CSV on my new laptop, memory usage keeps climbing to about 2,500 MB and then SSMS crashes. Both laptops have 32 GB of RAM.

It seems like something changed in the Import Flat File process between SSMS versions. It looks like it causes an overflow on my new laptop, but not on the old one. Am I correct? Does this sound familiar? What can I do? I'm pretty sure it has nothing to do with my DB instance; it looks like it's on the SSMS side. Could somebody point me in the right direction?

Thanks in advance.

SQL Server

2 answers

  1. Bruce (SqlWork.com) 68,081 Reputation points
    2024-11-12T17:56:32.06+00:00

    This appears to be a common issue with SSMS. The common workaround is to use the bcp utility to do the file import (a minimal example is sketched below). You might also try Azure Data Studio with the SQL Server Import extension:

    https://learn.microsoft.com/en-us/azure-data-studio/extensions/sql-server-import-extension

    Note: I would not be surprised if the memory limit appeared when additional CSV import features were added.
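
    A minimal bcp sketch for the scenario in the question, assuming a destination table dbo.tbl already exists and using placeholder names for the server, database, and file path (adjust the terminators to match the actual CSV):

     bcp dbo.tbl in "E:\Temp\input.csv" -S localhost -d MyDatabase -T -c -t, -r \n -F 2 -b 50000

    Here -c loads character data, -t and -r set the field/row terminators, -F 2 skips the header row, and -b commits every 50,000 rows so the whole 6 GB file is never held in memory at once. Note that bcp does not handle quoted CSV fields; if the file has commas inside quotes, BULK INSERT with FORMAT = 'CSV' (see the other answer) is a better fit.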


  2. Yitzhak Khabinsky 26,201 Reputation points
    2024-11-12T20:49:14.66+00:00

    Hi @Camille Schopmeijer,

    Another option would be to try the BULK INSERT statement.

    Please check a couple of conceptual samples below.

    USE tempdb;
    GO
    DROP TABLE IF EXISTS dbo.tbl;
    CREATE TABLE dbo.tbl ([State] CHAR(2), City VARCHAR(30), Established DATETIME);
    -- SQL Server 2016 and earlier
    BULK INSERT dbo.tbl
    FROM 'E:\Temp\input.csv'
    WITH (
       DATAFILETYPE = 'char' -- { 'char' | 'native' | 'widechar' | 'widenative' } 
       , FIELDTERMINATOR = ','
       , ROWTERMINATOR = '\n'
       , FIRSTROW = 2 
       , LASTROW = 3 -- last row to load; the default 0 means load to the end of the file
       , CODEPAGE = '65001');
    -- test
    SELECT * FROM dbo.tbl;
    
    
    -- SQL Server 2017 onwards
    BULK INSERT dbo.tbl
    FROM 'E:\Temp\StateCity.csv'
    WITH (FORMAT='CSV' -- starting from SQL Server 2017 onwards
       , FIRSTROW = 2
       --, FIELDQUOTE = '"'
       , FIELDTERMINATOR = ','
       , ROWTERMINATOR = '\n');
    -- test
    SELECT * FROM dbo.tbl;
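
    For a CSV around the size mentioned in the question (~6 GB), a hedged variation of the sample above, assuming the same table and file path, that commits in batches rather than as one large transaction:

     BULK INSERT dbo.tbl
     FROM 'E:\Temp\StateCity.csv'
     WITH (FORMAT = 'CSV'        -- SQL Server 2017 onwards
        , FIRSTROW = 2
        , FIELDTERMINATOR = ','
        , ROWTERMINATOR = '\n'
        , BATCHSIZE = 100000     -- commit every 100,000 rows
        , TABLOCK);              -- table-level lock, typically faster for bulk loads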
    
