C# TextFieldParser System.OutOfMemoryException was thrown

MyleneChalut 136 Reputation points
2022-07-18T18:25:37.277+00:00

Hello guys!

I am trying to import a large file (17,000,000 records) into a SQL Server 2016 table using an SSIS package. Everything worked perfectly with a subset of the file, but now that I am trying to load the full file, I get a System.OutOfMemoryException error. The server should have more than enough memory!

Here is my code:

    public void Main()
    {
        var mFilepath = Dts.Variables["InputFile"].Value.ToString();
        var mSQLTable = "[Staging].[tblLoad" + Dts.Variables["LoadOption"].Value.ToString() + "]";

        string connectionString = Dts.Connections["OLEDB_CONN"].ConnectionString;
        connectionString = connectionString.Trim(';');
        var connStrDictionary = connectionString.Split(';').Select(x => x.Split('=')).ToDictionary(x => x[0], x => x[1]);
        connectionString = "Data Source=" + connStrDictionary["Data Source"] + ";Initial Catalog=" + connStrDictionary["Initial Catalog"] + ";Integrated Security=" + connStrDictionary["Integrated Security"];

        try
        {
            DataTable dt = new DataTable();

            // *** The exception is thrown in this section ***
            string contents = File.ReadAllText(mFilepath);

            TextFieldParser parser = new TextFieldParser(new StringReader(contents));

            parser.HasFieldsEnclosedInQuotes = true;
            parser.SetDelimiters(",");

            string[] fields;

            while (!parser.EndOfData)
            {
                fields = parser.ReadFields();

                if (dt.Columns.Count == 0)
                {
                    foreach (string field in fields)
                    {
                        dt.Columns.Add(new DataColumn(string.IsNullOrWhiteSpace(field.Trim('\"')) ? null : field.Trim('\"'), typeof(string)));
                    }
                }
                else
                {
                    dt.Rows.Add(fields.Select(item => string.IsNullOrWhiteSpace(item.Trim('\"')) ? null : item.Trim('\"')).ToArray());
                }
            }
            parser.Close();
            // *** End of the failing section ***

            var columnNames = new List<string>();
            using (var cn = new SqlConnection() { ConnectionString = connectionString })
            {
                using (var cmd = new SqlCommand() { Connection = cn })
                {
                    cmd.CommandText = Dts.Variables["StagingTableGetColumnsScript"].Value.ToString();

                    cn.Open();

                    var reader = cmd.ExecuteReader();

                    while (reader.Read())
                    {
                        columnNames.Add(reader.GetString(0));
                    }
                    cn.Close();
                }
            }

            using (SqlConnection con = new SqlConnection(connectionString))
            {
                using (SqlBulkCopy sqlBulkCopy = new SqlBulkCopy(con))
                {
                    sqlBulkCopy.DestinationTableName = mSQLTable;
                    sqlBulkCopy.ColumnMappings.Clear();
                    con.Open();

                    foreach (var column in columnNames)
                    {
                        sqlBulkCopy.ColumnMappings.Add(column.ToString(), column.ToString());
                    }

                    sqlBulkCopy.WriteToServer(dt);
                }

                // Perform an initial count on the destination staging table
                SqlCommand countRows = new SqlCommand(Dts.Variables["StagingTableGetRowCount"].Value.ToString(), con);
                Dts.Variables["ValidRowCount"].Value = System.Convert.ToInt32(countRows.ExecuteScalar());

                con.Close();
            }

            Dts.TaskResult = (int)ScriptResults.Success;
        }
        catch (Exception ex)
        {
            Dts.Events.FireError(0, "Something went wrong ", ex.ToString(), string.Empty, 0);
            Dts.TaskResult = (int)ScriptResults.Failure;
        }
    }

The section marked with comments in the code is where the error occurs (reading the whole file into a string and building the DataTable from it). Does anyone have an idea what the issue might be?

Thanks in advance for your help :-)
Mylene


4 answers

  1. Jaliya Udagedara 2,821 Reputation points MVP
    2022-07-19T02:38:19.093+00:00

    In this case, I would suggest reading the file line by line and processing it as you go; see the batched sketch after the snippet below.

       using (StreamReader streamReader = new StreamReader(mFilepath))
       {
           string line;
           while ((line = streamReader.ReadLine()) != null)
           {
               // process the line
           }
       }
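
    To make that concrete, here is a minimal sketch of a batched version that would slot into the question's Main() in place of the ReadAllText block, reusing mFilepath, connectionString, and mSQLTable from the existing script. Only one batch of rows is ever held in memory; the 100,000-row batch size and the Col1/Col2 column names are placeholder assumptions.

       const int batchSize = 100000;

       DataTable dt = new DataTable();
       dt.Columns.Add("Col1", typeof(string));   // hypothetical; build these from the header row in real code
       dt.Columns.Add("Col2", typeof(string));

       using (var streamReader = new StreamReader(mFilepath))
       using (var con = new SqlConnection(connectionString))
       using (var sqlBulkCopy = new SqlBulkCopy(con) { DestinationTableName = mSQLTable })
       {
           con.Open();
           streamReader.ReadLine();                  // skip the header row

           string line;
           while ((line = streamReader.ReadLine()) != null)
           {
               // Naive split; quoted fields need TextFieldParser or similar.
               dt.Rows.Add(line.Split(','));

               if (dt.Rows.Count >= batchSize)
               {
                   sqlBulkCopy.WriteToServer(dt);    // flush the batch...
                   dt.Rows.Clear();                  // ...and release it before reading on
               }
           }

           if (dt.Rows.Count > 0)
               sqlBulkCopy.WriteToServer(dt);        // flush the final partial batch
       }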
    

  2. Viorel 114.7K Reputation points
    2022-07-19T03:20:24.29+00:00

    Try replacing

       string contents = File.ReadAllText(mFilepath);
       TextFieldParser parser = new TextFieldParser(new StringReader(contents));

    with:

       TextFieldParser parser = new TextFieldParser(mFilepath);
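
    With the parser constructed directly on the file path, TextFieldParser streams the file from disk one record at a time, instead of first materializing the entire 17-million-row file as a single in-memory string the way File.ReadAllText does.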

    Also check whether SSIS has built-in features for importing flat files.


  3. AgaveJoe 27,696 Reputation points
    2022-07-19T15:23:51.267+00:00

    I'm not sure why you are manually reading the CSV file. Import the data using standard database tools. For example, use the import feature in SSMS if this is a one-off import. For ETL processes, use Integration Services, BCP, or BULK INSERT.

    I use BCP a lot and I believe the SSMS import chunks the data.

    Otherwise, you might have to chunk the data yourself.

    Reference docs.
    https://learn.microsoft.com/en-us/sql/relational-databases/import-export/bulk-import-and-export-of-data-sql-server?view=sql-server-ver16
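
    For illustration, a hedged sketch of the BULK INSERT route driven from the same script task, reusing mFilepath, mSQLTable, and connectionString from the question. The FIELDTERMINATOR/FIRSTROW options are assumptions about the file layout, and the path must be readable from the SQL Server machine (not the SSIS host). Quoted CSV fields would need a format file, or FORMAT = 'CSV' on SQL Server 2017+, which is not available on the asker's 2016 instance.

       string bulkInsertSql =
           "BULK INSERT " + mSQLTable +
           " FROM '" + mFilepath + "'" +
           " WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\\n', FIRSTROW = 2, TABLOCK);";

       using (var con = new SqlConnection(connectionString))
       using (var cmd = new SqlCommand(bulkInsertSql, con))
       {
           cmd.CommandTimeout = 0;   // a 17M-row load can easily exceed the default timeout
           con.Open();
           cmd.ExecuteNonQuery();    // SQL Server streams the file itself; nothing is buffered in the package
       }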


  4. Bruce (SqlWork.com) 61,731 Reputation points
    2022-07-19T22:03:37.073+00:00

    Your issue is that you are reading all of the CSV rows into the DataTable and then trying to insert the whole DataTable at once. You should be using a data reader rather than a DataTable, so rows stream through SqlBulkCopy instead of accumulating in memory; see the sketch below.

    Also, a bulk load effectively degrades to ordinary inserts if the table has indexes. If the table has indexes, you should look at dropping the indexes, loading the data, and then recreating the indexes.
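
    A minimal sketch of that data-reader approach: it wraps the question's TextFieldParser in a hand-rolled IDataReader (the CsvDataReader name is made up for this example) so SqlBulkCopy pulls rows straight from disk. It rests on the assumption that, for a plain load, SqlBulkCopy only calls Read, FieldCount, GetValue, GetOrdinal (for name-based column mappings), and Close/Dispose; every other interface member is stubbed to throw.

       using System;
       using System.Data;
       using Microsoft.VisualBasic.FileIO;

       // Streams CSV records one at a time; only the current row is held in memory.
       public sealed class CsvDataReader : IDataReader
       {
           private readonly TextFieldParser _parser;
           private readonly string[] _columns;
           private string[] _current;

           public CsvDataReader(string path)
           {
               _parser = new TextFieldParser(path) { HasFieldsEnclosedInQuotes = true };
               _parser.SetDelimiters(",");
               _columns = _parser.ReadFields();   // the header row supplies the column names
           }

           public int FieldCount { get { return _columns.Length; } }

           public bool Read()
           {
               if (_parser.EndOfData) return false;
               _current = _parser.ReadFields();
               return true;
           }

           // Map empty fields to DBNull, mirroring the question's trim/null logic.
           public object GetValue(int i)
           {
               string s = _current[i].Trim('\"');
               return string.IsNullOrWhiteSpace(s) ? (object)DBNull.Value : s;
           }

           public int GetOrdinal(string name) { return Array.IndexOf(_columns, name); }
           public string GetName(int i) { return _columns[i]; }
           public void Close() { _parser.Close(); }
           public void Dispose() { _parser.Close(); }
           public bool IsClosed { get { return false; } }   // simplified for the sketch
           public int Depth { get { return 0; } }
           public int RecordsAffected { get { return -1; } }
           public bool NextResult() { return false; }

           // Stubs: assumed not to be needed for a straight SqlBulkCopy load; if the
           // load throws NotSupportedException, the missing member must be filled in.
           public DataTable GetSchemaTable() { throw new NotSupportedException(); }
           public bool GetBoolean(int i) { throw new NotSupportedException(); }
           public byte GetByte(int i) { throw new NotSupportedException(); }
           public long GetBytes(int i, long fo, byte[] b, int o, int len) { throw new NotSupportedException(); }
           public char GetChar(int i) { throw new NotSupportedException(); }
           public long GetChars(int i, long fo, char[] b, int o, int len) { throw new NotSupportedException(); }
           public IDataReader GetData(int i) { throw new NotSupportedException(); }
           public string GetDataTypeName(int i) { throw new NotSupportedException(); }
           public DateTime GetDateTime(int i) { throw new NotSupportedException(); }
           public decimal GetDecimal(int i) { throw new NotSupportedException(); }
           public double GetDouble(int i) { throw new NotSupportedException(); }
           public Type GetFieldType(int i) { return typeof(string); }
           public float GetFloat(int i) { throw new NotSupportedException(); }
           public Guid GetGuid(int i) { throw new NotSupportedException(); }
           public short GetInt16(int i) { throw new NotSupportedException(); }
           public int GetInt32(int i) { throw new NotSupportedException(); }
           public long GetInt64(int i) { throw new NotSupportedException(); }
           public string GetString(int i) { return _current[i]; }
           public int GetValues(object[] values) { throw new NotSupportedException(); }
           public bool IsDBNull(int i) { return GetValue(i) == DBNull.Value; }
           public object this[int i] { get { return GetValue(i); } }
           public object this[string name] { get { return GetValue(GetOrdinal(name)); } }
       }

    With that in place, the question's parse-then-load section collapses to the following, reusing connectionString, mFilepath, mSQLTable, and columnNames from the existing script:

       using (var con = new SqlConnection(connectionString))
       using (var reader = new CsvDataReader(mFilepath))
       using (var bulk = new SqlBulkCopy(con) { DestinationTableName = mSQLTable })
       {
           con.Open();
           bulk.BulkCopyTimeout = 0;
           foreach (var column in columnNames)
               bulk.ColumnMappings.Add(column, column);
           bulk.WriteToServer(reader);   // rows stream from disk to SQL Server
       }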