SSIS "Execute Package task" in a "for each file" loop failed when loading large number of small files

Cédric D 0 Reputation points
2023-05-05T08:30:40.8033333+00:00

Hello

I have an SSIS package with an "Execute Package Task" (ExecuteOutOfProcess = False) inside a Foreach Loop container, so the task is executed once for each file in a specific folder.

The child package is very simple: a single Data Flow Task (DFT) that loads data from a flat file (.csv) into a SQL Server table.

It runs correctly with a limited number of files, but now I have to load more than 2,000 files (each with fewer than 1,000 rows and under 1 MB), and I get the following messages after around 500 files have been processed:

Message 1: The buffer manager failed a memory allocation call for 76480000 bytes, but was unable to swap out any buffers to relieve memory pressure. 3 buffers were considered and 3 were locked. Either not enough memory is available to the pipeline because not enough are installed, other processes were using it, or too many buffers are locked.

Message 2: Buffer manager allocated 0 megabyte(s) in 0 physical buffer(s).

Message 3: The system reports 64 percent memory load. There are 68718374912 bytes of physical memory with 24701116416 bytes free. There are 2147352576 bytes of virtual memory with 49799168 bytes free. The paging file has 77308309504 bytes with 33875689472 bytes free.
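
Doing the arithmetic on Message 3 (this interpretation is mine, not part of the logs): the reported virtual memory total is almost exactly 2 GiB, which looks like the address space of a 32-bit process, and the free portion is smaller than the 76480000-byte allocation that failed in Message 1:

```python
# My reading of the figures from Messages 1 and 3: physical RAM is
# plentiful, but the process's ~2 GiB virtual address space (typical
# of a 32-bit process) is nearly exhausted, and the failed request
# no longer fits in what is left.
physical_total = 68_718_374_912   # ~64 GiB installed
physical_free  = 24_701_116_416   # ~23 GiB still free
virtual_total  = 2_147_352_576    # ~2 GiB total: 32-bit address space
virtual_free   = 49_799_168       # ~47 MiB left in the process
requested      = 76_480_000       # failed allocation from Message 1

print(f"physical free: {physical_free / 2**30:.1f} GiB of {physical_total / 2**30:.1f} GiB")
print(f"virtual free:  {virtual_free / 2**20:.1f} MiB of {virtual_total / 2**30:.2f} GiB")
print(f"requested:     {requested / 2**20:.1f} MiB, does not fit in virtual free")
```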

It seems like SSIS is not releasing virtual memory after each file is processed/loaded, and once that limit is reached, the package fails because of insufficient resources. I cannot change the resource allocation because the server is shared.
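
To check this, one could poll the memory of the SSIS debug host while the loop runs; a minimal sketch, assuming the third-party psutil package and that the child packages run inside Visual Studio's DtsDebugHost.exe:

```python
# Minimal sketch: poll the virtual memory size of the SSIS debug host
# while the loop runs, to see whether it really climbs file after file.
# Assumes the third-party psutil package; DtsDebugHost.exe is the host
# process Visual Studio uses when debugging SSIS packages.
import time
import psutil

while True:
    for proc in psutil.process_iter(["name"]):
        if proc.info["name"] == "DtsDebugHost.exe":
            try:
                vms = proc.memory_info().vms      # virtual memory size
                print(f"pid {proc.pid}: {vms / 2**20:8.1f} MiB virtual")
            except psutil.NoSuchProcess:
                pass                              # host exited mid-poll
    time.sleep(10)
```

A figure that climbs steadily toward 2 GiB and never falls back between files would support the theory that buffers are not released between iterations.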

The first thing I tried was to execute the package task with ExecuteOutOfProcess = True, but I get lots of other errors (connection, logging, etc.) that are very difficult to identify because debugging is no longer possible when the package is started from Visual Studio. Besides, I'm not sure parallel processing is a good idea, as all the files are loaded into the same SQL table.

The child package seems to be executed under a different user (?) that isn't allowed to use one of the connection managers (CSV file or SQL database). I also have the feeling that parameter binding no longer works correctly.

Message: Executing the query "" failed with the following error: "Unable to acquire a managed connection.". Possible failure reasons: Problems with the query, "ResultSet" property not set correctly, parameters not set correctly, or connection not established correctly.
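
An alternative I'm considering, to get a fresh process per file without debugging through Visual Studio, is to drive the child package with dtexec from a small script; a minimal sketch, where the package path, folder, and User::FileName variable are placeholders rather than my real setup:

```python
# Minimal sketch: run the child package once per file in its own
# dtexec process, so all buffer memory is returned to the OS after
# each load. CHILD_PACKAGE, SOURCE_FOLDER and User::FileName are
# placeholder names for illustration only.
import subprocess
from pathlib import Path

CHILD_PACKAGE = r"C:\ssis\LoadOneCsv.dtsx"    # placeholder path
SOURCE_FOLDER = Path(r"C:\data\incoming")     # placeholder folder

for csv_file in sorted(SOURCE_FOLDER.glob("*.csv")):
    # /SET pushes the current file path into a package variable that
    # the flat-file connection manager's expression would read.
    result = subprocess.run(
        ["dtexec", "/F", CHILD_PACKAGE,
         "/SET", rf"\Package.Variables[User::FileName].Value;{csv_file}"],
        capture_output=True, text=True,
    )
    if result.returncode != 0:    # dtexec returns non-zero on failure
        print(f"Load failed for {csv_file}:\n{result.stdout}")
        break
```

Since the files would go one at a time into the same table, this would also sidestep my concern about parallel loads.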

I also played with AutoAdjustBufferSize = True and False, and reduced DefaultBufferSize to its minimum of 1048576 bytes (1 MB). That changes the number of files that can be loaded before the failure, but the package still fails at some point with the same error.

Could anyone help me with this, please?

SQL Server Integration Services

1 answer

  1. ZoeHui-MSFT 35,556 Reputation points
    2023-05-08T06:13:02.2566667+00:00

    Hi @Cédric D,

    From the error message, the issue may be related to memory.

    Other than DefaultBufferSize and DefaultBufferMaxRows, BLOBTempStoragePath and BufferTempStoragePath are two more properties on the Data Flow Task. They define where on the hard disk(s) buffer data is paged to: BLOB data (text, ntext, image) is written to BLOBTempStoragePath, while all other buffer data is swapped to BufferTempStoragePath.

    For better performance, we recommend that both BLOBTempStoragePath and BufferTempStoragePath point to fast drives.
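
    For example (a sketch only; the package path, the task name "Data Flow Task", and the D:\ssis_spill folder are assumptions you would replace with your own), both properties can be overridden at run time with dtexec /SET:

    ```python
    # Sketch: point both spill locations at a fast local drive via
    # dtexec /SET. The package path, "Data Flow Task" and D:\ssis_spill
    # are assumed names; substitute your own task name and folder.
    import subprocess

    subprocess.run(
        ["dtexec", "/F", r"C:\ssis\LoadOneCsv.dtsx",
         "/SET", r"\Package\Data Flow Task.Properties[BufferTempStoragePath];D:\ssis_spill",
         "/SET", r"\Package\Data Flow Task.Properties[BLOBTempStoragePath];D:\ssis_spill"],
        check=True,
    )
    ```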

    Reference: sql-server-integration-services-ssis-performance-best-practices

    Regards,

    Zoe Hui


    If the answer is helpful, please click "Accept Answer" and upvote it.
