Package fails with memory error if AutoAdjustBufferSize is True but works if False

Nick Ryan 221 Reputation points
2021-01-19T19:35:25.393+00:00

This one is made all the more strange by the fact that the same package loading the same data works on our OAT server.

SQL Servers are at the same patch level: Microsoft SQL Server 2016 (SP2-CU15) (KB4577775) - 13.0.5850.14 (X64) Sep 17 2020 22:12:45 Copyright (c) Microsoft Corporation Enterprise Edition: Core-based Licensing (64-bit) on Windows Server 2012 R2 Standard 6.3 <X64> (Build 9600: ) (Hypervisor)

The dataflow that fails reads somewhere between 2.5 million and 4 million rows from a staging table and writes to a columnstore fact table. There are a number of fully cached lookups, but the dimension tables are a trivial size.

The SELECT from the staging table returns 61 columns, and I've removed all columns that are not used in the dataflow. I'm not sure how I'd tell the average size of a row.
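
One way I could presumably estimate the average row width is sys.dm_db_index_physical_stats in SAMPLED or DETAILED mode, which reports avg_record_size_in_bytes (dbo.StagingTable below is a placeholder for the real staging table name):

```sql
-- Rough average row width of the staging table, in bytes.
-- avg_record_size_in_bytes is NULL in LIMITED mode, so use SAMPLED or DETAILED.
SELECT index_id,
       avg_record_size_in_bytes,
       record_count
FROM sys.dm_db_index_physical_stats(
         DB_ID(),                          -- current database
         OBJECT_ID(N'dbo.StagingTable'),   -- placeholder table name
         NULL, NULL,
         'SAMPLED');
```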

If I change AutoAdjustBufferSize back to False, it works fine.

DefaultBufferMaxRows is 1048576 and DefaultBufferSize is 10485760. The OLE DB Destination has Rows per batch left blank and Maximum insert commit size set to 0. I made all of these settings after reading Niko Neugebauer's blog on columnstore indices, in order to get the rowgroups as full as possible.
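
For reference, my understanding from the Data Flow Performance Features documentation is that with AutoAdjustBufferSize set to True the engine sizes each buffer to hold DefaultBufferMaxRows rows and ignores DefaultBufferSize, so memory per buffer scales with the row width. A rough back-of-envelope, using a purely hypothetical average row width of 200 bytes:

```sql
-- Back-of-envelope buffer sizing with AutoAdjustBufferSize = True.
-- 200 bytes/row is a made-up figure; substitute the real average row width.
DECLARE @DefaultBufferMaxRows bigint = 1048576;
DECLARE @AvgRowSizeBytes      bigint = 200;

SELECT (@DefaultBufferMaxRows * @AvgRowSizeBytes) / (1024.0 * 1024.0) AS BufferSizeMB;
-- ~200 MB per buffer, and the data flow keeps several buffers in flight at once,
-- on top of the fully cached lookups.
```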

SSIS servers have identical specs and versions and 32GB of RAM. There was nothing else running on either SSIS server at the same time as the failed job.


5 answers

  1. Yitzhak Khabinsky 25,731 Reputation points
    2021-01-19T20:16:39.853+00:00

    Are you running SSIS packages in 64-bit mode?
    Additionally, RAM could be fragmented.

    There is a useful RAMMap utility by Microsoft to analyze what is going on with memory.
    Download link: rammap

    • You can install it on both servers and compare what is going on with memory.
    • RAMMap also lets you empty memory via its Empty menu entries.

    UPDATE
    (1) You can try to use SSIS Balanced Data Distributor: ssis-balanced-data-distributor-overview

    [Image: The OLE DB Destination]

    (2) What OLE DB provider are you using?
    It is better to use the newer Microsoft OLE DB Driver for SQL Server (provider name MSOLEDBSQL).
    It has been available since 2018; everything else is deprecated.
    https://learn.microsoft.com/en-us/sql/connect/oledb/oledb-driver-for-sql-server?view=sql-server-2017
    "...
    Important
    The previous Microsoft OLE DB Provider for SQL Server (SQLOLEDB) and SQL Server Native Client OLE DB provider (SQLNCLI) remains deprecated and it is not recommended to use either for new development work..."
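
    A quick way to sanity-check whether the MSOLEDBSQL provider is registered on a given SQL Server machine is an ad-hoc OPENROWSET test. Note this is only a sketch: it reports on the box the query runs on (not a separate SSIS server), it needs "Ad Hoc Distributed Queries" enabled, and the server name below is a placeholder.

    ```sql
    -- Requires: EXEC sp_configure 'Ad Hoc Distributed Queries', 1; RECONFIGURE;
    -- MyServer is a placeholder; the query fails if MSOLEDBSQL is not registered.
    SELECT *
    FROM OPENROWSET('MSOLEDBSQL',
                    'Server=MyServer;Database=master;Trusted_Connection=yes;',
                    'SELECT @@VERSION AS SqlVersion');
    ```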


  2. Nick Ryan 221 Reputation points
    2021-01-20T03:10:09.267+00:00

    Thanks for those thoughts. Unfortunately I don't have access to the Production server so I'd have to leave it up to the DBAs to do that kind of research.

    I was interested in your question about whether I run in 64-bit mode or not. I don't, because the staging table is loaded from Teradata using an ODBC driver and we only have the 32-bit driver installed on the production server. Getting new software installed is possible, but it's a lengthy process.

    So, what I could do is split the package into two new ones: load staging in the first, running as 32-bit, and run the second part that's causing the problem, loading the Fact, as 64-bit.

    But please excuse my ignorance: would running as 64-bit make better use of the available memory, and is that why you asked?


  3. Monalv-MSFT 5,896 Reputation points
    2021-01-20T06:39:21.237+00:00

    Hi @Nick Ryan ,

    1. For AutoAdjustBufferSize, DefaultBufferMaxRows and DefaultBufferSize, please refer to Data Flow Performance Features.

    2. "We only have the 32-bit driver installed on the production server. Load staging in the first running as 32-bit and the second part that's causing the problem loading the Fact as 64-bit."

    Please install the 64-bit driver if you want to run the job in 64-bit mode.

    Best Regards,
    Mona


    If the answer is helpful, please click "Accept Answer" and upvote it.
    Note: Please follow the steps in our documentation to enable e-mail notifications if you want to receive the related email notification for this thread.


  4. Nick Ryan 221 Reputation points
    2021-01-20T06:46:21.723+00:00

    When I say I only have the 32-bit driver installed, I'm referring to the ODBC driver for Teradata. We have 64-bit drivers for SQL Server. That's why I think the answer will be to split the package in two, so I can run the insert into my SQL Server data warehouse with the package running in 64-bit.

    I will also look at Yitzhak's suggestions above but I think the answer will be to run in 64-bit for the insert part that's failing.

    I don't know what version of the OLE DB driver we have installed. I expect it is current, as our DBA team is usually on to such things.


  5. Nick Ryan 221 Reputation points
    2021-01-20T19:21:44.66+00:00

    Prod was running in 32-bit but OAT was running in 64-bit, hence the difference in behaviour. The solution is to install the 64-bit ODBC driver for Teradata and change the package to run in 64-bit.
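
    For anyone who finds this later: if the packages are deployed to the SSIS catalog, the bitness can also be forced per execution with catalog.create_execution (the folder, project and package names below are placeholders):

    ```sql
    DECLARE @execution_id bigint;

    EXEC SSISDB.catalog.create_execution
         @folder_name     = N'DW',             -- placeholder
         @project_name    = N'FactLoads',      -- placeholder
         @package_name    = N'LoadFact.dtsx',  -- placeholder
         @use32bitruntime = 0,                 -- 0 = run with the 64-bit runtime
         @execution_id    = @execution_id OUTPUT;

    EXEC SSISDB.catalog.start_execution @execution_id = @execution_id;
    ```

    With AutoAdjustBufferSize = True and DefaultBufferMaxRows at 1048576, each buffer is sized to hold a million-plus rows of a 61-column record, so the 32-bit run presumably exhausted its ~2 GB of virtual address space while the 64-bit OAT run had headroom.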