SSIS ODBC Denodo error to SQL Server OLE DB - please assist

Moodley, V. (Vilen) 1 Reputation point
2021-06-29T16:29:40.997+00:00

Hi

I currently have VS 2017 (SSDT) set up with both the 32-bit and 64-bit ODBC drivers, as we are sourcing data via Denodo from an Oracle environment.

I have an ODBC connection manager set up, and my data flow uses an ODBC source and an OLE DB destination.

The source returns 200 million rows of data across 6 columns. Please note I'm also using a VPN connection.

It runs until about 7 million rows (sometimes fewer) and then fails with this error:

[SSIS.Pipeline] Error: SSIS Error Code DTS_E_PRIMEOUTPUTFAILED. The PrimeOutput method on ODBC Source returned error code 0x80004005. The component returned a failure code when the pipeline engine called PrimeOutput(). The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing. There may be error messages posted before this with more information about the failure.

[SSIS.Pipeline] Information: The buffer manager detected that the system was low on virtual memory, but was unable to swap out any buffers. 12 buffers were considered and 12 were locked. Either not enough memory is available to the pipeline because not enough is installed, other processes are using it, or too many buffers are locked.

Please advise on how to resolve this.

Thanks


3 answers

  1. ZoeHui-MSFT 35,556 Reputation points
    2021-06-30T05:57:22.75+00:00

    Hi @Moodley, V. (Vilen)

    The error message is too general for us to narrow down the issue.

    You said that you have installed both the 32-bit and 64-bit ODBC drivers.

    Could you try using an OLE DB connection over ODBC, as described here?

    https://support.esri.com/en/technical-article/000002592

    The Information message is a warning about low memory.

    Please refer to KB977190 for a workaround.
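
    To narrow down the underlying error, one option is to reproduce the pull outside SSIS. Below is a minimal Python/pyodbc sketch, not a confirmed fix: the DSN name, credentials, column names, and source view are placeholders for your own. Streaming the full result set this way should surface the real ODBC error directly (for example a driver or VPN disconnect) instead of the generic PrimeOutput failure:

    ```python
    # Minimal sketch: stream the whole result set through the Denodo ODBC
    # driver outside SSIS so the underlying ODBC error surfaces directly.
    # "DenodoDSN", the credentials, and "source_view" are placeholders.
    import pyodbc

    conn = pyodbc.connect("DSN=DenodoDSN;UID=myuser;PWD=mypassword", timeout=30)
    cursor = conn.cursor()
    cursor.execute("SELECT col1, col2, col3, col4, col5, col6 FROM source_view")

    rows_read = 0
    while True:
        batch = cursor.fetchmany(50_000)  # fetch in chunks; memory stays flat
        if not batch:
            break
        rows_read += len(batch)
        if rows_read % 1_000_000 == 0:
            print(f"{rows_read:,} rows read so far")

    print(f"Done: {rows_read:,} rows read without error")
    conn.close()
    ```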

    Regards,

    Zoe


    If the answer is helpful, please click "Accept Answer" and upvote it.



  2. Olaf Helper 43,246 Reputation points
    2021-06-30T06:17:52.087+00:00

    > Either not enough memory is available to the pipeline

    In the data flow task => OLE DB destination, change the properties "Rows per batch" and "Maximum insert commit size" to smaller values to reduce memory consumption.
    See OLE DB Destination => "Fast Load Options".
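
    For illustration only, the same "small batch, small commit" idea looks like the following outside SSIS, as a Python/pyodbc sketch in which the connection strings, table, and column names are hypothetical. At most one batch of rows is held in memory at a time, and each batch is committed on its own, which is what smaller "Rows per batch" / commit size values achieve in the OLE DB destination:

    ```python
    # Illustrative sketch only; DSN, server, and table names are placeholders.
    # Read from the Denodo ODBC source and commit to SQL Server in small
    # batches, so at most BATCH_SIZE rows are ever buffered at once.
    import pyodbc

    BATCH_SIZE = 10_000  # analogue of "Rows per batch" / commit size

    src = pyodbc.connect("DSN=DenodoDSN;UID=myuser;PWD=mypassword")
    dst = pyodbc.connect(
        "DRIVER={ODBC Driver 17 for SQL Server};SERVER=myserver;"
        "DATABASE=mydb;Trusted_Connection=yes"
    )

    read = src.cursor()
    write = dst.cursor()
    write.fast_executemany = True  # pyodbc's bulk parameter binding

    read.execute("SELECT col1, col2, col3, col4, col5, col6 FROM source_view")
    while True:
        rows = read.fetchmany(BATCH_SIZE)
        if not rows:
            break
        write.executemany(
            "INSERT INTO dbo.target_table (col1, col2, col3, col4, col5, col6) "
            "VALUES (?, ?, ?, ?, ?, ?)",
            [tuple(r) for r in rows],
        )
        dst.commit()  # per-batch commit keeps the transaction small

    src.close()
    dst.close()
    ```

    In SSIS itself the equivalent knobs are the destination's fast-load properties together with the data flow task's DefaultBufferMaxRows and DefaultBufferSize properties.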


  3. Moodley, V. (Vilen) 1 Reputation point
    2021-06-30T07:52:20.123+00:00

    @Olaf Helper
    Thanks, I shall try this.

    I am moving 400 million rows (6 columns) of data, and I am currently using an i7 with 8 GB of RAM.
    What would be appropriate values for the commit size and rows per batch in this case?
    Also, would a Balanced Data Distributor help?

    Thanks