Copy large volume data from a Composite Provider to Azure Data Factory - delta load required, but no delta option in Open Hub, Data Transfer Process

Andreea-Gabriela Zota


I am an SAP BI Consultant at PricewaterhouseCoopers.

We are trying to load data from a CompositeProvider to Azure Data Factory through an Open Hub Destination, copying over 2 million records, so a delta load is necessary. However, when the CompositeProvider is used as the source template, the Data Transfer Process (DTP) offers only the 'Full' extraction mode.

Do you have any thoughts on how we should proceed? There is no material online that covers our scenario with a large CompositeProvider; the examples we have found use only an InfoCube as the source.

Best Regards,
Andreea Zota

Tags:

SAP HANA on Azure Large Instances
Microsoft branding terminology for an Azure offer to run HANA instances on SAP HANA hardware deployed in Large Instance stamps in different Azure regions.

Azure Data Factory
An Azure service for ingesting, preparing, and transforming data at scale.

1 answer

  1. MartinJaffer-MSFT

    I think I found the root cause. Delta for CompositeProviders is supported only under specific conditions; this is a restriction on the SAP side that prevents delta extraction from the CompositeProvider:

    In delta mode, the DTP supports CompositeProviders (object type HCPR) as sources if they are made up entirely of InfoCubes or of DataStore objects (advanced) that are modeled like standard InfoCubes or standard DataStore objects (classic). The target of the DTP must be a DataStore object (advanced). A combination of InfoCubes and DataStore objects (advanced) as PartProviders of a CompositeProvider is not supported for DTPs in delta mode.


    I found a blog detailing workarounds for various CompositeProvider limitations; I am unsure whether its item number 2 applies to your scenario.
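
    If you can restructure the flow so the data lands in a supported target (for example, a DataStore object (advanced) exposed through an Open Hub Destination that writes to a database table), the increment can also be handled on the Azure side: the Data Factory SAP BW Open Hub connector can filter by Open Hub request ID, so each run copies only requests newer than the last one processed. A minimal copy-activity source sketch; the parameter name `lastCopiedRequestId` is a placeholder you would supply yourself, typically from a Lookup activity that tracks the high-water mark:

    ```json
    {
      "source": {
        "type": "SapOpenHubSource",
        "excludeLastRequest": true,
        "baseRequestId": "@{pipeline().parameters.lastCopiedRequestId}"
      }
    }
    ```

    `excludeLastRequest` skips the most recent request in case the DTP is still writing it; `baseRequestId` makes the connector retrieve only requests with a larger ID. After each successful run, store the maximum request ID from the copied data and pass it in on the next run. This is a sketch of the connector's incremental option, not a substitute for meeting the SAP-side delta conditions quoted above.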