How to connect an SAP SLT system hosted on GCP with Azure Data Factory

Ganesan, Jothy 100 Reputation points
2024-06-19T09:43:28.4866667+00:00

We have an SAP SLT box hosted on GCP, and the data needs to be moved into an Azure Lakehouse architecture. Can this be connected using ADF via a SHIR setup? Will the SHIR be able to connect the SAP SLT system hosted on GCP with Azure?

Azure Data Factory

Accepted answer
  Harishga 5,420 Reputation points Microsoft Vendor
    2024-06-19T11:31:51.13+00:00

    Hi @Ganesan, Jothy
    To move data from an SAP SLT system hosted on Google Cloud Platform (GCP) into an Azure Lakehouse architecture using Azure Data Factory (ADF) with a self-hosted integration runtime (SHIR), you can follow these steps:

    • Create and configure a SHIR in Azure Data Factory Studio.
    • Download and install the latest version of the SHIR software on a virtual machine or on-premises computer, ideally one with low-latency network access to the SAP SLT system. Ensure the machine has sufficient CPU cores to handle the data extraction throughput.
    • Download and install the latest 64-bit SAP .NET Connector (SAP NCo 3.0) on the VM or computer running the SHIR. During installation, select “Install assemblies to GAC” to ensure the connector is properly registered.
    • Configure a network security rule so the SAP system accepts connections from the SHIR machine. Because the SAP SLT system runs on GCP, this is a GCP VPC firewall rule: set the source IP addresses/CIDR ranges to the SHIR machine’s IP address and the destination ports to the SAP dispatcher and gateway ports (32NN and 33NN, where NN is the SAP instance number; e.g., 3200 and 3300 for instance 00). A combined sketch of this and the next two steps follows the list.
    • From the SHIR machine, test the connection to your SAP system with PowerShell’s Test-NetConnection cmdlet and the appropriate port.
    • On the SHIR machine, edit the C:\Windows\System32\drivers\etc\hosts file to map the SAP system’s IP address to its hostname.
    • Set up ADF to use the SHIR for data movement tasks. Ensure that your Data Factory is in the same region as the SHIR if you want to use a shared SHIR from another Data Factory.
    • Utilize the SAP Change Data Capture (CDC) connector in ADF to extract data from the SAP system. The SAP CDC connector uses the SAP Operational Data Provisioning (ODP) framework, for which SAP SLT is a supported data provider, to identify new and changed data efficiently.
    • Run the ADF copy activity with the SAP ODP connector on the SHIR to extract the raw SAP data (a linked-service sketch follows the summary below).
    • Load the data into Azure Data Lake Storage Gen2 in CSV or Parquet format, which can then be used in the Azure Lakehouse architecture.
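
    The following is a minimal sketch of the firewall, connectivity-test, and hosts-file steps above, run from an elevated PowerShell prompt on the SHIR machine (with the Google Cloud SDK installed for the gcloud command). Every name and address in it (allow-shir-to-sap, sap-vpc, sap-slt, 203.0.113.10, 10.128.0.10, sapslt01) is an illustrative placeholder, and the ports assume SAP instance number 00. Here 10.128.0.10 stands for whatever address the SHIR machine uses to reach the SAP host: its private IP over a VPN/Interconnect, its public IP otherwise.

```powershell
# --- 1. GCP VPC firewall rule allowing the SHIR machine in (placeholder names) ---
gcloud compute firewall-rules create allow-shir-to-sap `
    --network=sap-vpc `
    --direction=INGRESS `
    --allow=tcp:3200,tcp:3300 `
    --source-ranges=203.0.113.10/32 `
    --target-tags=sap-slt

# --- 2. Verify the SHIR machine can reach the SAP dispatcher and gateway ports ---
Test-NetConnection -ComputerName 10.128.0.10 -Port 3200   # dispatcher (32NN)
Test-NetConnection -ComputerName 10.128.0.10 -Port 3300   # gateway (33NN)

# --- 3. Map the SAP system's IP address to its hostname in the local hosts file ---
# The hosts file is write-protected, so this needs an elevated prompt.
Add-Content -Path "$env:SystemRoot\System32\drivers\etc\hosts" `
            -Value "10.128.0.10`tsapslt01"
```

    A TcpTestSucceeded value of True in the Test-NetConnection output confirms the network path is open.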

    By following these steps, the SHIR will be able to connect the SAP SLT system hosted on GCP with Azure, allowing efficient transfer of data into the Azure Lakehouse architecture. Ensure that all security and compliance requirements are met during the setup and data transfer process.
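
    To make the connector setup concrete, here is a sketch that registers an SAP ODP linked service bound to the SHIR and then triggers a pipeline run, assuming the Az.DataFactory PowerShell module and a signed-in session (Connect-AzAccount). The resource names (my-rg, my-adf, MySelfHostedIR, CopySapSltToAdls) and the typeProperties values are illustrative placeholders; verify the exact linked-service property names against the SAP CDC connector article referenced below.

```powershell
# Illustrative linked-service definition for the SAP ODP (CDC) connector.
# All names and credentials below are placeholders, not real values.
$definition = @'
{
    "name": "SapOdpLinkedService",
    "properties": {
        "type": "SapOdp",
        "connectVia": {
            "referenceName": "MySelfHostedIR",
            "type": "IntegrationRuntimeReference"
        },
        "typeProperties": {
            "server": "sapslt01",
            "systemNumber": "00",
            "clientId": "100",
            "userName": "EXTRACT_USER",
            "password": { "type": "SecureString", "value": "<password>" },
            "subscriberName": "ADF_SUBSCRIBER"
        }
    }
}
'@
Set-Content -Path .\SapOdpLinkedService.json -Value $definition

# Register the linked service in the data factory.
Set-AzDataFactoryV2LinkedService -ResourceGroupName "my-rg" `
    -DataFactoryName "my-adf" `
    -Name "SapOdpLinkedService" `
    -DefinitionFile ".\SapOdpLinkedService.json"

# Once a pipeline containing the copy activity exists, trigger an extraction run:
Invoke-AzDataFactoryV2Pipeline -ResourceGroupName "my-rg" `
    -DataFactoryName "my-adf" `
    -PipelineName "CopySapSltToAdls"
```

    In production, store the password in Azure Key Vault rather than inline, and point the copy activity’s sink at an ADLS Gen2 dataset in Parquet format.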

    References:
    https://learn.microsoft.com/en-us/azure/data-factory/connector-sap-change-data-capture
    https://techcommunity.microsoft.com/t5/running-sap-applications-on-the/extracting-sap-data-using-the-cdc-connector/ba-p/3644882

    Hope this helps. Do let us know if you have any further queries.



