Customer case study: SAP integration with Dataverse

Overview

This document describes a proven integration pattern for ingesting data from SAP systems into Dataverse.

Business context

The solution is a financial planning and control management system developed in partnership with a Fortune 500 company. Functionality includes a central hub experience to establish budgets, control spend, and measure spend performance. It also enables employees to create Purchase Requisitions (PRs).

The Hub experience is built with Microsoft Power Platform, backed by data hosted in Dataverse.

User profile data is pulled from the SAP ERP Central Component (ECC) to pre-populate fields in new PRs.

Both budget and actual spending to date are displayed in PRs, helping users keep expenses in line with budgets. Expense data is pulled from SAP Central Finance (cFIN).

A batch process loads SAP ECC and cFIN data (hosted on S/4HANA) into Dataverse on a daily basis.

Integration options

The team evaluated two options for loading the SAP data into Dataverse:

  1. Azure Data Factory (ADF) SAP connectors: Azure Data Factory and Azure Synapse Analytics pipelines provide several SAP connectors that support a wide variety of data extraction scenarios from SAP.
  2. SAP Data Services: An ETL (extract, transform, load) tool built within the SAP ecosystem.

SAP Data Services was chosen for the following reasons:

  1. The client has existing SAP BusinessObjects Data Services (BODS) expertise, which helps with extending BODS to integrate with Azure data services.
  2. The Azure Data Factory SAP connectors are new and were not yet supported by the client's SAP team.
  3. BODS is already configured on the four SAP ECC instances and the S/4HANA instance.
  4. BODS transforms data at the source, reducing both the volume of data transferred and the downstream processing required.

Integration with SAP BODS

The team defined a "Common Data Model" during the design phase. The model describes the BODS file name pattern, including the file path to use in Blob Storage, and defines the CSV format, including data types.

Example of a Common Data Model

Interface Name: Supplier Reference

Common Data Model (agreed fields):
Code
Display Name
Supplier Number
Primary Address Street1
Primary Address City
Primary Address State
Primary Address Postal Code
Active Status
Preferred Currency

Underlying SAP Table: Vendor Master - LFA1
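As an illustration, the agreed field list can be expressed as a small schema check that both teams run against sample extracts before loading. The field names below come from the Supplier Reference model above; the function name is an assumption for illustration.

```python
import csv
import io

# Agreed fields for the Supplier Reference interface (from the Common Data Model).
EXPECTED_COLUMNS = [
    "Code",
    "Display Name",
    "Supplier Number",
    "Primary Address Street1",
    "Primary Address City",
    "Primary Address State",
    "Primary Address Postal Code",
    "Active Status",
    "Preferred Currency",
]

def validate_header(csv_text: str) -> list[str]:
    """Return a list of schema problems; an empty list means the header matches."""
    reader = csv.reader(io.StringIO(csv_text))
    header = next(reader, [])
    problems = []
    for col in EXPECTED_COLUMNS:
        if col not in header:
            problems.append(f"missing column: {col}")
    for col in header:
        if col not in EXPECTED_COLUMNS:
            problems.append(f"unexpected column: {col}")
    return problems
```

A check like this gives each team an objective definition of "done" for its half of the interface, which is what allows the parallel work described below.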

Defining this format allowed the customer BODS team and a Microsoft team to work in parallel.

Integration steps

The integration performs the following steps:

  1. BODS queries tables and objects from SAP system(s), performing joins, filters, and any other required transformations.
  2. The results are written to a CSV file within the SAP ecosystem.
  3. The CSV file is copied to Azure Blob Storage using AzCopy.
  4. From Blob Storage, Azure Data Factory and Azure Databricks are used to transform and load the data into Dataverse.
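The final step can be sketched in Python: rows from the BODS CSV are mapped to Dataverse attribute names and upserted through the Dataverse Web API, which supports create-or-update via a PATCH against an alternate key. The column-to-attribute mapping, table name, and key attribute below are assumptions for illustration; real names depend on the Dataverse table definition.

```python
import csv
import io
import json
import urllib.request

# Hypothetical mapping from BODS CSV columns to Dataverse attribute logical
# names (assumed for illustration; the real schema defines these).
COLUMN_MAP = {
    "Supplier Number": "new_suppliernumber",
    "Display Name": "new_name",
    "Preferred Currency": "new_currency",
}

def rows_to_payloads(csv_text: str) -> list[dict]:
    """Map rows from the BODS CSV extract to Dataverse upsert payloads."""
    reader = csv.DictReader(io.StringIO(csv_text))
    return [
        {logical: row[source] for source, logical in COLUMN_MAP.items()}
        for row in reader
    ]

def upsert(base_url: str, token: str, payload: dict) -> None:
    """Upsert one record via the Dataverse Web API.

    PATCH against an alternate key creates the record if it does not exist
    and updates it otherwise. The table set name (new_suppliers) and key
    attribute are hypothetical.
    """
    key = payload["new_suppliernumber"]
    url = f"{base_url}/api/data/v9.2/new_suppliers(new_suppliernumber='{key}')"
    req = urllib.request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        method="PATCH",
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
    )
    urllib.request.urlopen(req)
```

In the actual solution this transform-and-load work is handled by Azure Data Factory and Azure Databricks rather than hand-written scripts; the sketch only shows the shape of the mapping and the upsert call.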

Condition Checks