
Integrating SAP IBP with SAP Cloud Integration

Written by SAP PRESS | Feb 17, 2020 2:00:00 PM

SAP Cloud Integration for data services is required for automated data integration into SAP Integrated Business Planning (SAP IBP) from an SAP environment or from a legacy application.

Integration via SAP Cloud Integration for data services is executed through batch jobs; the frequency of the data transfer depends on how often those jobs run.


For data integration using SAP Cloud Integration for data services, you’ll perform the following steps:


1 Install and Configure the Data Services Agent for SAP Cloud Integration

Installing an agent for SAP Cloud Integration for data services enables data transfer between an on-premise data source and a cloud-based SAP IBP system; the agent ensures that this transfer is secure. To install and configure an agent, follow the steps provided in the Data Services Agent Guide at http://s-prs.co/v533807. After successful installation and configuration, this conduit can be used for data flows.


2 Create the Data Store

Data stores are created to connect SAP Cloud Integration for data services with your application and database. Through a data store connection, SAP Cloud Integration for data services can read data from the connected system and write it to the SAP IBP system, as well as read data from SAP IBP and write it back to the on-premise system. Data stores are created in the SAP Cloud Integration for data services web user interface (UI). The objects or data elements in a data store become available for use once you import them via SAP Cloud Integration for data services. For detailed steps on this process, refer to the technical configuration guide.
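To make the data store concept more concrete, here is a minimal sketch in Python that models what a data store definition captures. The class and field names (DataStore, system_type, and so on) are illustrative assumptions, not the actual configuration schema, which you maintain in the SAP Cloud Integration for data services web UI.

# Illustrative model of a data store definition; names are assumptions,
# not the actual SAP Cloud Integration for data services schema.
from dataclasses import dataclass, field

@dataclass
class DataStore:
    name: str               # logical name of the connection
    system_type: str        # e.g., an ABAP-based source or SAP IBP
    host: str               # system reached through the Data Services Agent
    imported_objects: list = field(default_factory=list)  # objects imported for data flows

# One data store per side: an on-premise source and the SAP IBP target.
erp_source = DataStore("ERP_SOURCE", "SAP Business Suite", "erp.example.corp",
                       imported_objects=["MARA", "KNA1"])
ibp_target = DataStore("IBP_TARGET", "SAP IBP", "myibp.example.ondemand.com",
                       imported_objects=["LOCATION", "PRODUCT"])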


3 Create the Project, Task, Data Flow, and Process

A project created in SAP Cloud Integration for data services is a container that holds related processes and tasks. The figure below shows the conceptual relationship between the project, process, task, and data flow. A task is created under a project, and its data flow maps source fields to target fields along with the extract, transform, and load (ETL) logic.

[Figure: Conceptual relationship between project, process, task, and data flow]
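If you prefer code to boxes and arrows, the containment hierarchy can be sketched in a few lines of Python. The class names mirror the concepts above; they are illustrative only and not an actual SAP API.

# Conceptual containment: a project holds processes and tasks, a task
# holds data flows, and a data flow holds the field mappings/ETL logic.
class DataFlow:
    def __init__(self, name, field_mappings):
        self.name = name
        self.field_mappings = field_mappings  # source field -> target field

class Task:
    def __init__(self, name, data_flows):
        self.name = name
        self.data_flows = data_flows          # the ETL logic lives here

class Process:
    def __init__(self, name, tasks):
        self.name = name
        self.tasks = tasks                    # executed in a start-to-end sequence

class Project:
    def __init__(self, name, processes):
        self.name = name
        self.processes = processes            # container for related work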

The next figure shows an example of a project (1), a process (2), and a task (3). The Edit, View, Schedule, Run Now, View History, and More Actions buttons (4) provide access to multiple operations related to tasks and processes, such as editing a task, executing a process chain, viewing the history of previous runs, or promoting objects to the production environment (5). The data flow of a task allows you to map source and target fields through simple drag-and-drop functionality.

[Figure: A project, process, and task in SAP Cloud Integration for data services, with the available task and process actions]

The following figure shows a typical data flow for a location master data task; the source and target fields are mapped by dragging and dropping. A process is created by assigning multiple tasks in a start-to-end relationship, which lets you execute the tasks in the required sequence.

[Figure: A typical data flow for a location master data task]
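Stripped of the UI, the mapping in such a data flow amounts to a source-to-target field dictionary plus a transformation step. The field names below are illustrative (the source fields are typical SAP ERP plant fields; the target attribute names are assumptions):

# Hedged sketch of a location master data field mapping.
location_mapping = {
    "WERKS": "LOCID",       # plant -> location ID
    "NAME1": "LOCDESCR",    # plant name -> location description (illustrative)
    "LAND1": "LOCREGION",   # country -> region (illustrative)
}

def transform_row(source_row):
    """Apply the field mapping to one extracted record (simplified ETL)."""
    return {target: source_row.get(source)
            for source, target in location_mapping.items()}

# One extracted row becomes one load-ready row:
print(transform_row({"WERKS": "1000", "NAME1": "Hamburg DC", "LAND1": "DE"}))
# -> {'LOCID': '1000', 'LOCDESCR': 'Hamburg DC', 'LOCREGION': 'DE'}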

The final figure shows a process with multiple master data tasks assigned in a linear start-to-end relationship. In this example, the task for transferring location/product master data, represented by DF_LOCPRD, is executed after the location, customer, and product tasks.

[Figure: A process with multiple master data tasks in a linear start-to-end sequence]
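In code terms, such a linear process is just an ordered list of tasks, each of which must finish before the next starts. The task names below mirror the example, and run_task is a hypothetical stand-in for the actual task execution:

# Linear start-to-end process; DF_LOCPRD runs last because it depends on
# the location, customer, and product master data tasks.
process_sequence = ["DF_LOCATION", "DF_CUSTOMER", "DF_PRODUCT", "DF_LOCPRD"]

def run_task(name):
    print(f"Running task {name} ...")  # placeholder for the real data transfer

for task in process_sequence:
    run_task(task)  # each task completes before the next one starts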

4 Execute the Process/Task

The execution of a task performs the data transfer for the data elements selected in the task. Global variables are part of a task’s execution properties and control the execution logic, such as the target planning area and the nature of the update. The nature of the update determines whether the data transfer follows an insert/update, replace, or delete methodology.


The Insert/Update option inserts new records from the source table into the target table and updates matching records, leaving the rest of the existing target data untouched. The Replace option removes the existing data from the target system and then loads the data from the source system. The Delete option only removes data: records in the target system that match the incoming source data are deleted.
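The semantics of the three options can be summarized in a few lines of Python, modeling the target as a dictionary keyed by primary key. This is a conceptual sketch of the behavior described above, not how SAP IBP applies the load internally.

# Simplified semantics of the three update methods.
def apply_load(target, source, method):
    """target and source are {primary_key: record} dictionaries."""
    if method == "insert_update":
        target.update(source)        # insert new keys, update matching ones
    elif method == "replace":
        target.clear()               # drop existing target data ...
        target.update(source)        # ... then load the source data
    elif method == "delete":
        for key in source:           # remove target rows matched by the source
            target.pop(key, None)
    return target

target = {"L1": "Hamburg", "L2": "Boston"}
print(apply_load(dict(target), {"L2": "Boston DC", "L3": "Pune"}, "insert_update"))
# -> {'L1': 'Hamburg', 'L2': 'Boston DC', 'L3': 'Pune'}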


Processes and tasks can be executed manually on demand or scheduled as batch jobs. A batch job can be scheduled for a single execution or run periodically. Whether a task succeeds or fails is displayed in SAP Cloud Integration for data services through a symbol and a color (green for success, red for failure). The SAP Cloud Integration for data services dashboard also shows analytics about previously completed tasks and provides information about tasks scheduled for execution.


Editor’s note: This post has been adapted from a section of the book SAP Integrated Business Planning: Functionality and Implementation by Sandy Markin and Amit Sinha.