Technology Blogs by SAP
Learn how to extend and personalize SAP applications. Follow the SAP technology blog for insights into SAP BTP, ABAP, SAP Analytics Cloud, SAP HANA, and more.
Product and Topic Expert
The DI (Data Intelligence) RMS (Replication Management Service) provides initial loads, or an initial load followed by continuous delta replication; sometimes, however, a scheduled delta is preferred.

In this blog post I will describe how we can achieve a scheduled delta with an RMS Flow and/or RMS Tasks. At the time of writing (November 2022) the Data Intelligence scheduler doesn’t support this, which is why this blog post exists.

  1. Identify the Internal API Calls

  2. Additional API Calls Identified

  3. Data Intelligence pipeline for scheduling

1. Identify the Internal API Calls

Figure 1.1 RMS Flows vs RMS Tasks

For clarity, Figure 1.1 shows the relationship of RMS Flows to RMS Tasks: one flow can contain multiple tasks.

The official public DI API does not cover the RMS (Replication Management Service), so we must use the internal (unsupported) API; in my testing this has worked flawlessly. In fact, most actions within Data Intelligence use internal APIs, including the management of RMS Flows and RMS Tasks.

To understand the RMS internal API, open the browser developer tools and inspect the API calls while using the RMS UI.
Here, the GET calls are mainly used to populate the UI, while the PUT calls stop and start the flows or tasks.

Figure 1.2: Developer Tools, Network trace

The requirement I have is to stop and start an entire RMS Flow. This is performed by the PUT calls, using the query parameter requestType; the flow name makes up the last part of the URL. The resulting API call is below.

requestType parameters
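To make this concrete, the sketch below assembles such a PUT request URL. Note that the URL path (`/app/rms/api/v1/flows/`) and the requestType values (`START`/`STOP`) are illustrative assumptions based on my own network trace; verify the exact path and values in your tenant's developer tools.

```javascript
// Sketch: assemble the internal RMS flow start/stop request.
// NOTE: the path and requestType values are assumptions taken from
// a network trace -- confirm them in your own tenant.
function buildFlowActionRequest(tenantUrl, flowName, action) {
  if (!["START", "STOP"].includes(action)) {
    throw new Error(`Unsupported action: ${action}`);
  }
  // The flow name makes up the last part of the URL.
  const url = new URL(
    `/app/rms/api/v1/flows/${encodeURIComponent(flowName)}`,
    tenantUrl
  );
  url.searchParams.set("requestType", action);
  return { method: "PUT", url: url.toString() };
}

// Example:
const req = buildFlowActionRequest(
  "https://my-tenant.example.com", "MY_FLOW", "STOP"
);
console.log(req.url);
// -> https://my-tenant.example.com/app/rms/api/v1/flows/MY_FLOW?requestType=STOP
```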



As described in the public Data Intelligence API documentation, to use a Data Intelligence PUT API call you need to send the header below.
x-requested-with: Fetch
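A minimal sketch of building these headers, assuming basic authentication with a `tenant\user` credential format (the `x-requested-with: Fetch` header is the documented requirement; the credential format is an assumption, so use whatever your tenant accepts):

```javascript
// Sketch: headers for a Data Intelligence PUT call.
// "x-requested-with: Fetch" is required for modifying calls;
// the tenant\user basic-auth format is an assumption.
function buildDiHeaders(tenant, user, password) {
  const credentials = Buffer
    .from(`${tenant}\\${user}:${password}`)
    .toString("base64");
  return {
    "x-requested-with": "Fetch",
    "Authorization": `Basic ${credentials}`,
  };
}
```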

Initial testing was done with Postman to confirm the API behaviour.

Figure 1.3: Postman to Validate API Calls


2. Additional API Calls Identified

Here are some additional API calls that were identified and could be useful to enhance the workflow and/or capture statistics on the performance of the replication process. These are not necessary for the scheduling process and are included for reference.

GET all flow statuses
source, target, status, change date, user

GET specified flow details and tasks
source, target, connection, container, output format, load type

GET specified flow configuration
priority, maximum connections for source and target

GET (Monitor) specified flow
source, target, connection, container, output format, status, duration

GET (Monitor) tasks in specified flow
source, target, priority, load type, status, number of records, number of partitions, duration, bytes sent

GET status of last flow (PUT) action
status, time of change

PUT start specified flow

PUT stop specified flow

PUT start specified task(s) within flow
The request body must specify the task name(s), as returned by the “GET specified flow details and tasks” API call.

PUT stop specified task(s) within flow
The request body must specify the task name(s), as returned by the “GET specified flow details and tasks” API call.
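As a sketch, the task-level start/stop call could be assembled as below. The URL path and the body shape (a JSON list of task names) are my assumptions from the network trace; the task names themselves come from the “GET specified flow details and tasks” call.

```javascript
// Sketch: start or stop individual tasks within an RMS flow.
// Path and body shape are assumptions -- verify in a network trace.
function buildTaskActionRequest(tenantUrl, flowName, taskNames, action) {
  const url = new URL(
    `/app/rms/api/v1/flows/${encodeURIComponent(flowName)}/tasks`,
    tenantUrl
  );
  url.searchParams.set("requestType", action);
  return {
    method: "PUT",
    url: url.toString(),
    // Task names as shown by "GET specified flow details and tasks".
    body: JSON.stringify({ tasks: taskNames }),
  };
}
```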


3. Data Intelligence pipeline for scheduling

To be able to schedule these actions I have created a simple pipeline that calls these APIs and logs the output of the call. The pipeline, which triggers the RMS flow, can then be scheduled to run at the required times.

Using the Data Intelligence connection management provides a secure place to store the credentials.

Figure 3.1: Open API Connection

We can then use this connection in the OpenAPI pipeline operator. The flow name is specified as a placeholder parameter, which is set at runtime or during scheduling.

Figure 3.2: OpenAPI Pipeline

The operator is built using JavaScript, which makes it lightweight and easily transportable between environments.

Figure 3.3: Set Header Code

The set header operator passes the two attributes required by the API: the header and the query parameter requestType.
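A minimal sketch of that set-header logic, assuming the `openapi.header_params.*` and `openapi.query_params.*` message attribute names consumed by the OpenAPI client operator (verify these against your Data Intelligence version); the resulting message would then be emitted on the operator's output port:

```javascript
// Sketch of the set-header operator body. The attribute names are
// assumptions about what the OpenAPI client operator expects;
// requestType would be e.g. "START" or "STOP".
function buildMessage(requestType) {
  return {
    Attributes: {
      "openapi.header_params.x-requested-with": "Fetch",
      "openapi.query_params.requestType": requestType,
    },
    Body: "",
  };
}
```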

Figure 3.4: Log API Output

I captured the API response to monitor how the API handled the request. Storing this in the pipeline log provides traceability.

This pipeline can now be run directly or scheduled to stop and start the RMS Flows, with the same effect as doing so from the user interface. The pipeline does not check the status or activity of the RMS Flow; it simply allows the RMS Flow to be scheduled. One pipeline starts the RMS Flow and another stops it.

Figure 3.5: Schedule pipeline

I have placed the pipeline JSON file in a shared GitHub repository; it can be imported directly into the Data Intelligence Modeler.


With a small amount of development effort, we can interact with the internal RMS API to stop and start RMS Flows. Far more is possible with the RMS API, including logic that captures RMS performance and tracks activity and status.