Technology Blog Posts by SAP
andreas_krompholz
Product and Topic Expert

SAP Datasphere has recently introduced the new connection type "SAP Signavio," enabling Replication Flows to replicate data directly to SAP Signavio Process Intelligence. This allows CDS Views and ABAP tables from SAP ERP systems to be replicated in a manner similar to any other replication use case in Datasphere.

On the SAP Signavio Process Intelligence side, the replication flow pushes the data into the object store of the customer's Signavio workspace. The replication flow can then be mapped to a Source Data object, which provides the data from the corresponding ABAP tables and CDS Views to the Signavio ETL pipeline that generates the process mining event log.

High-level component overview

The diagram illustrates the complete data flow:

  • Left: Source systems (S/4HANA Public Cloud for CDS Views; S/4HANA/ECC with SLT for ABAP tables)
  • Center: SAP Datasphere orchestrates the replication flow
  • Right: SAP Signavio Process Intelligence receives data as Source Data objects
  • The dashed vertical line represents the boundary between customer landscape and internet connectivity via SAP Cloud Connector

We now demonstrate how to create a replication flow for:
(a) the customer object (CDS View i_customer) in S/4HANA Public Cloud Edition, and
(b) the sales order object (ABAP tables VBAK and VBAP) in S/4HANA or SAP ECC.

Prerequisites:

  • We have a user and space in the SAP Datasphere service on BTP and are authorized to create a new connection and to create and execute replication flows. Note that we use replication in the pass-through option and don't require SAP HANA as a staging area within SAP Datasphere (see the Replication Flow blog series).
    We have noted down the Datasphere tenant ID (System => About => Tenant).
  • For the replication of CDS Views we have configured the integration scenario SAP_COM_0532 in our S/4HANA Public Cloud Edition (see the blog S/4HANA Public Cloud Integration with SAP Datasphere).
  • For the replication of ABAP tables we have configured the SAP Landscape Transformation component (SLT) in our on-premise S/4HANA system, or we use a standalone SLT system connected to our S/4HANA system. We have enabled connectivity from the ABAP landscape to SAP Datasphere via SAP Cloud Connector (see Replicating table data from an SAP ECC system with SAP Datasphere using Replication Flows).
  • In SAP Signavio we have access to a workspace subscribed to Process Intelligence and are authorized to perform Process Data Pipeline steps.

Connectivity Handshake Signavio <-> Datasphere:

To establish a secure connection between SAP Datasphere and SAP Signavio, an OAuth flow is required. We start by creating an OAuth client in our Datasphere tenant. Navigate to System => Administration and choose the tab App Integration. Clicking "Add an OAuth Client" creates your OAuth client for Signavio. Give it a name, choose the purpose "Technical User", enter a name for the technical user to be created, and assign the appropriate roles (refer to the documentation to assign the required scoped Integrator role).


Once the system has created the OAuth client, it provides you with the OAuth credentials. Make sure to copy the credentials and note them down; you will need them in the next step.

We now switch to Signavio and use the OAuth client we created to establish a secure connection. Create a connection of type "SAP Datasphere Replication Flow" in Signavio Process Intelligence. Note that there is also a connection type "SAP Datasphere", which reads data from the SAP HANA instance in SAP Datasphere; that is not the integration we are looking at here, so make sure to select the one with "Replication Flow" in its name.

In the next screen we enter the OAuth Client Credentials we have noted down earlier:


  • OAuth Client ID: Client ID you have noted down in the previous step
  • OAuth Client Secret: secret you have noted down in the previous step
  • Service Root: URL root of the Datasphere App Integration page where you create the OAuth client
  • OAuth Access Token URL: Token URL you find in the overview of the OAuth Clients in SAP Datasphere (ends with /oauth/token)

Having created the connection in Signavio, we have established the OAuth connectivity handshake between Signavio and Datasphere and can work over a secure connection.

Let's now create the corresponding Signavio connection in Datasphere.

We log on to our SAP Datasphere instance and select the space where we create the connections and replication flows.
In the left menu bar we choose Connections and create a new connection of type Signavio.


In the following screen we enter the connection URL of the respective Signavio landscape. In our case this is EU, which corresponds to the URL https://api-mtls.eu.signavio.cloud.sap. You can find the right API gateway endpoints in the SAP Signavio documentation on regions, IP addresses and URLs.


After entering a name for the connection we can save and validate the connection:


The Datasphere tenant is automatically mapped to the corresponding Signavio connection in which we had entered the respective Datasphere tenant ID. We cannot map this Datasphere tenant to another Signavio workspace unless we delete the connection in Signavio.

Replicate customer information (CDS View i_customer)

After establishing the Signavio connection, we create a connection to the SAP S/4HANA system to replicate the CDS View i_customer. We create a new connection of type S/4HANA Public Cloud:

 


We use the certificate or user/password from the Communication Arrangement for Communication Scenario SAP_COM_0532.

Once the connection is created and validated, we can continue to configure our Replication Flow for the CDS View i_customer. Choose Data Builder and create a new Replication Flow:


We perform the following steps to configure the source of Replication Flow:

  1. Select your S/4HANA Public Cloud as source connection
  2. Choose CDS View Extraction as source container
  3. Select CDS View I_CUSTOMER 


We now configure:

  • Load Type: Initial and Delta
  • our recently created Signavio connection as target
  • Content Type "Template Type" to allow transformation of ABAP types such as date/time in Signavio
  • the name of the Replication Flow


Once we have saved and deployed the Replication Flow we run it. In the Integration monitor we check the status of the replication:


Once the initial load is finished, delta handling regularly replicates the changes to Signavio.

Let's now switch back to SAP Signavio Process Intelligence and create a corresponding Source Data object.
We create a new Source Data object for our new replication flow RF_Customer and assign the connection created earlier to it.


Once created, we can sync the data, which processes the data provided by the replication flow from SAP Datasphere. In particular, the initial load and any deltas are merged, giving us a replica of the ABAP source data in Signavio. This may take a few minutes depending on the size of the CDS Views.


Note that whenever the replication flow sends delta information, a new sync is required to update the source data.

To validate the replication or use the data, we have to create a Process Data Pipeline and an SQL script. We create a new Process Data Pipeline based on a blank template and choose our source data RF_Customer.


We create a new Process Data Model with business object Customer and open the SQL Editor for a new event collector.


We can now use the preview option to validate the extraction, e.g. showing the top 100 entries, counting the replicated entities, etc.
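In the pipeline's SQL editor, such validation queries can be sketched along the following lines. Mind that the table name I_CUSTOMER under which the replicated CDS View is exposed is an assumption for illustration; check your source data for the exact name, and adapt the queries to the SQL dialect of the Signavio editor:

```sql
-- Top 100 replicated customer records (assumed table name: I_CUSTOMER)
SELECT * FROM I_CUSTOMER LIMIT 100;

-- Count the replicated entities to compare against the source system
SELECT COUNT(*) AS customer_count FROM I_CUSTOMER;
```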

Replicate sales order information (ABAP tables VBAK and VBAP)
In the previous chapter we successfully replicated the data from a CDS View. If we have ABAP tables to replicate, we have to leverage the SAP Landscape Transformation Replication Server (SLT) integration with Datasphere.

Assuming the SLT component is configured in our ABAP system (or we use a dedicated SLT server), we can create a new mass transfer ID in SLT. We create it using transaction code LTRC.


Click on Create Configuration and enter the following information in the subsequent screens:

  • Configuration Name and Description
  • RFC Destination and (optionally) Read from Single Client flag
  • SAP Data Intelligence (Replication Management Service) under Other in Target System
  • Job options (we may start with 1 job first)

We have finished our SLT preparation step. All subsequent replication configuration (tables, fields, filters, etc.) is now done in the Replication Flow in Datasphere and then pushed down to your SLT configuration. Let's switch to our Datasphere tenant and create a new connection to our SLT instance.

When creating a new connection choose SAP ABAP as system type:

  • Protocol: RFC
  • Connection Type: Application Server (or Message Server if you prefer)
  • Application Server, System Number and Client
  • Choose Cloud Connector to allow on-premise connectivity
  • Authentication Type: for user/password choose your ABAP user for SLT with appropriate roles

We can save and validate our new connection.


 

Once the connection is established, we create a new Replication Flow for our ABAP table replication. Create the replication flow similarly to the CDS View one, but use SLT as the source container:

  • Business Name: RF_SalesOrder
  • Source Connection: our SLT connection
  • Source Container: "SLT - SAP LT Replication Server"
    • our mass transfer ID from the ABAP system
  • Source Objects: ABAP Tables VBAK and VBAP 
  • Load Type: Initial and Delta
  • Content Type: Template Type
  • (optional): add a projection to filter the data or reduce the number of fields
  • Target Connection: your Signavio connection (same as used for CDS View replication)


Optionally, you may add filters or field projections, which are pushed down to SLT and reduce the data volume to be transferred.

Save, deploy and run the replication flow. To monitor the transfer, you can use the Data Integration Monitor in Datasphere and the monitoring tool in SLT. Data is gathered by SLT; Datasphere then pulls it and directly forwards it to Signavio.


You see the replication status of each table and some metrics on the replication run, e.g. 886,757 entries from VBAK were initially replicated in 5:12 minutes using 9 partitions, and 3 delta updates have already been executed.

Once the data is replicated, the replication flow shows the status "Active (retrying objects)", as we have chosen the Initial and Delta load type. The Replication Flow continuously checks for updates in the ABAP tables and replicates them to Signavio.

Let's switch to Signavio and create a corresponding Source Data object for our sales order replication flow.


We have selected the same Datasphere connection as we have used for the CDS Views, and the RF_SalesOrder replication flow.

In the following screen we sync the replication flow. 


Once the sync has been processed, we can see the tables VBAK and VBAP alongside the sync log.


 

With the source data prepared for further processing, we can now create a Process Data Pipeline to generate the process mining event log.

In our pipeline we create a new business object SalesOrder and create the SQL scripts for the case and event collectors. As an example, we create the event collector "create sales order".

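A minimal sketch of such an event collector could look like the following. The column choices (VBELN, ERDAT, ERNAM from VBAK) and the output column names are illustrative assumptions, not the exact script from the screenshots; align them with your process data model and the Signavio SQL dialect:

```sql
-- Hypothetical event collector "create sales order":
-- one event per sales order header in VBAK.
SELECT
    VBELN                AS case_id,     -- sales order number as case ID
    'Create Sales Order' AS event_name,  -- constant event name
    ERDAT                AS event_time,  -- creation date of the order
    ERNAM                AS event_user   -- user who created the order
FROM VBAK
```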

We've demonstrated how to leverage SAP Datasphere Replication Flows to replicate and synchronize data from source systems to SAP Signavio Process Intelligence. Give this method a try and share your results and experiences in the comments below!