Replication Flow is one of the main artifacts for ingesting data into SAP Data Sphere from source systems (SAP or non-SAP).
There are two extraction types, Initial Only and Initial and Delta; whether the Delta option is enabled in a Replication Flow depends on the source connection type.
Initial Only - On every run, all records are transferred from the source object into Data Sphere.
Initial and Delta - This job executes in two steps.
1. On the first run, all available records are transferred from the source object into Data Sphere.
2. Afterwards, only a Delta job is triggered, capturing just the changed records: new, modified, and deleted.
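As a rough mental model of the two load types (this is purely illustrative Python, not an SAP API — all function names, the `op` codes, and the dictionary layout are my own assumptions):

```python
def apply_initial(rows, target):
    # Full load: replace everything in the target with the source snapshot.
    # This is what "Initial Only" repeats on every run.
    target.clear()
    for row in rows:
        target[row["key"]] = row

def apply_delta(changes, target):
    # Delta load: apply only the changed records after the first run.
    for ch in changes:
        if ch["op"] == "D":               # record deleted in the source
            target.pop(ch["key"], None)
        else:                             # "I" (new) or "U" (modified)
            target[ch["key"]] = ch["row"]

target = {}
apply_initial([{"key": 1, "country": "DE"},
               {"key": 2, "country": "FR"}], target)
apply_delta([{"op": "D", "key": 1},
             {"op": "U", "key": 2, "row": {"key": 2, "country": "FRA"}},
             {"op": "I", "key": 3, "row": {"key": 3, "country": "IN"}}], target)
```

The point of the sketch: "Initial Only" pays the full-snapshot cost every run, while "Initial and Delta" pays it once and then moves only the changes.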
In this blog, I will explain step by step how to create a Replication Flow using an ABAP connection, with a use case that ingests master data (an InfoObject) from an SAP BW system into SAP Data Sphere.
Use Case:
Source: Master Data (ZCOUNTRY) in SAP BW system
Target: Table in SAP Data Sphere
Extraction: Replication Flow
Connection Type: ABAP
Delta Option: Not available for master data objects with the ABAP connection type
Below are the steps involved in creating a Replication Flow in SAP Data Sphere.
------------------------------------------------------------------------------------------------------------------------
Step 1: Selecting the Replication Flow
Log in to SAP Data Sphere -> Main menu -> Data Builder -> Replication Flow
Step 2: Choose the connection
Select source connection
Note: An SAP ABAP connection from SAP BW to SAP Data Sphere must be configured as a prerequisite.
Select the Source Container configured for BW
Select Add Source Objects
Select the InfoObject ZCOUNTRY under its InfoArea in the BW system; here the import is at the master data table level.
ZCOUNTRY$P - Attribute table
ZCOUNTRY$T - Text table
Select the tables and click Next to import.
Step 3: Configuring Replication Flow
Once the selected tables are imported, a target in Data Sphere must be selected to store the data in a local table.
Select Target Connection -> Select Local Repository configured for Data Sphere
After selecting the target connection, the target tables become available with a one-to-one mapping from the source.
There are two options available.
Projections
While loading data, filter or mapping conditions can be applied:
Filtering - restrict the load to the required values
Mapping - adjust target column names, data types, etc.
Load Type is the extraction method, and this is where the Delta job can be configured.
However, with the ABAP connection type, only one setting is available for master data objects: 'Initial Only', which means all records are transferred on every data load.
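Conceptually, a projection filters rows and then remaps columns before they land in the target. A minimal sketch of that idea (illustrative Python only — `project`, `keep`, and `rename` are my own names, not Data Sphere functionality):

```python
def project(rows, keep=None, rename=None):
    # keep:   predicate deciding which rows pass the filter
    # rename: mapping of source column name -> target column name
    keep = keep or (lambda row: True)
    rename = rename or {}
    return [
        {rename.get(col, col): val for col, val in row.items()}
        for row in rows
        if keep(row)
    ]

rows = [{"COUNTRY": "DE"}, {"COUNTRY": "US"}]
# Keep only non-US rows and rename the column in the target.
out = project(rows,
              keep=lambda r: r["COUNTRY"] != "US",
              rename={"COUNTRY": "LAND"})
```

In the Replication Flow UI the same two knobs appear as the Filtering and Mapping settings of a projection.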
Save the Replication Flow in the required folder and deploy the artifact. After the deployment notification appears, run the job and monitor it via Tools.
Deployment notification
Step 4: Monitoring the Replication flow
Below is the monitoring screen, where extraction details such as runtime, number of records, loading status, and partition details are available.
At the time of extraction
After extraction
Step 5: View the data in the target table.
Go to Data Builder -> select the table from the saved folder -> click View Data
Data in the source table (SAP BW)
In the following blogs, I will explain the Delta configuration and additional settings in the Replication Flow.
Thanks a lot for your time in reading through the content.
Regards
Lokesh Kumar Pothapola