
Note: This blog post is applicable to both SAP Datasphere and SAP Datasphere in SAP Business Data Cloud.

Hey everyone, I am back again and this time with SAP Profitability and Performance Management Cloud (SAP PaPM Cloud) - Universal Model (UM) integration 😀

You may occasionally receive integration questions like:
a. I have my input data available in Datasphere (in SAP BDC); how can I push it to UM?
b. Now that I have the results of my UM calculation logic, how can I pull them back to Datasphere (in SAP BDC)?

I wanted to share one solution to these questions, so here is a how-to guide on how Datasphere (in SAP BDC) leverages Replication Flows to move data outbound to and inbound from SAP PaPM Cloud – UM.

Prerequisite:

1. In the Connections section of your Datasphere (in SAP BDC) tenant, create an SAP HANA connection.

a. Choose Create Connection
b. Choose Feature > Replication Flows
c. Choose SAP HANA

[Screenshot: Create SAP HANA Connection]

2. In the popup that appears, fill in the required information, then in the next step provide a business name, which will serve as the description of the connection.

Note: Database information and credentials are available from SAP PaPM Cloud Standard Model > Administration > Settings. Use SAP_PAPM_ADMIN, as it inherently has the database privileges to update data in the generated Z tables of SAP_PAPM_UM.

a. For Host, use the Database Host.
b. For Port, use the Database Port.
c. For User Name, use the database user name.
d. For Password, use the database user password.

[Screenshot: Connection Credentials]

3. Choose Next Step and finish creating the connection. The connection will now be available for use.
[Screenshot: Connection Created]
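Optionally, before using the connection in a Replication Flow, you can sanity-check the database host, port, and credentials from any machine that can reach the PaPM Cloud HANA endpoint. Below is a minimal sketch using the hdbcli Python client (not part of the steps above); the host, port, and password values are placeholders to be replaced with the values from Administration > Settings.

```python
# Minimal connectivity check for the SAP PaPM Cloud HANA endpoint,
# using the SAP HANA Python client (pip install hdbcli).
# Host, port, and password are placeholders - use the values shown
# under Standard Model > Administration > Settings.
from hdbcli import dbapi

conn = dbapi.connect(
    address="<database-host>",       # Database Host from PaPM Cloud settings
    port=443,                        # Database Port (placeholder)
    user="SAP_PAPM_ADMIN",           # user with privileges on SAP_PAPM_UM
    password="<database-password>",
    encrypt=True,                    # the endpoint requires TLS
)

cursor = conn.cursor()
cursor.execute("SELECT CURRENT_USER, CURRENT_SCHEMA FROM DUMMY")
print(cursor.fetchone())             # e.g. ('SAP_PAPM_ADMIN', ...)
cursor.close()
conn.close()
```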

How to push data to Universal Model?

Note: The idea is that the data will be pushed directly into the Z_<ENV>_<FUNCTIONID> table of the Model Entity generated in Universal Model, so make the fields of the source table in Datasphere (in SAP BDC), including primary keys and data types, consistent with that target table. Alternatively, you can replicate the source table to your preferred schema and access it via SAP PaPM Cloud - UM HANA Schema Connections.
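To see exactly which columns and data types the generated Z table expects, so that the Datasphere (in SAP BDC) source table can be defined to match, you can query the TABLE_COLUMNS system view on the PaPM Cloud database. This is just a sketch with the same hdbcli client and placeholder connection values; Z_<ENV>_<FUNCTIONID> stands for your actual generated table name.

```python
# Inspect the structure of the generated Z table in SAP_PAPM_UM so the
# Datasphere source table can be modeled with matching fields and types.
# "Z_<ENV>_<FUNCTIONID>" is a placeholder - substitute the real name.
from hdbcli import dbapi

conn = dbapi.connect(address="<database-host>", port=443,
                     user="SAP_PAPM_ADMIN", password="<database-password>",
                     encrypt=True)
cursor = conn.cursor()
cursor.execute(
    """
    SELECT COLUMN_NAME, DATA_TYPE_NAME, LENGTH, IS_NULLABLE
    FROM SYS.TABLE_COLUMNS
    WHERE SCHEMA_NAME = 'SAP_PAPM_UM'
      AND TABLE_NAME  = 'Z_<ENV>_<FUNCTIONID>'
    ORDER BY POSITION
    """
)
for name, data_type, length, nullable in cursor.fetchall():
    print(f"{name:<30} {data_type:<12} {length:>6}  nullable={nullable}")
cursor.close()
conn.close()
```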

Now let’s start 😊

1. A local object (e.g., a table) is available in Datasphere (in SAP BDC), which will act as the source object.

[Screenshot: Local Table in Datasphere]

2. Once the data is prepared, create a Replication Flow in Datasphere (in SAP BDC).

a. Choose Data Builder
b. Choose Create Replication Flow
c. Choose Select Source Connection

[Screenshot: Create Replication Flow and Select Source Connection]

d. Choose SAP Datasphere
e. Choose Add Source Objects

[Screenshot: Select SAP Datasphere and Add Source Objects]

f. Choose the local table created in the previous step.
g. Choose OK. The source object will now appear on the left-hand side of the Replication Flow screen.

[Screenshot: Select Datasphere Data and Choose OK]

3. Select the target object from the SAP_PAPM_UM container

a. Choose the button for selecting the target connection
b. Select the created connection to Universal Model
c. Choose the button for selecting the target container
[Screenshot: Choose Target Connection]

d. Choose SAP_PAPM_UM container
e. Choose the three dots
f. Select Map to Existing Target Object
g. Search for the function name or Z artifact into which data from Datasphere (in SAP BDC) will be loaded, then choose the UM-generated artifact (a sketch for listing the available Z tables follows this step).
Note: The environment containing the Model Entity in UM must be activated for the Z artifact to appear in the list.
h. Choose Next, then on the next page choose Add Selection.
[Screenshot: Map to Existing UM Z Tables]
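If you are unsure of the exact Z artifact name to search for in step g, a quick way (outside of the Replication Flow UI) is to list the generated Z tables in the SAP_PAPM_UM schema directly on the database. A sketch, again with hdbcli and placeholder connection values:

```python
# List the generated Z tables in the SAP_PAPM_UM schema to find the
# exact artifact name to search for when mapping the target object.
from hdbcli import dbapi

conn = dbapi.connect(address="<database-host>", port=443,
                     user="SAP_PAPM_ADMIN", password="<database-password>",
                     encrypt=True)
cursor = conn.cursor()
cursor.execute(
    """
    SELECT TABLE_NAME
    FROM SYS.TABLES
    WHERE SCHEMA_NAME = 'SAP_PAPM_UM'
      AND TABLE_NAME LIKE 'Z%'
    ORDER BY TABLE_NAME
    """
)
for (table_name,) in cursor.fetchall():
    print(table_name)
cursor.close()
conn.close()
```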

4. Perform mapping of fields

a. Choose the target object so that the mapping options appear
b. In the properties panel on the right, under Projection, choose the Add button
c. Add a name for the mapping
d. Choose the Mapping button
e. Map the source fields to the target fields.
f. Choose Save
g. Deploy the replication flow
h. Run the replication flow

[Screenshot: Projection Mapping of Source and Target, Deploy and Run]

5. After the replication flow has finished running, check the database artifact or use Show Data on the function into which the data was pushed. The data is now available in Universal Model.
[Screenshot: Data Now Available in Universal Model]
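If you prefer to verify the load on the database directly instead of through Show Data, a simple row count against the target Z table is enough. Same placeholder connection values and table name as before:

```python
# Quick verification that the replication flow actually wrote rows
# into the target Z table in SAP_PAPM_UM.
from hdbcli import dbapi

conn = dbapi.connect(address="<database-host>", port=443,
                     user="SAP_PAPM_ADMIN", password="<database-password>",
                     encrypt=True)
cursor = conn.cursor()
# "Z_<ENV>_<FUNCTIONID>" is a placeholder for the generated table name.
cursor.execute('SELECT COUNT(*) FROM "SAP_PAPM_UM"."Z_<ENV>_<FUNCTIONID>"')
print("Rows loaded:", cursor.fetchone()[0])
cursor.close()
conn.close()
```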

 

How to pull data from Universal Model?

1. After the calculation logic has run, the resulting data is available in Universal Model.

[Screenshot: UM Data Source with Profit Allocation Results]

2. Using what we learned from the push scenario, create a Replication Flow in Datasphere (in SAP BDC), but this time UM will be the source and Datasphere (in SAP BDC) will be the target. Deploy, then run.

[Screenshot: Set Up UM Source and Datasphere Target]

Note: In this example, I replicated the UM table into Datasphere (in SAP BDC) by taking over the whole table with a small adjustment to the naming, so no mapping into an existing Datasphere (in SAP BDC) table took place. If a local table is available as the target table, you can still map to it just as in the push scenario.

3. Data from Universal Model is now available in Datasphere (in SAP BDC) for further usage.

[Screenshot: UM Data Available in Datasphere]
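As an optional cross-check from the Datasphere (in SAP BDC) side, you can read a few rows of the replicated table through the tenant's HANA Cloud endpoint. This assumes you have created a Datasphere database user with read access to the space schema, which is a separate setup step not covered in this post; all names below are placeholders.

```python
# Optional check from the Datasphere side: read a few rows of the
# replicated UM table through the Datasphere HANA Cloud endpoint.
# Assumes a Datasphere database user with read access to the space
# schema exists; host, user, schema, and table names are placeholders.
from hdbcli import dbapi

conn = dbapi.connect(address="<datasphere-host>", port=443,
                     user="<SPACE>#<DB_USER>", password="<db-user-password>",
                     encrypt=True)
cursor = conn.cursor()
cursor.execute('SELECT TOP 5 * FROM "<SPACE_SCHEMA>"."<REPLICATED_UM_TABLE>"')
for row in cursor.fetchall():
    print(row)
cursor.close()
conn.close()
```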

In conclusion, the integration scenario presented above illustrates one way of making SAP PaPM Cloud Universal Model and Datasphere (in SAP BDC) data available for each other's use. As a disclaimer, I'm not a Datasphere (in SAP BDC) expert and there may be more optimized ways to do this, but as far as my investigation goes, this example is worth sharing 😊 I hope it inspires you to explore other integration options. You'll hear from me again soon!