Note: This blog post applies to both SAP Datasphere and SAP Datasphere in SAP Business Data Cloud.
Hey everyone, I am back again and this time with SAP Profitability and Performance Management Cloud (SAP PaPM Cloud) - Universal Model (UM) integration 😀
You may occasionally receive integration questions like:
a. I have my input data available in Datasphere (in SAP BDC); how can I push it to UM?
b. Now that I have the results of my UM calculation logic, how can I pull them back to Datasphere (in SAP BDC)?
I want to share one solution to these questions: a how-to guide on how Datasphere (in SAP BDC) leverages Replication Flows to perform outbound and inbound transfers of data to and from SAP PaPM Cloud – UM.
1. In the Connections section of your Datasphere (in SAP BDC) tenant, create an SAP HANA connection.
a. Choose Create Connection.
b. Choose Feature > Replication Flows.
c. Choose SAP HANA.
2. In the popup that appears, fill in the required information; on the next step, enter a business name, which serves as the description of the connection.
Note: The database information and credentials are available from SAP PaPM Cloud Standard Model > Administration > Settings. Use SAP_PAPM_ADMIN, as it inherently has the database privileges to update data in the generated Z tables of SAP_PAPM_UM.
a. For Host, use the Database Host.
b. For Port, use the Database Port.
c. For User Name, use the Database User Name.
d. For Password, use the Database User Password.
3. Choose Next Steps and finish creating the connection. The connection is now available for use.
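If you want to sanity-check the database host, port, and credentials outside Datasphere before entering them in the connection, a minimal sketch using SAP's hdbcli Python driver could look as follows. The host and password values are placeholders; take the real values from Administration > Settings as noted above.

```python
def papm_connection_params(host: str, port: int, user: str, password: str) -> dict:
    """Assemble the keyword arguments that hdbcli's dbapi.connect() expects.

    The actual values come from SAP PaPM Cloud Standard Model >
    Administration > Settings; the ones used below are placeholders.
    """
    return {
        "address": host,
        "port": port,
        "user": user,
        "password": password,
        "encrypt": True,  # SAP HANA Cloud endpoints require TLS
    }


# Usage with the official SAP HANA Python driver (pip install hdbcli):
#
#   from hdbcli import dbapi
#   conn = dbapi.connect(**papm_connection_params(
#       "<database-host>", 443, "SAP_PAPM_ADMIN", "<database-password>"))
#   conn.close()

print(papm_connection_params("<database-host>", 443, "SAP_PAPM_ADMIN", "<pw>")["user"])
# → SAP_PAPM_ADMIN
```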
Note: The idea is that the data will be pushed directly into the Z_<ENV>_<FUNCTIONID> table of the Model Entity generated in the Universal Model, so make the fields (including the primary key) and data types consistent with the source table in Datasphere (in SAP BDC). Alternatively, you can replicate the source table into your preferred schema and access it via SAP PaPM Cloud – UM HANA Schema Connections.
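To illustrate the note above, here is a small Python sketch of the Z_<ENV>_<FUNCTIONID> naming convention and of the field-consistency check the note asks for. The environment ID ("DEMO"), function ID ("ME01"), and field definitions are made-up examples, not values from this guide.

```python
def z_table_name(env_id: str, function_id: str) -> str:
    """Build the name of the generated Model Entity table: Z_<ENV>_<FUNCTIONID>."""
    return f"Z_{env_id}_{function_id}"


def fields_consistent(source_fields: dict, target_fields: dict) -> bool:
    """Check that every target field exists in the source with the same data type.

    Both arguments map field names to data type strings,
    e.g. {"COMP_CODE": "NVARCHAR(4)"}.
    """
    return all(source_fields.get(name) == dtype
               for name, dtype in target_fields.items())


# Hypothetical environment "DEMO" and function "ME01":
print(z_table_name("DEMO", "ME01"))  # → Z_DEMO_ME01

source = {"COMP_CODE": "NVARCHAR(4)", "AMOUNT": "DECIMAL(23,2)"}
target = {"COMP_CODE": "NVARCHAR(4)", "AMOUNT": "DECIMAL(23,2)"}
print(fields_consistent(source, target))  # → True
```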
Now let’s start 😊
2. Once the data is prepared, create a replication flow in Datasphere (in SAP BDC).
a. Choose Data Builder.
b. Choose Create Replication Flow.
c. Choose Select Source Connection.
d. Choose SAP Datasphere.
e. Choose Add Source Objects.
f. Choose the local table created in the previous step.
g. Choose OK. The source object now appears on the left-hand side of the Replication Flow screen.
3. Select the target object from the SAP_PAPM_UM container.
a. Choose the button for selecting the target connection.
b. Select the connection to the Universal Model created earlier.
c. Choose the button for selecting the target container.
d. Choose the SAP_PAPM_UM container.
e. Choose the three dots.
f. Select Map to Existing Target Object.
g. Search for the function name or Z artifact into which the data from Datasphere (in SAP BDC) will be loaded, then choose the UM-generated artifact.
Note: The environment containing the Model Entity in UM must be activated for the Z artifact to appear in the list.
h. Choose Next, then on the next page choose Add Selection.
4. Map the fields.
a. Choose the target object so that the mapping option appears.
b. In the properties panel on the right, under Projection, choose the Add button.
c. Add a name for the mapping.
d. Choose the Mapping button.
e. Map the source and target fields.
f. Choose Save.
g. Deploy the replication flow.
h. Run the replication flow.
5. After the replication flow has completed, check the database artifact or use Show Data on the function into which the data was pushed. The data is now available in the Universal Model.
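To verify the push on the database side, a simple row-count query against the generated Z table does the job. Below is a sketch that builds such a statement; the schema matches the SAP_PAPM_UM container used above, while the environment and function IDs are illustrative. The resulting SQL can be run in any SAP HANA SQL console or via the hdbcli driver.

```python
def count_statement(schema: str, env_id: str, function_id: str) -> str:
    """Build a row-count query for the generated Z_<ENV>_<FUNCTIONID> table."""
    return f'SELECT COUNT(*) FROM "{schema}"."Z_{env_id}_{function_id}"'


# Hypothetical environment "DEMO" and function "ME01" in the UM schema:
print(count_statement("SAP_PAPM_UM", "DEMO", "ME01"))
# → SELECT COUNT(*) FROM "SAP_PAPM_UM"."Z_DEMO_ME01"
```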
1. After the calculation logic has run, the resulting data is available in the Universal Model.
2. Applying what we learned from the push scenario, create a replication flow in Datasphere (in SAP BDC), but this time with UM as the source and Datasphere (in SAP BDC) as the target. Deploy, then run.
Note: In this example, I replicated the UM table into Datasphere (in SAP BDC) by taking over the whole table with a small adjustment to the naming, so no mapping into an existing Datasphere (in SAP BDC) table took place. If a local table is available as the target, you can still map into it just as in the push scenario.
3. The data from the Universal Model is now available in Datasphere (in SAP BDC) for further use.
In conclusion, the integration scenario presented above illustrates one way of making SAP PaPM Cloud Universal Model data and Datasphere (in SAP BDC) data available for each other's use. As a disclaimer, I'm not a Datasphere (in SAP BDC) expert and there may be a more optimized way to do this, but as far as my investigation goes, this example is worth sharing 😊 I hope it inspires you to explore other integration options. You'll hear from me again soon!