Financial Management Blog Posts by SAP
Get financial management insights from blog posts by SAP experts. Find and share tips on how to increase efficiency, reduce risk, and optimize working capital.
Angelou
Product and Topic Expert

Friendly Reminder:

All users can follow this blog post, particularly those who are technically inclined and comfortable with a Command Line Interface.

The flexibility of SAP Profitability and Performance Management Cloud (SAP PaPM CLD) has been extended even further with the all-new Bring Your Own Database (BYOD) feature, which lets you use your existing Standalone SAP HANA Cloud Database (Standalone HCDB) or Datasphere HANA Cloud Runtime Database (DSP HCDB) as the database of your SAP PaPM CLD subscription.

Exposing your runtime database to SAP Datasphere is already achievable via Smart Data Access (SDA) and Smart Data Integration (SDI).

To learn more about this integration, see Integrating SAP PaPM Cloud & SAP Datasphere (DWC) and StepByStep SAP PaPM Cloud <> SAP PaPM On-Premise Integration.

Meanwhile, if you have an SAP PaPM Cloud subscription with your own database (BYOD) that uses your existing DSP HCDB, you no longer need to create a connection in SAP Datasphere, as your existing space already serves as your database.

With this BYOD setup alone, you can already create a graphical view manually from the exposed generated artifacts in the space where your Standard and Universal Model HDIs are mapped.

In addition to the flexibility given by the BYOD setup, you can also access the features of SAP Datasphere via the Command Line Interface. See Accessing SAP Datasphere via the Command Line.

Simply by knowing the names of the artifacts from SAP PaPM Cloud and running a simple command, the Command Line Interface can recreate a local table automatically in SAP Datasphere.

You can even save the commands locally and reuse them by changing only the names of the artifacts based on your requirements. This approach saves time and reduces the manual effort of creation via the UI.

To achieve this integration, we will need to download Node.js (which I will show later in the procedure) and sap-datasphere-exhibitor.

But before we proceed, what is sap-datasphere-exhibitor? It is designed to expose remote schema objects into the SAP Datasphere repository and to remotely create SAP Datasphere objects from specific artifacts in SAP PaPM Cloud via the Command Line Interface. For more information, see https://www.npmjs.com/package/sap-datasphere-exhibitor.

Now, let me show you how to achieve this integration setup.

Prerequisites:

  1. You have an existing BYOD subscription to SAP PaPM Cloud Standard OR Universal Model that uses the DSP HCDB. More information: Subscription.
  2. You have an environment on SAP PaPM Cloud Standard OR Universal Model that uses a processing function.

Procedure:

Installation of NodeJS and Authentication to SAP Datasphere

  1. Go to How to Install Node.js for the installation of Node.js:
  1. Choose Prebuilt Installer.
  2. Select the latest version and the OS you are using on your computer (in this example, version v21.7.3).
  3. Choose Download Node.js.
    Install Node JS.png

    2.  Open the Node.js command prompt.

    3.  Install the SAP Datasphere CLI by typing “npm install -g @sap/datasphere-cli”, then press Enter.
install datasphere cli.png

    4. Create an OAuth Client for use with the SAP Datasphere Command Line Interface; the OAuth Client's purpose must be Interactive Usage. For more information, see Create OAuth 2.0 Clients to Authenticate Against SAP Datasphere.

   5. In the Command Line Interface, use the OAuth Client you created to log in to Datasphere by typing “datasphere login”. More information: OAuth Interactive Usage.
Datasphere login image.png

SAP PaPM Cloud – Generating Artifacts

As mentioned above, Standard and Universal Model can generate artifacts from activated processing functions. The input of these processing functions can come from Local Model Tables / Entities or from an external source such as SAP Analytics Cloud (SAC).

Now, I will show you how to get these generated artifacts so that they can be replicated to SAP Datasphere via the Command Line Interface.

Let me first show you how it is done in Universal Model.

Standard Model will be added soon.

Universal Model

Modeling Setup

UM Modeling.png

  1. In Universal Model, I created a SAC Model connection to consume my existing Model from SAC, together with a reader function that fetches its data records into Universal Model.
  2. The consumed data records serve as input to our Calculation function, which contains a simple calculation.
  3. Activating the Environment is necessary so that the artifacts from our functions are generated in the underlying database of your SAP PaPM Cloud Universal Model.
  4. When everything is set up, you may proceed to run this function to calculate the data records from the Input function. The calculated data records will then be available in the Show Data screen of our Calculation function.

Getting the Artifacts from Universal Model

  1. Open the SAP HANA Cloud Database Explorer of the space you used as the database of your SAP PaPM Cloud subscription and log in as the _RT user. These credentials can be found in SAP PaPM Cloud SM > Hamburger Menu > Settings.
    Settings.png
  2. Follow the steps below to open the Views section where the generated artifacts are saved.
  1. Expand the Catalog folder.
  2. Choose Views.
  3. Search for the name of the artifact using the format Z_EXT_(ENVID)_(FID).
    Searching the Artifacts from UM.png
    Note: The schema where the artifacts are saved depends on the HDI Container you created with your Own Database subscription to SAP PaPM Cloud.

      3.  You can also confirm the content of these artifacts by following the instructions below:

  1. Choose the searched artifact.
  2. Choose Open Data.
    OpenData.png
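As a quick aid, the artifact name from the pattern above can be composed in the shell before searching; the environment and function IDs below are made up for illustration:

```shell
# Compose the generated-artifact name following the
# Z_EXT_(ENVID)_(FID) pattern. ENVID and FID are placeholders --
# substitute the IDs from your own environment and function.
ENVID="ENV1"
FID="F10"
ARTIFACT="Z_EXT_${ENVID}_${FID}"
echo "$ARTIFACT"
```

Searching for this exact name in the Views section narrows the list down to the artifact of the function you activated.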

    Now that we have identified the generated artifact from our Universal Model processing function, let’s go back to the Command Line Interface to communicate with SAP Datasphere.

Installing and running sap-datasphere-exhibitor

  1. Create a Database Analysis User in SAP Datasphere.
  2. Install sap-datasphere-exhibitor by typing “npm i sap-datasphere-exhibitor” in the Command Line Interface.
  3. Prepare a configuration JSON file with the SAP HANA Client connection options and put the Database Analysis user's connection details into it. Example:

{
    "host": "00000000-0000-0000-0000-000000000000.hana.prod-xx00.hanacloud.ondemand.com",
    "port": "443",
    "uid": "DWCDBUSER#TEST",
    "pwd": "Password123"
}

      4. Save it as a file, e.g. options.json, in the directory where you installed sap-datasphere-exhibitor.
      5. Run the datasphere-exhibitor command to remotely create a table from the SAP PaPM Cloud artifact in your SAP Datasphere, following this format:

datasphere-exhibitor remote-view-to-local-table --space SPACE_ID --schema CONTAINER_NAME --connection-file ./options.json --artifact XYZ
datasphere-exhibitor remote-view-to-fact-view --space SPACE_ID --schema CONTAINER_NAME --connection-file ./options.json --artifact XYZ --measures FIELD:SUM
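For steps 3 and 4, the connection file can also be written directly from the shell; every value below is a placeholder, not a real host or credential:

```shell
# Write the SAP HANA Client connection options for the Database
# Analysis user to options.json. All values are placeholders --
# substitute your own host, user, and password.
cat > options.json <<'EOF'
{
    "host": "00000000-0000-0000-0000-000000000000.hana.prod-xx00.hanacloud.ondemand.com",
    "port": "443",
    "uid": "DWCDBUSER#TEST",
    "pwd": "Password123"
}
EOF

# Optional sanity check that the file is valid JSON (needs python3):
if command -v python3 >/dev/null 2>&1; then
    python3 -m json.tool options.json >/dev/null && echo "options.json is valid JSON"
fi
```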

With a successful run of the datasphere-exhibitor command above, the details of the artifact are shown accordingly:
Created Datasphere object.png

It might also happen that the requirements of your environment grow and you need a new artifact from it.
Bigger Env.png

No worries: you can simply reuse the command from the previous steps, changing the artifact accordingly.
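One way to make that reuse even easier is a tiny wrapper script that takes the artifact name as a parameter and prints the full command for review before you run it; the space and schema names here are hypothetical:

```shell
# replicate_artifact.sh -- sketch of a reusable wrapper (all names
# are placeholders). Pass the PaPM artifact name as the first argument.
ARTIFACT="${1:-Z_EXT_ENV1_F10}"   # default used only for illustration
SPACE="MY_SPACE"                  # your SAP Datasphere space ID
SCHEMA="MY_HDI_CONTAINER"         # schema holding the generated artifacts

# Print the command instead of executing it, so the names can be
# checked before the actual run:
CMD="datasphere-exhibitor remote-view-to-local-table --space $SPACE --schema $SCHEMA --connection-file ./options.json --artifact $ARTIFACT"
echo "$CMD"
```

Once the printed command looks right, you can paste it into the Command Line Interface; for each new artifact, only the first argument changes.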

change command for CLI.png

After following the procedure above, you may now go to SAP Datasphere and proceed to configure the local table created from our generated artifact in SAP PaPM Cloud Universal Model.

created object in datasphere.png

You may also create a view on top of the created object so that it can be exposed to SAP Analytics Cloud.

Expose View to SAC.png

For more information, see Consume SAP Datasphere OData in SAP Analytics Cloud via an OData Service.

Once the view is exposed, you may use the data consumed from SAP Datasphere in the analytical reporting tools of SAP Analytics Cloud.

Processed Data for Annual computation.png

That’s all! I hope you learned about another integration that you can use in your analytical workflow, using SAP PaPM Cloud together with sap-datasphere-exhibitor.

See you again on the next integration topic!