Supply Chain Management Blogs by SAP
Expand your SAP SCM knowledge and stay informed about supply chain management technology and solutions with blog posts by SAP. Follow and stay connected.
Predictive Asset Insights (PAI) provides the capability to extend SAP Analytics Cloud (SAC) analytics based on master data as well as operational data. Live embedded SAC analytics enriches the analytical capabilities of PAI, giving end users better insights.

The Cloud Application Programming Model (CAP) accelerates the development of full-stack applications, specifically the development of artifacts that help integrate with SAC to provide live analytics.

PAI provides extensive documentation on the prerequisites and steps to enable Live analytics.

High Level Architecture

Embedded Live PAI analytics in 10 Steps

      1. Request to expose the analytics database as explained here.

      2. Create a service instance of the Master Data Analytics business service.

Master Data Analytics Service instance and key


      3. Create a database instance in your own sub-account as explained here.

      4. Establish remote connectivity to IAM analytics database.

    • Import SSL certificates as explained here

    • Add a remote source with the properties below, using the values from the service key of the Master Data Analytics instance.

Remote database connection
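The remote source shown above can also be created in SQL. A minimal sketch, assuming a HANA Cloud to HANA Cloud connection via the hanaodbc adapter; the remote source name and the host, port, user, and password placeholders (taken from the Master Data Analytics service key) are assumptions:

```sql
-- Hypothetical remote source pointing at the exposed analytics database.
-- Host, port, user and password come from the Master Data Analytics service key.
CREATE REMOTE SOURCE "IAM_ANALYTICS_RS" ADAPTER "hanaodbc"
  CONFIGURATION 'Driver=libodbcHDB.so;ServerNode=<host>:<port>;encrypt=true;sslValidateCertificate=true'
  WITH CREDENTIAL TYPE 'PASSWORD'
  USING 'user=<user from service key>;password=<password from service key>';
```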

             5. Access relevant remote tables and create virtual tables in your own schema.

Remote database artifacts

Select the relevant objects and create virtual tables in your own schema.
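For reference, a virtual table can also be created in SQL instead of through the UI; a sketch with placeholder names (the remote source name is a hypothetical one from the previous step):

```sql
-- Creates a virtual table in your own schema that proxies a remote table.
CREATE VIRTUAL TABLE "<your schema>"."VT_SAMPLE_INDICATORS"
  AT "IAM_ANALYTICS_RS"."<NULL>"."<remote schema>"."<remote table>";
```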

    6. Create a technical user with privileges to grant schema privileges to HDI container users.
CREATE USER <username> PASSWORD <your password> SET PARAMETER CLIENT = '001' SET USERGROUP DEFAULT;
ALTER USER <username> DISABLE PASSWORD LIFETIME;
GRANT SELECT ON SCHEMA <your schema> TO <username> WITH GRANT OPTION;
GRANT SELECT METADATA ON SCHEMA <your schema> TO <username> WITH GRANT OPTION;
CREATE ROLE <role name>;
GRANT SELECT, SELECT METADATA ON SCHEMA <your schema> TO <role name> WITH GRANT OPTION;
GRANT <role name> TO <username> WITH ADMIN OPTION;

    7. Develop HANA artifacts in Business Application studio by accessing the virtual artifacts.

    • Start by creating a dev space in Business Application Studio (BAS) for full-stack development.

    • Then create a project from the provided templates: choose CAP Project and follow the guided UI, entering the necessary project details.

CAP project

You can choose between two runtimes, Node.js or Java, if you also intend to expose data via a service.

The Yeoman generator will create some initial artifacts.

  • Adjust CAP project for HANA development

    • Install hana-cli

    • Replace gen/* in the path parameter and point it to your project folders.

    • Adjust package.json

"cds": {
  "build": {
    "target": "."
  },
  "hana": {
    "deploy-format": "hdbtable"
  },
  "requires": {
    "db": {
      "kind": "hana"
    }
  }
}

For more details, refer here.

  • Create your HDI service instance and service key to work with BAS and for further deployment. This will also create a default-env.json, which helps with local development.

cf create-service hana hdi-shared <your service instance name>
cf create-service-key <your service instance name> default
hana-cli servicekey <your service instance name> default

  • Add additional scripts to your package.json in the root folder. These will copy your local environment variables to each module in your project; in the example below, to the db and srv modules.

"scripts": {
  "start": "cds run",
  "env": "cp ./default-env.json ./db/default-env.json && cp ./default-env.json ./srv/default-env.json",
  "build": "cds build/all --clean"
}

Now we have our environment set up. We need to create database artifacts (calculation views) using the virtual tables created earlier. However, these virtual tables live in a specific schema, so we need to access the artifacts across schemas.

Cross schema artifact access

To achieve this, we need a technical user, which we created in Step 6.

  • Create a .hdbgrants file under the cfg folder in the db module of your project.

{
  "<service name>": {
    "object_owner": {
      "roles": ["<role name created in Step 6>"]
    },
    "application_user": {
      "roles": ["<role name created in Step 6>"]
    }
  }
}

  • Create a user provided service using the user created in Step 6.

cf cups CROSS_SCHEMA_SERVICE -p synon.json

synon.json as follows:

{
  "driver": "",
  "password": "<password for technical user>",
  "schema": "<your schema which has virtual tables>",
  "tags": ["hana"],
  "user": "<technical user>"
}

  • Add the following to VCAP_SERVICES of your default-env.json

 "user-provided": [
   {
     "binding_name": null,
     "credentials": {
       "driver": "",
       "password": "<password for technical user>",
       "schema": "<your schema which holds virtual tables>",
       "tags": ["hana"],
       "user": "<technical user>"
     },
     "instance_name": "CROSS_SCHEMA_SERVICE",
     "label": "user-provided",
     "syslog_drain_url": "",
     "tags": [],
     "volume_mounts": []
   }
 ]

  • Add the following Service replacements as well to default-env.json
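If the screenshot for this step does not render: @sap/hdi-deploy reads a SERVICE_REPLACEMENTS entry from default-env.json, mapping the key used in the project to the bound service instance. A sketch, assuming the key name matches the one used in the mta:

```json
"SERVICE_REPLACEMENTS": [
  {
    "key": "iam_cross_schema",
    "service": "CROSS_SCHEMA_SERVICE"
  }
]
```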


  • Update your mta to reflect the same dependencies

modules:
  - name: hdi-db-deployer
    type: hdb
    path: db
    requires:
      - name: hdi-db
        properties:
          TARGET_CONTAINER: '~{hdi-service-name}'
      - name: cross-container-service-1
        group: SERVICE_REPLACEMENTS
        properties:
          key: hdi-service-name
          service: '~{hdi-service-name}'
      - name: iam-analytics-cross-schema
        group: SERVICE_REPLACEMENTS
        properties:
          key: iam_cross_schema
          service: '~{iam_cross_schema}'

resources:
  - name: hdi-db
    type: com.sap.xs.hdi-container
    parameters:
      service: hana
      service-plan: hdi-shared
    properties:
      hdi-service-name: '${service-name}'
  - name: iam-analytics-cross-schema
    type: org.cloudfoundry.existing-service
    properties:
      iam_cross_schema: '${service-name}'
  - name: cross-container-service-1
    type: org.cloudfoundry.existing-service
    parameters:
      service-name: hdi-db
    properties:
      hdi-service-name: '${service-name}'

**NOTE:** The mta above is not complete; it is a subset highlighting the HDI service relationships and dependencies.

Once all the dependencies are updated, run:

cds build

You can deploy the artifacts either by running npm start in the db module or by using the BAS UI to bind the database connection:

Now create synonyms (.hdbsynonym file) to access the virtual tables. You will need to bind the cross-schema service to your database project to be able to access the objects.
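A minimal .hdbsynonym sketch for one virtual table; the synonym, object, and schema names are placeholders:

```json
{
  "VT_SAMPLE_INDICATORS_SYN": {
    "target": {
      "object": "<virtual table name>",
      "schema": "<your schema which holds virtual tables>"
    }
  }
}
```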

Create calculation views (of type CUBE), either as plain projections of the tables (via synonyms) or by creating scripted table functions and adding them to the views. Deploy the artifacts as before.
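As an illustration of the table-function option, a sketch of a SQLScript table function (.hdbfunction) that selects from a synonym and can be consumed by a calculation view; all names and the column shape are hypothetical:

```sql
-- Hypothetical table function over a synonym on a virtual table.
FUNCTION "SAMPLE_INDICATORS_TF" ()
  RETURNS TABLE (EQUIPMENT_ID NVARCHAR(50), INDICATOR_VALUE DOUBLE)
  LANGUAGE SQLSCRIPT
  SQL SECURITY INVOKER AS
BEGIN
  RETURN SELECT "EQUIPMENT_ID", "INDICATOR_VALUE"
    FROM "VT_SAMPLE_INDICATORS_SYN";
END;
```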

             8.  Add a service layer to integrate with applications (optional).

When we created the CAP project, it also created an srv module. We will create a cds artifact in the db module and then a service in the srv module to expose the cds view as an OData service.

  • Use hana-cli to generate the cds entity.

            hana-cli inspectView -v SampleIndicators -o cds

Sample Entity output

  • Create a .cds file in the db module and add the generated code. Make sure to add the following annotations too.
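If the annotation screenshot does not render, the pattern looks roughly like this; the entity shape is a placeholder, and the two annotations tell CAP that the calculation view already exists in the database and should not be generated:

```cds
@cds.persistence.exists
@cds.persistence.calcview
entity SampleIndicators {
  key EQUIPMENT_ID    : String(50);
      INDICATOR_VALUE : Double;
}
```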


  • Create another .cds file in the srv module and create a service to expose the created cds view.

using Equipment from '../db/data-model';
using SampleIndicators from '../db/data-model';

service IamAnalyticsService {
  @readonly entity EquipmentList as projection on Equipment;
  @readonly entity SampleIndicatorList as projection on SampleIndicators;
}

Run cds build, then npm start in the db module, or deploy from the HANA Projects view as shown above.

Run npm start on root. This will start the service on a CDS server.

For more robust routing and authentication you can also add either a standalone or managed approuter module / configuration.
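For the standalone approuter variant, the route configuration is a small xs-app.json; a sketch assuming a destination named srv-api pointing at the CAP service (destination name and route pattern are assumptions):

```json
{
  "authenticationMethod": "route",
  "routes": [
    {
      "source": "^/odata/(.*)$",
      "target": "$1",
      "destination": "srv-api",
      "authenticationType": "xsuaa"
    }
  ]
}
```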

        9. Create HANA Live connection with your SAC tenant.

In order to create the connection, you can either create a new service key or use an existing one from the HDI container service instance.

HANA Live connection in SAC

                 10. Create Live Models and build dashboards.

Exemplary Live dashboards

Now the dashboards / stories can be embedded into PAI to provide a unified user experience as explained here.


With the above approach, extending PAI analytics with live embedded capabilities can be accelerated significantly, providing quick turnaround from an implementation perspective and, as a consequence, enriched real-time insights for customers and asset operators.

Also refer:

Creating a calculation view and exposing it via CAP

HANA Cloud and CAP

PAI Analytics database

PAI Analytics blog