Financial Management Blogs by SAP
Get financial management insights from blog posts by SAP experts. Find and share tips on how to increase efficiency, reduce risk, and optimize working capital.

If you have seen the previous SAP Profitability and Performance Management Cloud (SAP PaPM Cloud) integration blog posts and found them interesting, then this blog post might be your cup of tea too!

Google BigQuery is a cloud data warehouse used to store data records. In this blog post, we will discuss how tables from Google BigQuery can be consumed and used as part of calculations in SAP PaPM Cloud.


One (1) table is created in Google BigQuery, connected via a BigQuery (REST API) remote source, and its data records are consumed by saving a virtual table to an existing schema.

To give an overview of this scenario, see the diagram below:



  1. Create a table in Google BigQuery
    More information on this link Create and Use tables

  2. Service Account and Private Key
    More information on this link Create Service Account and Private Key
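
Since the BigQuery console screenshots are not reproduced here, a rough sketch of what the table-creation prerequisite amounts to may help. The project, dataset, table, and column names below are illustrative assumptions, not the ones used in this blog post:

```python
# Illustrative only: project, dataset, table, and column names are assumptions.
columns = [
    ("EMPLOYEE_ID", "INT64"),
    ("EMPLOYEE_NAME", "STRING"),
    ("DEPARTMENT", "STRING"),
]

# Build a BigQuery Standard SQL CREATE TABLE statement from the column list.
column_defs = ",\n  ".join(f"{name} {dtype}" for name, dtype in columns)
ddl = (
    "CREATE TABLE `my_project.my_dataset.EMPLOYEE_DETAILS` (\n"
    f"  {column_defs}\n"
    ")"
)
print(ddl)
```

Running a statement of this shape in the BigQuery console (or via the bq CLI) would create the table; the actual table in this blog post may have different columns.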


Step 1 - Create the Service Account and generate a private key for the required credentials in Google BigQuery.

1a. Service Account:
The adapter for the SAP HANA Cloud remote source is BigQuery (REST API). With the Technical User credentials mode, the service account name is one of the required credentials when creating the remote source.

More information on this link Create a service account.

1b. Service Private Key:

The private key for the service account can be downloaded in JSON format. It is also required as part of the credentials when creating the remote source.


More information on this link Create and manage service account keys 

Step 2 - Create the Remote Source

In this blog post, we will use a remote source as the connection between SAP PaPM Cloud and Google BigQuery via the BigQuery (REST API) adapter.

2a. Add the Remote Source

2b. Enter your desired source name.
2c. Choose BigQuery (REST API) as the adapter name.
2d. Keep the default source location (IndexServer), as it is the default for SDA adapters.
2e. Keep the default server settings.
2f. Choose Technical User as the credentials mode.

2g. Fill in the account name with the client_email value from the downloaded JSON file.
2h. Fill in the private key with the private_key value from the downloaded JSON file. (Copy everything inside the quotation marks.)
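
To make steps 2g and 2h concrete, here is a minimal Python sketch of where the two required values sit inside the downloaded key file. The file contents below are a shortened, illustrative stand-in, not a real key:

```python
import json

# Illustrative stand-in for the service-account key file downloaded from
# Google Cloud (real files contain a full RSA private key and more fields).
key_file_contents = """{
  "type": "service_account",
  "project_id": "my-project",
  "private_key_id": "abc123",
  "private_key": "-----BEGIN PRIVATE KEY-----\\nMIIEv...\\n-----END PRIVATE KEY-----\\n",
  "client_email": "papm-demo@my-project.iam.gserviceaccount.com"
}"""

key = json.loads(key_file_contents)

# Step 2g: the remote source's account name is the client_email value.
account_name = key["client_email"]

# Step 2h: the private key is everything inside the quotation marks,
# including the BEGIN/END markers.
private_key = key["private_key"]
```

Note that json.loads converts the \n escape sequences in the private_key value into real newlines; when copying the value manually into the remote source dialog, take everything between the quotation marks as-is, including the \n sequences.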

Step 3 - Create the Virtual Objects 

The connected remote source allows you to create virtual objects for existing tables from Google BigQuery.

3a. Choose the table.
3b. Choose Create Virtual Objects

3c. Enter your preferred name.
3d. Press Create

After creating the virtual object, it should appear under the target schema and the catalog tables.

The table is now successfully consumed. You can open the created virtual object and check its data records.

Employee Details table (in SAP HANA Cloud):

Consumed table from Google BigQuery:

Step 4 - Create a connection using the virtual object created in the previous steps

The connection is created with the schema and the HANA table used in the previous step.

More information on this link How to Create a Connection

Step 5 - Utilize the connection with a Model Table HANA

To use the connection created in the previous step, users need to create a Model Table HANA.


5a. Create a new Model Table HANA
More information on this link General Procedure on how to Create a Function

5b. Choose the connection created in the previous step.
5c. Map all corresponding fields
More information on this link Procedure for Model Table HANA

You can also check in the Show Screen that the data records were successfully consumed in the SAP PaPM Cloud modeling perspective.

Employee Details table:

Step 6 - Combine the consumed table from Google BigQuery with a local Model Table from SAP PaPM Cloud (Optional)

6a. First, let’s create a local Model Table in SAP PaPM Cloud:

More information on this link Procedure for Model Table 

Salary Table:

6b. Use both Model Tables as input for join rules (let’s use the JOIN function for this example).

More information on this link How to create a JOIN function

6c. With the successful activation and execution of the JOIN function, our records are combined using the inner join rule. The results in the Show Screen are as follows:

Note: Users can also use different Calculation and Processing functions, depending on the use case.
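
For readers who want to see what the inner join in step 6 does to the data, here is a minimal Python sketch with illustrative records; the column names and values are assumptions, not the ones from the screenshots:

```python
# Illustrative data only: column names and values are assumptions.
employee_details = [  # consumed from Google BigQuery via the virtual table
    {"EMPLOYEE_ID": 1, "EMPLOYEE_NAME": "Alice", "DEPARTMENT": "Finance"},
    {"EMPLOYEE_ID": 2, "EMPLOYEE_NAME": "Bob", "DEPARTMENT": "IT"},
    {"EMPLOYEE_ID": 3, "EMPLOYEE_NAME": "Carol", "DEPARTMENT": "HR"},
]
salaries = [  # local SAP PaPM Cloud Model Table
    {"EMPLOYEE_ID": 1, "SALARY": 50000},
    {"EMPLOYEE_ID": 2, "SALARY": 60000},
    # no salary row for EMPLOYEE_ID 3, so the inner join drops that employee
]

# Inner join on EMPLOYEE_ID: keep only records whose key appears in both tables.
salary_by_id = {row["EMPLOYEE_ID"]: row["SALARY"] for row in salaries}
joined = [
    {**emp, "SALARY": salary_by_id[emp["EMPLOYEE_ID"]]}
    for emp in employee_details
    if emp["EMPLOYEE_ID"] in salary_by_id
]
print(joined)
```

As the comment notes, the record without a matching key on both sides is excluded, which is exactly the behavior of the inner join rule.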

That’s all! We all know that integration is a broad topic, but I hope this simple blog post has shown that Google BigQuery data records can be used as an input or data source for complex calculations, rules, and simulations in SAP PaPM Cloud.

Have a good day and see you on the next blog post via SAP PaPM Cloud Community Tag.