There are situations where one might need to run backend calculations from a story in SAP Analytics Cloud (SAC), for example to execute complex logic or calculations on data or parameters that have just been updated.
In this post I will show how to use Multi-Actions in SAP Analytics Cloud to call a stored procedure residing in SAP HANA Cloud or SAP Datasphere (DSP).
For this purpose I will build on the recently published blog about how to create an API in Cloud Foundry to call Stored Procedures in SAP HANA Cloud, using the SAP Clo.... In that post I created a stored procedure that adds a record to a CALL_HISTORY table with the current timestamp and some additional information. The same procedure will be used here, but deployed in SAP Datasphere. I will use Multi-Actions in an SAP Analytics Cloud story to call that API, and a view will be created in SAP DSP to display the content of CALL_HISTORY in the same SAC story.
The HANA stored procedure will be deployed in an HDI container that can reside in SAP HANA Cloud or within SAP Datasphere. This container will hold the HANA native artefacts: the table and the procedures.
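As a rough sketch of what these design-time artefacts could look like (the column list and the VALUE column are assumptions based on the description in this post, not the exact objects from the referenced blog):

```sql
-- call_history.hdbtable: target table for the procedure (assumed columns)
COLUMN TABLE "CALL_HISTORY" (
    "TIMESTAMP" TIMESTAMP,
    "COMMENT"   NVARCHAR(256),
    "VALUE"     DECIMAL(15,2)
);

-- add_call_history.hdbprocedure: inserts a record with the current timestamp
PROCEDURE "ADD_CALL_HISTORY" (IN im_comment NVARCHAR(256))
    LANGUAGE SQLSCRIPT
    SQL SECURITY INVOKER AS
BEGIN
    INSERT INTO "CALL_HISTORY" ("TIMESTAMP", "COMMENT", "VALUE")
        VALUES (CURRENT_UTCTIMESTAMP, :im_comment, RAND() * 100);
END;
```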
On top of that, the SAP Cloud Application Programming (CAP) model will be used to provide the API service as well as user authentication in Cloud Foundry.
In the next sections I will use the name DWC_API_Service_Test for the application and services. It is exactly the same as the HANA_API_Service_Test created in that blog, but it points to a HANA Cloud instance within SAP Datasphere, which requires the following steps.
The BTP space needs to be connected to the SAP DWC tenant.
The HDI container needs to be deployed in the HANA Cloud instance of the SAP DWC tenant.
The HDI content (table and stored procedures) needs to be made accessible from SAP DSP by adding roles to the project. These roles should grant the SELECT and SELECT METADATA privileges to read the table, and the EXECUTE privilege to run the stored procedures.
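A design-time role granting those privileges could look like the following sketch (the role and object names are assumptions; the schema-level entry applies to the container's own schema):

```json
{
  "role": {
    "name": "CallHistoryAccess",
    "schema_privileges": [
      { "privileges": ["SELECT METADATA"] }
    ],
    "object_privileges": [
      { "name": "CALL_HISTORY", "type": "TABLE", "privileges": ["SELECT"] },
      { "name": "ADD_CALL_HISTORY", "type": "PROCEDURE", "privileges": ["EXECUTE"] }
    ]
  }
}
```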
In the API URL field, the application URL is entered with /catalog/ appended.
All parameters should be in the body:
"comment": "Calling from SAC",
For the execution results, I selected “Synchronous Return” and saved the step.
SAP Analytics Cloud - Multi-Action API Step
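Outside of SAC, the same API step can be reproduced with a plain HTTP POST. The sketch below only builds the request without sending it; the host placeholder and the action name addCallHistory are assumptions for illustration, and a real call additionally requires an OAuth token from the authentication service:

```python
import json
import urllib.request

# Placeholder URL: the application URL with /catalog/ and the action appended
# (both the host and "addCallHistory" are assumptions for illustration).
API_URL = "https://<application-url>/catalog/addCallHistory"

# All parameters go in the request body, as configured in the multi-action step.
body = {"comment": "Calling from SAC"}

request = urllib.request.Request(
    API_URL,
    data=json.dumps(body).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)

# urllib.request.urlopen(request) would execute the stored procedure;
# the multi-action performs the equivalent call and handles authentication.
```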
View in SAP Datasphere
In order to access the table CALL_HISTORY, the HDI container needs to be added to the SAP DSP space:
SAP Datasphere - Space Management
Once the HDI container is added, a graphical view can be created in the Data Builder. When creating a view, the table should be visible under Sources. I created an analytical view with VALUE as a measure in order to consume it from SAP Analytics Cloud.
SAP Datasphere - New Graphical View
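In SQL terms, the graphical view amounts to a simple projection over the exposed table; a sketch (the column names are assumptions based on the table described earlier) could be:

```sql
-- View_CALL_HISTORY: exposes the HDI table for consumption from SAC,
-- with VALUE modelled as a measure in the analytical view
CREATE VIEW "View_CALL_HISTORY" AS
    SELECT "TIMESTAMP", "COMMENT", "VALUE"
    FROM "CALL_HISTORY";
```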
After its deployment, it should be accessible from SAP Analytics Cloud.
Story in SAP Analytics Cloud
In SAP Analytics Cloud, I am going to create a basic story that connects to SAP Datasphere. After creating the new canvas, I added data from a data source, in this case live data from SAP Datasphere.
SAP Analytics Cloud - Data Source
In this example I just created a table to display the data, plus the Multi-Action trigger. The Multi-Action appears in the Insert options as a Planning Trigger.
SAP Analytics Cloud - Insert Multi-Action
In the Multi-Action settings I selected the Multi-Action name previously defined.
Then I added a table to display the View_CALL_HISTORY from SAP DSP.
SAP Analytics Cloud Story - Table configuration
Now, by clicking on the Multi-Action control, the API will be triggered. Once its execution has finished, the data can be refreshed by clicking the Refresh button in the toolbar.
SAP Analytics Cloud story - Refresh data
This blog post showcases an example of how to trigger the execution of a HANA stored procedure via an API from SAP Analytics Cloud. The stored procedure can reside in SAP Datasphere as well as in a standalone SAP HANA Cloud instance.
In this way, business users can execute backend logic directly from a report, which opens the door to more complex use cases. For example, Monte Carlo simulations leveraging the SAP HANA Predictive Analysis Library (PAL) could be triggered directly from SAP Analytics Cloud.
I want to thank ian.henry, maria.tzatsou2 and nekvas75 for all the discussions, shared experiences and support in relation to this blog post!