Technology Blogs by SAP
Learn how to extend and personalize SAP applications. Follow the SAP technology blog for insights into SAP BTP, ABAP, SAP Analytics Cloud, SAP HANA, and more.
A recent innovation introduced into SCP Internet of Things for Cloud Foundry allows you to connect a HANA cloud database to your tenant by using the Bring Your Own Database (BYOD) feature.

In simple words, it consists in defining a custom Processing Service in your tenant of your SCP Internet of Things instance that forwards the ingested device data to an external SQL database.

The existing configurations currently support HANA.

In this blog post and on this YouTube channel, Philip has already explained how to connect a HANA container (hdi-container) to a tenant of SAC.

In recent weeks, my colleague gconte91 and I decided to extend Philip’s tutorial in order to use IoT data ingested into a “classical” HANA database on Cloud Foundry from a connected SAC.

This is the connection Schema




There are several requirements to satisfy if you would like to use your IoT data in SAP Analytics Cloud: some software that runs on your local machine and some services in your Cloud Platform account.

Let’s start with your local machine.

  1. To complete the setup, you need to invoke some commands from the command line or terminal against your SAP Cloud Platform Cloud Foundry account by using the Cloud Foundry CLI. You can easily download it from the official download page.

  2. Since in this integration we are planning to deploy a multi-target application on Cloud Foundry, you will also need to download the Cloud Foundry CLI MTA plugin.

  3. The application will be built on your local machine, so you need to download the Multi-Target Application (MTA) Archive Builder. Select the jar file.

  4. In the HANA section of the same website you can download the Analytics Adapter.

  5. To be able to build the MTA application, a Java JDK 8 is required on your machine, for example, SAP JDK or any other commercial JDK.

  6. A Maven installation is also required for the same reason (currently the stable version is 3.6.0).

  7. If you are using Windows, install Git for Windows, or follow the Windows Subsystem for Linux Installation Guide for Windows 10.

  8. The Cloud Foundry application currently uses Node.js and npm, so you need to download and install them.

  9. After the installation of npm, ensure that it is pointing to the SAP registry: open a terminal window and configure it with the following command (followed by the SAP registry URL)
    npm config set @sap:registry

  10. Let’s assume that you have already configured your Cloud Platform account with one subaccount of type Cloud Foundry and another of type Neo.

  11. In the Cloud Foundry subaccount you should have already configured a space and deployed an instance of SAP Cloud Platform Internet of Things and an instance of HANA as a service.

  12. In the Neo subaccount, an instance of Web IDE Full-Stack should already be configured to reach the Cloud Foundry subaccount services.

  13. In addition, you need a SAML 2.0 Identity Provider (IdP), which can be from SAP or a 3rd party. In the current example, we are using SAP IAS (Identity Authentication Service), also known as an SCI tenant.

  14. Finally, you will need an instance of SAC with System_Owner role access.


Configure HANA users, grants and BYOD


In this step, we are going to connect HANA to your instance of SCP Internet of Things. It’s good practice not to use the SYSTEM user, so we are creating some technical users and roles that will be used by the different components.

Open the HANA console and click on Open the SQL console

In the SQL console create some new users and roles:
CREATE ROLE "IOTDATA::external_access_g";
CREATE ROLE "IOTDATA::external_access";
GRANT "IOTDATA::external_access_g", "IOTDATA::external_access" TO IOTUSER_GRANTOR WITH ADMIN OPTION;

We are creating IOTUSER, the user used by the IoT Service; it automatically gets a schema named IOTUSER. We also create a grantor user (and the roles) required to grant external access to the hdi-container that will be generated in the next steps.
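The CREATE USER statements themselves are not shown above; a minimal sketch, assuming the user names from this post and placeholder passwords (adapt them to your password policy), could be:

```sql
-- Placeholder passwords: replace with values matching your password policy
CREATE USER IOTUSER PASSWORD "Replace1Me" NO FORCE_FIRST_PASSWORD_CHANGE;
CREATE USER IOTUSER_GRANTOR PASSWORD "Replace1Me" NO FORCE_FIRST_PASSWORD_CHANGE;
```

Run these before the GRANT statement above, since the grant references IOTUSER_GRANTOR.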

In the Messages list, a confirmation message informs you about the success of the operation.

Verify that your HANA instance can be reached by the IoT Service: check the IP Whitelist setting, which should look like the example.

Now it’s possible to configure a new SQL Processing Service in SAP Cloud Platform Internet of Things, as described in the official documentation

Get the required information with the Cloud Foundry command line

C:\>cf api
Setting api endpoint to

api endpoint:
api version: 2.136.0
Not logged in. Use 'cf login' to log in.

C:\>cf login
API endpoint:



Select an org (or press enter to skip):
1. 5d4eb5da-e390-49cd-b10f-8257eb559ec2_iot-iottrainingdev-709f
2. iotpm
3. IoT PM_iotpmdemosenv
4. SAP Leonardo IoT Demo PM_iotdemo

Org> 3
Targeted org IoT PM_iotpmdemosenv

Targeted space demos

API endpoint: (API version: 2.136.0)
Org: IoT PM_iotpmdemosenv
Space: demos

C:\>cf s
Getting services in org IoT PM_iotpmdemosenv / space demos as

name service plan bound apps last operation broker upgrade available
demos portal standard create succeeded portal
hana128 hana-db 128standard create succeeded dbaas-broker
iotpm iot standard create succeeded iot-broker
iotpmae iotae standard create succeeded iotae-broker

C:\>cf create-service-key hana128 keys
Creating service key keys for service instance hana128 as

C:\>cf service-key hana128 keys
Getting key keys for service instance hana128 as

"certificate": "certificate_here",
"driver": "",
"host": "",
"port": "23303",
"url": "jdbc:sap://\u0026validateCertificate=true"


In the last part of the key, you have the url connection string.

Remove the last part (the query string):


and paste it into the form to create a new Configuration of the Processing Service. Use the IOTUSER created in the step before for the credentials.

The address in this example will be
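The trimming of the url can be sketched as follows; the host and port here are hypothetical, use the ones returned by your own service key:

```shell
# Hypothetical host and port: the service key returns a JDBC URL similar to this
URL='jdbc:sap://somehost.example:23303?encrypt=true&validateCertificate=true'
# Drop everything from the first '?' onward to obtain the plain address for the form
echo "${URL%%\?*}"
# -> jdbc:sap://somehost.example:23303
```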


Once finished, create the Selector

Inside the Configurations box, select the configuration created in the previous step.

The incoming data is now forwarded to your instance of HANA.

You can start ingesting data from your devices into the current tenant.

If you would like to verify that the operation completed correctly, open the SQL console with the credentials of IOTUSER.

Open the database, then go to Tables and, if at least one device has already started to send data, check that some tables exist (one for each capability of the IoT Service) and that they contain data.

Open the SQL console to give the grants and permit the external access for the newly created schema

Create a user-provided service to connect to remote data that is not in the hdi-container, using the Cloud Foundry CLI:
cf cups iotuser-schema -p "{\"user\":\"IOTUSER_GRANTOR\", \"password\":\"Welcome1\", \"tags\":[\"hana\"], \"schema\":\"IOTUSER\"}"
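For reference, the unescaped JSON payload passed to cf cups is:

```json
{
  "user": "IOTUSER_GRANTOR",
  "password": "Welcome1",
  "tags": ["hana"],
  "schema": "IOTUSER"
}
```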

The OK message returned by the CLI confirms the success of the operation. You can also verify it from your SAP Cloud Platform cockpit under Applications.


Build your hdi-container


In this step, we are going to build a new container (hdi-container) to be used by the HANA Analytics Adapter, and a calculation view to provide data to SAP Analytics Cloud.

The build and the deployment of the container are done within Web IDE Full-Stack, but they could also be done from the command line.

Open Web IDE and install the SAP HANA Database Development Tools: from the Tools menu, click Preferences and go to Extensions.

Update also the Cloud Foundry API endpoint and space to permit the deployment of the container on your Cloud Foundry account in the right destination.

Save and close this window to go back to the workspace and create a new Project from Template.

In the next windows create a new Multi-Target Application for the Cloud Foundry Environment

In the next step type a name for the project and complete the wizard.

A new project has been created in your workspace; right-click it to create a new HDB module for the project and type a name for it.

Then complete the wizard with the default settings.

The first operation is to update the mta.yaml file; add the missing parameters and requirements according to the following template (you don’t have to replace the content inside the curly brackets)
ID: iot2sac_app
_schema-version: '2.1'
version: 0.0.1

modules:
  - name: iot2hana_db
    type: hdb
    path: iot2hana_db
    requires:
      - name: hdi_iot2hana_db
        properties:
          TARGET_CONTAINER: '~{hdi-container-name}'
      - name: iotuser-schema

resources:
  - name: hdi_iot2hana_db
    type: com.sap.xs.hdi-container
    properties:
      hdi-container-name: ${service-name}
  - name: iotuser-schema
    type: org.cloudfoundry.existing-service
    parameters:
      service-name: iotuser-schema

It’s time to build the created container. The build creates a test hdi-container in your space.

If there is no error in the console logs, the test (modeling) application has been deployed successfully.

If you have just one HANA instance in the space, the connection between the hdi-container and HANA is made automatically; otherwise, you have to specify the GUID of the database in the mta.yaml.
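A sketch of the resource entry with an explicit database binding follows; the database_id parameter comes from the standard hdi-container resource configuration, and the GUID shown is a placeholder for the one of your HANA instance:

```yaml
resources:
  - name: hdi_iot2hana_db
    type: com.sap.xs.hdi-container
    parameters:
      config:
        database_id: aaaabbbb-cccc-dddd-eeee-ffff00001111  # placeholder GUID
```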


Create a new file (with the extension .hdbgrants) to connect to the user-provided service (iotuser-schema) and assign the roles to the technical users of the hdi-container.
{
  "iotuser-schema": {
    "object_owner": {
      "roles": ["IOTDATA::external_access_g"]
    },
    "application_user": {
      "roles": ["IOTDATA::external_access"]
    }
  }
}

Make a build of the single file to update the modeling/test application

If no errors are logged in the console, the operation was successful (you should also see in the log something similar to Using service "iotuser-schema" of type "sql").

Create a new file named iotdata.hdbsynonym in the source folder to define the synonym, and fill it within the code editor. Don’t use the graphical form to set up the content of the file.
{
  "iot2sac_app.iot2hana_db::iotdata": {
    "target": {
      "object": "MEASURES_DCB554E3-A817-4FDC-A2DA-554BB3C1437B",
      "schema": "IOTUSER"
    }
  }
}

Build the newly created single file to deploy the synonym in test mode

No error in the console confirms that the operation is successful.

So far we have created the user grant and the synonym that allow the hdi-container to access a classical HANA table (the one created by the Internet of Things Service that contains all the measurements). To check it, right-click on the **_db folder, the parent folder containing src, and press Open HDI Container.

In the next window click on Synonym of the test container and press Open Data with the right click of the mouse.

You should be able to see, inside the hdi-container, the IoT measurements forwarded to HANA by the custom data processor of Internet of Things.

We have verified that the data is currently inside the hdi-container.

In the current version of SAC, the analytics is made through a Live connection by using HANA calculation views, so we need to create a new calculation view in this project and container.

Right click on src then New and finally Calculation View, to create a new Calculation View

In the next window just provide a name and leave the other properties at their default values, as in the following image.

Open the newly created view and press the plus button that appears on the right of the box when you select the Aggregation box.

In the modal window, do not select any External service in the dropdown list and search for the synonym with **; then select the element(s) of the list and press Finish.

In the next step, we need to define the properties used for the aggregation: double-click on the Aggregation box and complete the matching in the window that appears.

Once completed, we can set up the Semantics: double-click on its box and select the measures involved in the aggregation and the type of aggregation that needs to be generated.

The calculation view is now ready to be deployed in test mode: build the single file of the Calculation View

After the build is completed we can check that the aggregation is built correctly: click the database visualization icon in the blue bar on the left of your window, then go to Column Views.

In the list of the column views, right-click on the calculation view (iotdatacv) and press Open Data.

In the opened window, select the measure(s) that have been aggregated to verify that the aggregation is computed correctly; in this example it’s C_TEMPERATURE.

The development of the hdi-container is now complete; right-click on the project tree and build the entire project to generate the mtar file that we are going to deploy to Cloud Foundry.

Right click on the mtar file then click on Deploy and finally on Deploy to SAP Cloud Platform, to deploy it in non-test mode.

A notification will inform you that the hdi-container has been deployed successfully.


HANA Analytics Adapter deployment


The integration between HANA and SAC is made through the official HANA Analytics Adapter. It’s publicly available on GitHub:

Download it as a zip file, extract it into a folder on your local PC and put the mta_archive_builder-1.1.19.jar file, downloaded in the requirements section, in the root folder of the HAA project.

Uncompress the Analytics Adapter archive (another file downloaded in the requirements section) and copy the file xsahaa-1.5.6/java-xsahaa.war into the folder <haa project folder name>/haa-java/target.

Before deploying the adapter we need to edit mta.yaml to configure the deployed services.

Replace the <hdi-container> string with the name of the hdi-container you deployed to Cloud Platform with Web IDE in the previous step. You can find its name in the SAP Cloud Platform Cockpit under Applications.

Replace also <sac-host> with your SAC address.

In case you don’t want your adapter to be multitenant-enabled, remove the following lines from the yaml file:

  • SAP_JWT_TRUST_ACL: '[{"clientid":"sb-haa-java", "identityzone": "*"}]'

  • TENANT_HOST_PATTERN: '^(.*)-<space>-haa.cfapps.(.*)'

In case you are not using multitenancy, it’s also required to modify the file xs-security.json: change the value of tenant-mode to dedicated.
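The relevant fragment of xs-security.json would then look like the following; the xsappname shown here is illustrative, keep the one already present in the file:

```json
{
  "xsappname": "haa",
  "tenant-mode": "dedicated"
}
```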

The configuration is now complete, and we can build it in a new bash window with the command
java -jar mta_archive_builder-1.1.19.jar --build-target=CF build

When the build is completed, it’s time to deploy it to Cloud Foundry with the command
cf deploy <haa project folder name>.mtar

that for our example is
cf deploy hana.mtar


Create trust between CF and SAC


Currently, SAC is not part of the Cloud Foundry account and cannot trust the data connection with HANA a priori. The authentication used by SAC is SAML 2.0 SSO and it must be shared with the SSO of your Cloud Foundry account.

For this reason, we need to use a 3rd party identity provider to establish communication between HANA and SAC.

In this example, we are using an SAP Identity Authentication Service tenant, also known as IdP or SCI tenant, but you can also use a different, non-SAP product if you already have your own.

Open your IDP and download the SAML 2.0 metadata from Tenant Settings, under Applications & Resources.

With the next step we are importing the metadata file into the Cloud Foundry account: Open your Cloud Cockpit and go to Security, then Trust Configuration and create a New Trust Configuration.

We have imported the metadata into CF; now we need to do the opposite, that is, download the metadata from your cloud account and import it into the identity provider.

Open the following url in a new tab of your browser:


that in our example is:

Go back to your IDP service provider configuration and create a new SAML application

Upload the metadata file retrieved from your Cloud Platform account to import the configuration into your Identity Provider

From the main menu set the Subject Name Identifier as email, then save the configuration.

Now Import the IDP metadata into SAP Analytics Cloud

Open the main menu and go to System then Administration

Select the Security tab and in the new tab press the edit button.

First, it’s required to enable SAML Single Sign-On (SSO).

This operation can only be done by the System_Owner. If you are not allowed to modify the settings, it means that you are not the owner of the system.

Follow and complete the 4 steps, and at the end, the UI will look like the one in the following image

Press Verify Account to complete the setup and then just save the changes with the icon in the top right of the page.

The next operation is required to assign the roles to your Cloud Foundry account and, in particular, to the custom Identity Provider.

Open your Cloud Platform Cockpit, go to your subaccount Security, then Role Collections and create a New Role Collection.

As name use, for example, HAA_USER; open the newly created role collection and add to the collection the role created by the deployed application.

Assign the role collection HAA_USER as Trust Configuration for your user into your custom Identity Provider

Complete the configuration in the HANA Cockpit: click the Manage Roles option.

To have all the required grants, press the plus button to create a new user. It is suggested to use the same name as the role (HAA_USER).

Save the user and then click Privileges. Edit the existing privileges and press Add Object

Select in the list EXECUTE_MDS with the Object Type PROCEDURE and Schema SYS

Select the privilege EXECUTE

Press OK and Save the modification in the main windows.
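The grant performed above through the UI can also be sketched directly in the SQL console; this assumes the user is named HAA_USER as suggested:

```sql
-- Grant execution of the MDS procedure used by the Analytics Adapter
GRANT EXECUTE ON "SYS"."EXECUTE_MDS" TO HAA_USER;
```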

Open a cmd window or terminal and get the environment for the deployed app
cf env haa-java

The response contains various information about the deployed app and the created HANA schema.

This schema needs to be granted, so search the output of the cf command for a string named schema and copy it to the clipboard.
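A quick way to pick the schema name out of the output can be sketched as follows; the JSON fragment and schema name here are hypothetical stand-ins for what `cf env haa-java` actually prints in your account:

```shell
# Hypothetical fragment of the credentials block printed by `cf env haa-java`
ENV_JSON='{"credentials":{"schema":"USR_4ABC123","user":"HAA_USR"}}'
# Extract the generated schema name so it can be granted in the HANA cockpit
echo "$ENV_JSON" | sed -n 's/.*"schema":"\([^"]*\)".*/\1/p'
# -> USR_4ABC123
```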

Inside HANA cockpit go to Roles, press Edit and the button Add.

In the next windows search for the schema string you have copied in your clipboard.

Select both the roles then press OK and Save the modifications.

In the next step, we are creating a new user so that your IdP and Cloud account have the right grants and access to HANA.

You can use any name for it; use JWT authentication and configure the External Identity with the same email used in the Cloud Platform account under the Trust Configuration, Role Collections assignment, with a non-automatic mapping.

Assign to the created user the role HAA_USER and the privileges

The trust is not yet established. To complete it, you have to apply SAP Note 2470084.

In particular, we are interested in solution b (No trust has been established between XSUAA and the HANA Database), subsection b (SAP Cloud Platform).

In a nutshell, you have to download the following patch, which guides you to auto-generate a SQL file for your account, and finally apply it with the HANA SQL Console. Here is a direct link to the zip file containing the patch.

To prepare the SQL file, uncompress the archive and, in the same folder, type within a terminal (it requires Linux functions on Windows, or git bash):
./ -a https://<subaccount_name>.authentication.<region>

That for the account used in this example means
./ -a

This process generates in the same folder a file named xs_appuser.sql

Open the file with a text editor, copy its content to the clipboard, then paste and execute it in a SQL console of HANA.

In case you get some notifications like the following, don’t worry: just ignore all the warnings and errors.

Check the configuration and create an SAC connection

Before the creation of the SAC connection, you might like to check if the HAA is working as expected.

Open the following URL in a new tab of your browser


that in this example is

In case you get Forbidden or SQLInvalidAuthorizationSpecExceptionSapDB, it means that there is something wrong with the authorizations and the created users.

If the invocation of the adapter succeeds, you will be able to create the connection in SAC.

Open SAP Analytics Cloud and, in the main menu, go to Connection to create a new Live connection of type HANA.

Give a name to the connection and as address use your haa application address, that in this example is

You can find it directly in your Cloud Foundry subaccount under the Applications in the page of your application, or easily with the cf apps command within a terminal or cmd window:

Use the port 443 and SAML Single Sign On and press OK.

If Single Sign-On is well configured, the browser should open a prompt for the sign-on, and the connection will then be created.

Now that the connection has been created, we are finally able to communicate with HANA. Next, we have to create a new model based on the Calculation View.

Open the menu go to Create and then Model

We are going to use a datasource to create the model, so select the option Get data from a datasource; we are going to create a Live Data connection.

In the next window, specify that this is a HANA system and select the connection you created within this example; finally, specify the calculation view that defines the model.

A new window with the measure types opens; after reviewing the model, you can save it with the icon in the bar.

As last step, you can create your own Dashboard and Story on SAC and bind it to the saved model.

We have created the following simple dashboard in this example, with the purpose to test that everything is working as expected




With this blog post you should have learned how to connect SAP Cloud Platform Internet of Things to HANA, how to create a synonym in an hdi-container pointing to a classic HANA table, how to link your HANA hdi-container to an SAP Analytics Cloud tenant, and how to configure the trust between the different components.

All these operations together permit you to have your IoT data available in SAC, so you can create your analytics over it.

There are a few possible improvements.

In the section where we created the hdi-container, we could also fetch data from a different SAP system (such as ERP), forward it to HANA and join the tables, so that the analytics includes both the IoT measurements and the related joined data.

I've made most of the operations manually, using the UI and SAP tools (Web IDE), but most (or all) of the steps shown in this post could easily be automated by invoking several APIs and SQL commands from a set of scripts. I haven't yet created the scripts, but that's another possible improvement.