Technology Blogs by SAP
Learn how to extend and personalize SAP applications. Follow the SAP technology blog for insights into SAP BTP, ABAP, SAP Analytics Cloud, SAP HANA, and more.
Vitaliy-R
Developer Advocate
I recently saw a question about connecting to the SAP HANA Service in a trial account of the SAP Cloud Platform Neo environment from SAP Data Hub, Trial Edition.

Two remarks before we proceed with this post:

  1. SAP Data Hub trial is a cluster solution based on Kubernetes. I plan to get back to it later, but here I am focusing on a Docker-based (and therefore a single host) developer edition of SAP Data Hub.

  2. HANA Service in Neo is meant to be a data store bound to applications running in the SAP Cloud Platform, not a full stand-alone SAP HANA database. But it is still fine for the purpose of this exercise.


With these two points in mind, let's go. I assume you already have both the SAP HANA Service in SCP Neo and the SAP Data Hub developer edition up and running.
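Before we start, a quick way to confirm that the Data Hub developer edition container (named datahub here, as in the commands below) is running is to list it with docker:
docker ps --filter "name=datahub"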

Download the Console Client for the Neo Environment


To access your SAP HANA Service in the Neo environment, you need to open a tunnel to it from the external host, such as your Docker container with SAP Data Hub. And to open that tunnel you need the SAP Cloud Platform console client, which is part of the SAP Cloud Platform SDK for the Neo environment.

This SDK is normally downloaded from https://tools.hana.ondemand.com/#cloud, but we need to download it from the command line inside the container, so let's use the Maven repository http://central.maven.org/maven2/com/sap/cloud/neo-java-web-sdk/
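If you want to check which SDK versions are available in that repository, the standard Maven repository layout suggests the version list can be fetched as well (assuming the repository follows the usual conventions):
curl http://central.maven.org/maven2/com/sap/cloud/neo-java-web-sdk/maven-metadata.xml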

First let's connect to the interactive terminal in our datahub container.
docker exec -it datahub bash

Now that we are "in the container" (as many people say), let's download SDK version 3.78.15 for Java Web Tomcat 8 to the /tmp/Downloads/ folder. The version was taken from https://tools.hana.ondemand.com/#cloud at the time of writing; check the current version at the moment you read this.
cd /tmp/Downloads/
curl http://central.maven.org/maven2/com/sap/cloud/neo-java-web-sdk/3.78.15/neo-java-web-sdk-3.78.15.zip -o neo-java-web-sdk.zip
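As an optional sanity check, you can confirm the archive arrived and looks reasonably sized before unpacking it:
ls -lh neo-java-web-sdk.zip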


Uncompress the SDK


Now that the zip file with the SDK is downloaded, we need to uncompress it.

Docker containers are usually "slim", i.e. only a minimal set of packages and utilities is installed by default. In our case we need to install unzip first. openSUSE is the OS used in the container, so we use zypper as the package manager.
zypper install unzip
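If you prefer to skip the confirmation prompt, for example when scripting these steps, zypper's non-interactive mode should work too:
zypper --non-interactive install unzip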



Now, let's uncompress it to the /tmp/neo/ folder.
unzip neo-java-web-sdk.zip -d /tmp/neo
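A quick listing of the tools folder should now show the neo.sh console client used in the next step:
ls /tmp/neo/tools/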

Open a tunnel to the SAP HANA instance


Check the parameters required to open a tunnel.
cd /tmp/neo/tools/
./neo.sh help open-db-tunnel

To connect to my SAP HANA instance we need the account id (in my case it is i076835trial, i.e. 1 on the screen below), the user id (i076835, as can be found under 2), the password of the cloud user (sorry, no plans to share it here 😉), and the name of the SAP HANA instance (in my case it is mymdc, i.e. 3 on the screen).


./neo.sh open-db-tunnel -h hanatrial.ondemand.com -a i076835trial -u i076835 -i mymdc

I need to type my cloud user's password. You can always include it in the command line with the -p option, and you can open the tunnel in the background with the --background option.
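For example, a non-interactive call could look like this (a sketch only, combining the two options mentioned above; replace <your-password> with the password of your cloud user):
./neo.sh open-db-tunnel -h hanatrial.ondemand.com -a i076835trial -u i076835 -i mymdc -p <your-password> --background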



Thanks to the open tunnel, processes in the Data Hub container see the remote SAP HANA instance as if it were a local one running on localhost, port 30015.
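To quickly verify from inside the container that the tunnel endpoint is reachable, a bash-only probe of the port can be used (just a sanity check, assuming no database client is installed in the container):
(echo > /dev/tcp/localhost/30015) 2>/dev/null && echo "port 30015 is open"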

Build the graph in the Data Hub's Modeler connecting to the HANA instance


Now, let's go to the Data Hub Modeler and build a graph that connects to the SAP HANA instance using localhost as the host and the database user/password.

Here is the sample graph. To use it in the Modeler:

  1. create a new graph,

  2. switch to the JSON view,

  3. paste the following definition,

  4. change the myPassword placeholder to the password of your system database user.


{
  "description": "Connecting to SAP HANA in SCP Neo Trial",
  "processes": {
    "constantgenerator1": {
      "component": "com.sap.util.constantGenerator",
      "metadata": {
        "label": "Constant Generator",
        "extensible": true,
        "config": {
          "content": "SELECT * FROM M_DATABASE"
        }
      }
    },
    "saphanaclient1": {
      "component": "com.sap.hana.client2",
      "metadata": {
        "label": "SAP HANA Client",
        "config": {
          "connection": {
            "connectionProperties": {
              "host": "localhost",
              "password": "myPassword",
              "port": 30015,
              "user": "system"
            },
            "configurationType": " "
          }
        }
      }
    },
    "terminal1": {
      "component": "com.sap.util.terminal",
      "metadata": {
        "label": "Terminal",
        "ui": "dynpath",
        "config": {}
      }
    },
    "tostringconverter1": {
      "component": "com.sap.util.toStringConverter",
      "metadata": {
        "label": "ToString Converter",
        "config": {}
      }
    }
  },
  "groups": [],
  "connections": [
    {
      "src": {
        "port": "out",
        "process": "constantgenerator1"
      },
      "tgt": {
        "port": "sql",
        "process": "saphanaclient1"
      }
    },
    {
      "src": {
        "port": "outstring",
        "process": "tostringconverter1"
      },
      "tgt": {
        "port": "in1",
        "process": "terminal1"
      }
    },
    {
      "src": {
        "port": "result",
        "process": "saphanaclient1"
      },
      "tgt": {
        "port": "ininterface",
        "process": "tostringconverter1"
      }
    }
  ],
  "inports": {},
  "outports": {},
  "properties": {}
}

Now switch back to the Diagram view of the graph and click Auto Layout to get a better layout of the components.



Save and start the graph. It should execute the following SQL statement once:
SELECT * FROM M_DATABASE

The result should be visible in the Modeler's Terminal UI, like in this example.


That's all for now


Any ideas from your side on how to improve these steps? Or maybe you want to share how to do the same in the Kubernetes environment of SAP Data Hub, Trial Edition?




-Vitaliy, aka @Sygyzmundovych
4 Comments
prodyut
Participant
Hello Vitaliy,

Thanks for the great blog. I was trying to get my hands dirty with DataHub. We have a DataHub installation running on the AWS cloud, and we are trying out multiple scenarios like connecting S3, ECC, MS-SQL, etc. with DataHub and then pushing the data to HANA on the Neo cloud (SCP Neo).

We are able to connect to S3 and read S3 bucket objects from DataHub. Now we are trying to explore what the steps would be for connecting and pushing data to HANA on the cloud from DataHub.

It would be great if you could give us some guidance or share a help link on this.

 

Cheers,

Sen
Vitaliy-R
Developer Advocate
Hi Sen.

Because your question is not a comment on the blog, could you please post it in the proper Q&A forum: https://answers.sap.com/tags/73555000100800000791

This way more people in the community can contribute and benefit from the question and answers.

Thank you and regards,

-Vitaliy
prodyut
Participant
Thanks Vitaliy for the right direction. Have done the same.

Cheers,

Sen
Great article Witalij!!!

It helped me a lot to make this work in a scenario with a HANA Service in Neo where I needed to update data from Power BI. It has the 24-hour limit, but I will solve it with SAP Cloud Connector, so everything is all right.

Kind regards,

Fernando