Technology Blogs by Members


Links to Individual Detail sections

  1. Setup SAP Cloud Platform: SAP Search API & Base Twitter API setup

  2. Collecting Community Data and Tweeting from the SAP Cloud Platform

  3. Using Web IDE with the Collected data as an OData Service

  4. Analysis Of The Collected Data With SAP Lumira


I have a history of collecting SAP Community data and blogging about it here, driven mainly by my general interest in SAP technology. My favourites among these community-based blogs were the thematic mapping of SDN points and asking community members to register their location on a HANA SAP Cloud Platform map (both no longer work due to changes on SDN, SCN and the HANA Cloud Platform). However, I have decided to collect data from the SAP Community again, this time via the SAP Cloud Platform. It was made possible by what lies behind something that you (and I) may not use a great deal, but a link to it is in the top right of this page, as shown in the following image.

Onedx Search Magnifying Glass Icon in the Red Box  (and top right of this page)

One good thing about the new SAP Community Platform is the constant updates (though maybe it needs more updates to fix all the issues), so the above screenshot no longer reflects the current state of the site. The magnifying glass icon has changed, and maybe we have all started by now, as the "Get Started" icon is no more ;).

I do not use that actual search page a great deal (my searching is mostly Google driven), but from that page I discovered the documentation for the SAP Search API. I have found the search API a lot more flexible than the main search page, with options such as the ability to change the default sort order of the results. The default sort order from OneDX is "relevance", but I wanted to sort by the last updated timestamp, which I could achieve via search API calls rather than the limited options on the OneDX page. The search API is JSON based and needs very specific formatting (as per the documentation). I found that, using my SAP Community user and password, I could control the process from my trial SAP Cloud Platform account. Using a combination of HANA capabilities on the SAP Cloud Platform, I created some XS Classic code to collect SAP Community data, expose a subsequent OData service and send out tweets via the Twitter API.

The objective of this blog is to use the search API to look for SAP Community information of interest and alert me via tweets. The example code and configuration below monitor the latest questions on the forums and alert me to any questions with the SAPUI5 primary tag. I chose the SAPUI5 primary tag just as an example for this blog; I could choose any of the primary tags on this site. I could also alter the API calls to search for any topic, but my main aim is to get the latest updated information from the site. Therefore my example XS code is based on the search API's updated timestamp fields of the questions.

To highlight the process in one screenshot, I checked just now for the latest updated question on the site. The latest question was in the primary tag "BW (SAP Business Warehouse)", so I altered my code to tweet about this tag, as shown in the image below.

In the screenshot above the numbers represent the following:

  1. Shows the latest questions on the site

  2. First I ran my XS job to collect this latest data. Then, via Web IDE and a UI5 based table reading an xsOData service from my SCP account, I can view these latest updated questions

  3. Finally, the third image shows the tweet generated by my XS code after checking for updated questions in the primary tag BW (SAP Business Warehouse)

(As to the question in the tweet? I have no idea if you should convert real time infocubes to HANA optimized ones, so I am switching the primary tag to something else now 😉 )

I cover another example of collecting data from this site in my other blog here. In that blog I analyse the forums from Oct 2016 to Apr 2017.

The collection of source data for that blog was adapted from the base code and the technical details in this blog.


Setup SAP Cloud Platform: SAP Search API & Base Twitter API setup

General Details

First, some information regarding SAP Cloud Platform trial accounts. It is worth noting that the HANA MDC systems in any trial SAP Cloud Platform account are shut down on a regular 12 hour schedule. I have no problem with that shutdown, and my example is not meant to run 24/7. It has always been, and will remain, a way for me to learn and practice using SAP technology. I do appreciate the fantastic options and solutions available on the SAP Cloud Platform. The fact that they are made available for free in the trial accounts is a BIG bonus for me, as I can't make the investment in any Express version right now. And speaking of investment, it must be a BIGGER investment by SAP in the trial accounts! However, I still do not like the fact that trial accounts are deleted after a period of time. It seems a shame to invest a lot of time into something that could be deleted by events out of my control. Back to my blog....

I have left the code and configuration below linked to the setup in my trial account. I have a schema and user called NEOGEODB, and I set up a package for my XSc project called sapcomrjr. If you are interested in following any of the code, any references to them will obviously have to be adapted.

The NEOGEODB user has superpower 🙂 type authorisations, with extra authorisations added as required. I used an anonymous connection throughout, and it would be better for me to review the authorisations and anonymous setup at some point in the future, given such a powerful user.

I use Twitter quite a bit (mostly reading tweets rather than actually tweeting a great deal), but I did not want to use my main account to tweet. So I set up a tweeting bot, as in the above screenshot, called @inXSc_rjruss. For that Twitter account I followed the link below to get the Twitter tokens required to actually tweet using XS Classic (XSc).

Twitter Bot setup -

I used the same approach with Twitter bots, but with SQL Anywhere, when I analysed the RSS feeds and blogged about it here.

The SAP Search API is flexible, and I mainly used Chrome Dev Tools to look at how the requests were being formatted when I was using the search page.

The documentation can be found here.

Using some of that information, I used Postman before moving onto the SAPCP to ensure I was getting what I was asking for 🙂







Throwing SAP Search API A Curve Ball
Fig1. - Searching the SAP Search API with an incorrect date; I was surprised to see the actual SQL statement in the error message






Collecting Community Data and Tweeting from the SAP Cloud Platform

SAP HANA Web-based Development Workbench

I used the web based workbench for my setup; if required, it may be better to start with an introduction to this via a developer tutorial (using Option A on the SCP) here - SAP HANA XS Classic, Develop your first SAP HANA XSC Application

HANA Table

I created a main table to collect questions, with an extra column "TWEETED" to control the actual tweeting.

Table Name TWEET_SAP_Q

New Project files

An SQLCC file to be used for connecting to HANA



The Access file to remove authentication and link to the SQL connection used

.xsaccess file
{
"exposed" : true,
"authentication" : null,
"default_connection" : "sapcomrjr::AdminConn"
}

.xs file


SQL Connection Configuration

I found access to set up the SQL connections via the XS admin URL screens, e.g.


Navigate to the XS artifacts - for me this was sapcomrjr, where the AdminConn.xssqlcc config is shown below.

I could edit the configuration to update the user to my NEOGEODB account, as shown below.


Twitter API Oauth Signature & tokens

The Twitter OAuth 1.0a tokens are vital to successfully tweet using my XS code below. I followed the Twitter developer site link below to create a signature with XS code. (I had already created the OAuth tokens for my Twitter bot, as mentioned in the earlier section of this blog.)

My XS code is set up specifically to tweet (send a status update) and is tightly integrated with the signature process detailed in the Twitter help pages above. If any extra parameters or changes to the URL string for the Twitter API are needed, then my XS code's signature process has to change too. The process is detailed in the Twitter documentation, and any additional parameters in the API call would trigger a review of the XS code.

Setup The Secure Store

I called my XS secure store file secureStoreTest.xsjs. I could then use the secure store to separate the tokens from the source code, so they would not be in plain view in the code. The tokens are automatically encrypted in the secure store. It is possible to revert to simply storing the tokens as variables and bypass the secure store. However, as a one-off process, I added the tokens to the XS code to encrypt them in the secure store. I could then delete them from the XS code once I had confirmed they had been successfully stored, and use the secure store XS commands to retrieve these tokens whenever they were required.

My setup is the following



I took my approach and based the code on Thomas Jung's blog which covers the secure store here


secureStoreTest.xsjs file
//I found it best to delete localStore before storing new values
function store() {
var config = {
name: "foo",
value: "[ \"{consumekey}\", \"{consumesec}\", \"{accesstok}\", \"{accesssec}\" ]"
};
var aStore = new $.security.Store("localStore1.xssecurestore");;
}

function read() {
var config = {
name: "foo"
};
try {
var store = new $.security.Store("localStore1.xssecurestore");
var value =;
var stJS = JSON.parse(value);
var outv = stJS[3];
$.response.contentType = "text/plain";
$.response.setBody(outv);
} catch (ex) {
//do some error handling
}
}

var aCmd = $.request.parameters.get('cmd');
switch (aCmd) {
case "store":
store();
break;
case "read":
read();
break;
default:
$.response.status = $.net.http.INTERNAL_SERVER_ERROR;
$.response.setBody('Invalid Command');
}

There are two options to set up the important Twitter tokens; I followed option 1 and used the secure store, but you could simply follow option 2 and hard code the values in the source XS code.

1 Run secureStoreTest.xsjs with the parameter ?cmd=store in a web browser. This stores the tokens safely in the secure store, and I could delete the actual values from the code after a successful run of the URL in my browser.


2 Hard code the variables in the main XS job below.


Change the following section in the main q_scanswers.xsjs code to the appropriate Twitter token values.

// var consumekey = "CONSUMERKEY";
var consumekey = read(0);
// var consumesec = "CONSUMERSECURITY";
var consumesec = read(1);
// var accesstok = "ACCESSTOKEN";
var accesstok = read(2);
// var accesssec = "ACCESSSECURITY";
var accesssec = read(3);
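As a reminder of why the read indices run 0 to 3: the secure store holds one JSON array, so the index order of the reads must match the order the tokens were stored. A trivial sketch with the placeholder values from the store step:

```javascript
// The secure store value is one JSON array; readToken(i) picks a token
// by position. Placeholder strings only, matching the store() example.
var stored = "[ \"{consumekey}\", \"{consumesec}\", \"{accesstok}\", \"{accesssec}\" ]";

function readToken(v) {
  return JSON.parse(stored)[v];
}

console.log(readToken(2)); // the access token placeholder
```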

XS Destinations

SAP Search API Destination

FileName scs.xshttpdest
host = "";
port = 443;
proxyType = http;
proxyHost = "proxy-trial";
proxyPort = 8080;
authType = basic;
useSSL = true;
timeout = 30000;
sslHostCheck = false;
sslAuth = anonymous;

Twitter API Destination

FileName twitter.xshttpdest
host = "";
port = 443;
proxyType = http;
proxyHost = "proxy-trial";
proxyPort = 8080;
useSSL = true;
timeout = 30000;
sslHostCheck = false;
sslAuth = anonymous;


Destination Configuration in XS Admin Screen

For the SAP Community search destination I needed to configure my login details in the XS admin screen, as shown below. The highlighted User field (in the red box) needed my login details for this site.



At this stage there is no need to change the twitter.xshttpdest. In my experience the SSL trust store can be left as default, as the overall SCP uses a list of certificates that Twitter trusts.


Main XS Classic Code to Collect and Tweet About SAP Community Data

I called the main XSJS script "q_scanswers.xsjs"

It does come with some caveats, in that originally the code was in individual sections, e.g. the collection of search API community data and the process for tweeting were in two XS programs. I then combined them into the final working code below. As for error checking and the validity of this XS code, I leave that judgement to you. I am happy to answer questions on any part of the process, and I do know it works as intended.

I already knew how to use the Twitter API Oauth process when I covered that with another great SAP product, SQL Anywhere.

I analysed the SCN @SCNblogs timeline using SQL Anywhere and the help of UI5. The crypto and base64 utilities that come with XS made it possible to replicate my SQL Anywhere routines with HANA on the SAPCP.

At this point I will thank thomas.jung for his contributions on the SAP Community site, which helped me solve some of the issues I had along the way. Any errors or wrong interpretations of Thomas's blogs/answers are obviously mine. I reference the links at the end of my blog. If you think any part could change, let me know; I would be happy to hear about it.

In its current form the code in this blog is not set up to collect all questions. My blog (if followed exactly) would collect SAP questions every 15 minutes via the XS job schedule. It most likely will not read all questions into HANA unless it is adapted in the search API JSON section of the code.
function read(v) {
var config = {
name: "foo"
};
try {
var store = new $.security.Store("localStore1.xssecurestore");
var value =;
var stJS = JSON.parse(value);
var outv = stJS[v];
return outv;
} catch (ex) {
//do some error handling
}
}
function tweet1(text2tweet) {
var destination_package = "sapcomrjr";
var destination_name = "twitter";
var randomString = function(length) {
var textr = "";
var possible = "ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789";
for (var i = 0; i < length; i++) {
textr += possible.charAt(Math.floor(Math.random() * possible.length));
}
return textr;
};
try {

var text = encodeURIComponent('@rjruss Still Working?');
var epn1 = new Date().getTime();
var ep2 = epn1.toString();
var epn = ep2.substring(0, 10);

var nonsense = randomString(6);
// text = encodeURIComponent('Q:' + epn);
text = encodeURIComponent(text2tweet);

var call = '/1.1/statuses/update.json?status=' + text;

var dest = $.net.http.readDestination(destination_package, destination_name);
var client = new $.net.http.Client();
var req = new $.web.WebRequest($.net.http.POST, call);
// var consumekey = "CONSUMERKEY";
var consumekey = read(0);
// var consumesec = "CONSUMERSECURITY";
var consumesec = read(1);
// var accesstok = "ACCESSTOKEN";
var accesstok = read(2);
// var accesssec = "ACCESSSECURITY";
var accesssec = read(3);

var t2 = 'oauth_consumer_key=' + consumekey;
var t3 = '&oauth_nonce=' + nonsense;
var t4 = '&oauth_signature_method=HMAC-SHA1';
var t5 = '&oauth_timestamp=' + epn;
var t6 = '&oauth_token=' + accesstok;
var t7 = '&oauth_version=1.0';
var t8 = '&status=' + text;

var tall = t2 + t3 + t4 + t5 + t6 + t7 + t8;
var tper = encodeURIComponent(tall);

var s1 = 'POST&';
var out = s1 + tper;

var sk1 = encodeURIComponent(consumesec);
var sk2 = encodeURIComponent(accesssec);
var skey = sk1 + '&' + sk2;

var skeyhmac = $.security.crypto.sha1(out, skey);
var tsignature = encodeURIComponent($.util.codec.encodeBase64(skeyhmac));

var h1 = 'OAuth oauth_consumer_key="' + consumekey + '"';
var h2 = ',oauth_token="' + accesstok + '"';
var h3 = ',oauth_signature_method="HMAC-SHA1"';
var h4 = ',oauth_timestamp="' + epn + '"';
var h5 = ',oauth_nonce="' + nonsense + '"';
var h6 = ',oauth_version="1.0"';
var h7 = ',oauth_signature="' + tsignature + '"';

var twithead = h1 + h2 + h3 + h4 + h5 + h6 + h7;

req.headers.set('Authorization', twithead);
client.request(req, dest);
var response = client.getResponse();
var rettw = response.status;
if (rettw === 200) {
return rettw;
} else {
var ebody = response.body.asString();
var errM = JSON.parse(ebody);
return errM.errors[0].code;
}
} catch (e) {
//$.response.contentType = "text/plain";
return 'err';
}
}

function readAnswer() {
var destination_package = "sapcomrjr";
var destination_name = "scs";
var call = "/api/v1/search";

try {
var dest = $.net.http.readDestination(destination_package, destination_name);
var client = new $.net.http.Client();

var req = new $.net.http.Request($.net.http.POST, call);

req.headers.set('Content-Type', 'application/json; charset=UTF-8');
req.setBody(JSON.stringify({
"returnResults": {
"page": {
"number": 0,
"size": 10
},
"sort": [{
"order": "desc"
}]
},
"repository": "srh",
"type": "content",
"filters": [{
"field": "TYPE",
"values": ["question"]
}]
}));

var response = client.request(req, dest).getResponse();

var connection = $.hdb.getConnection({
"isolationLevel": $.hdb.isolation.REPEATABLE_READ,
"sapcomrjr.AdminConn": "package::sapcomrjr"
});

var list = [];
var body = response.body.asString();
var obj = JSON.parse(body);

var lp1 = obj.result.results.results;

var vals = [];
for (var i = 0; i < lp1.length; i++) {
var valueToPush = [];
// vals.push(lp1[i].TITLE);
// vals.push(lp1[i].SCORE);
valueToPush[0] = lp1[i].ID;
valueToPush[1] = lp1[i].CLIENT;
valueToPush[2] = lp1[i].MODIFICATION_TIMESTAMP;
valueToPush[3] = lp1[i].EXTENT;
valueToPush[4] = lp1[i].TITLE;
valueToPush[5] = lp1[i].USERURL;
valueToPush[6] = lp1[i].SOURCE;
valueToPush[7] = JSON.stringify(lp1[i].KEYWORDS);
valueToPush[8] = lp1[i].LANGUAGE;
valueToPush[9] = lp1[i].AUTHOR;
valueToPush[10] = JSON.stringify(lp1[i].SM_TECH_IDS);
valueToPush[11] = lp1[i].CREATED_TIMESTAMP;
valueToPush[12] = lp1[i].UPDATED_TIMESTAMP;
valueToPush[13] = lp1[i].AUTHOR_ID;
valueToPush[14] = lp1[i].TYPE;
valueToPush[15] = lp1[i].LANGUAGE_VARIANT_IDS;
valueToPush[16] = lp1[i].TYPE_IDS;
valueToPush[17] = JSON.stringify(lp1[i].PRODUCT_FUNCTION_IDS);
valueToPush[18] = lp1[i].CONTENT;
valueToPush[19] = lp1[i].KEYWORDS[0];
//UPSERT repeat for where clause
valueToPush[20] = lp1[i].CREATED_TIMESTAMP;
vals.push(valueToPush);
}
//batch insert/UPSERT of vals into "NEOGEODB"."TWEET_SAP_Q" went here (statement not shown)
} catch (e) {
//$.response.contentType = "text/plain";
//$.response.setBody(e.message);
}
}

function checkAnswer() {
var connection = $.hdb.getConnection({
"isolationLevel": $.hdb.isolation.REPEATABLE_READ,
"sapcomrjr.AdminConn": "package::sapcomrjr"
});
var sqlc = ""; //SELECT for un-tweeted questions from "NEOGEODB"."TWEET_SAP_Q" (statement not shown)
var rs = connection.executeQuery(sqlc);
var vals = [];
for (var i = 0; i < rs.length; i++) {
var body = '@rjruss ';
var ti = rs[i]["TITLE"].substring(0, 58);
var us = rs[i]["USERURL"];
var au = rs[i]["AUTHOR"].substring(0, 14);
var it = rs[i]["ID"];
var ud = rs[i]["UPDATED_TIMESTAMP"];
var valueToPush = [];

body += ti + '.. by ' + au + '..| ' + us;
valueToPush[0] = ud;
valueToPush[1] = tweet1(body);
valueToPush[2] = it;
valueToPush[3] = it;
vals.push(valueToPush);

// body += ' link: ' + us + ' by ' + au + ' id ' + it;
}

var outb = JSON.stringify(vals);
try {
//upsert "NEOGEODB"."TWEET_SAP_Q"("TWEETED","ID") VALUES('sent','13847790') where ID = '13847790'
//UPSERT "NEOGEODB"."TESTU" (CLIENT) VALUES ('tes3t') where CLIENT = 'tes2t'
//batch UPSERT of vals to flag tweeted questions went here (statement not shown)
} catch (e) {
//$.response.contentType = "text/plain";
}
//$.response.status = $.net.http.OK;
}

function control() {
readAnswer();
checkAnswer();
}

XS Job

I use the HANA / SAP Cloud Platform XS job functionality to regularly run the main XSc function.

This required the following to set it up.

scan.xsjob Used to define the job and call the XSc code
{
"description": "Update SAP answers",
"action": "hihanaxs.sapcomrjr:q_scanswers.xsjs::control",
"schedules": [
{
"description": "Update SAP answers",
"xscron": "* * * * * */15 30"
}
]
}

My workflow for XS jobs on SCP is to save/activate the xsjob file, then visit the XS admin page to activate both the job and the overall schedule.
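For reference, an xscron expression has seven fields (year month day dayOfWeek hour minute second), so "* * * * * */15 30" fires at second 30 of minutes 0, 15, 30 and 45 of every hour. A quick sketch of how a */n step field expands (illustration only; the real scheduling is done by the XS job engine):

```javascript
// Expand one xscron field like "*/15" over its value range (illustration only).
function expandField(expr, max) {
  if (expr === "*") {
    return Array.from({ length: max + 1 }, function(_, i) { return i; });
  }
  var step = expr.match(/^\*\/(\d+)$/);
  if (step) {
    var out = [];
    for (var v = 0; v <= max; v += Number(step[1])) out.push(v);
    return out;
  }
  return [Number(expr)];
}

console.log(expandField("*/15", 59)); // [ 0, 15, 30, 45 ]
console.log(expandField("30", 59));   // [ 30 ]
```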

Activate XS Job

I selected "Active" and entered my user/password for the job.

Activate XS Schedule

The Job will not run unless the Scheduler is Enabled (top right of the above screenshot)

The Result

Once running the tweets are sent to my notification timeline.


OData Service

I used a simple xsodata service statement to create an OData feed as follows.
service {"NEOGEODB"."TWEET_SAP_Q" as "SAF"  keys generate local "GENERATED_ID" ;}

As there is no primary key on my TWEET_SAP_Q table, I use the xsodata functionality to create one called GENERATED_ID. I could test this service in the workbench, and once I confirmed it was working I could use the OData feed in my Web IDE as follows.


SCP Destination

I added the following destination in my SCP cockpit to point back to my own SCP HANA xsodata service.

The URL is blanked out in the screenshot but is my trial account link.

Web IDE Config

I tested a simple table via a service URL call to my OData service.

For me the key step in the table definition was the following OData bindrows section
path: "/SAF",
parameters: {
// operationMode: sap.ui.model.odata.OperationMode.Client
operationMode: sap.ui.model.odata.OperationMode.Server
}

This allows me to monitor the last updated questions in my table and filter for successful Twitter status codes (return status 200 indicates a successful call to Twitter).
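As a hypothetical illustration of that filtering, a server-side operation mode boils down to OData query options like the following. The service path and the column holding the Twitter return status are assumptions based on my setup, not something shown in the screenshots:

```javascript
// Hypothetical: compose the OData query a server-side filter would issue
// against the SAF entity set exposed by the xsodata service above.
function buildSafQuery(servicePath, tweetStatus) {
  var options = [
    "$filter=" + encodeURIComponent("TWEETED eq '" + tweetStatus + "'"),
    "$orderby=" + encodeURIComponent("UPDATED_TIMESTAMP desc"),
    "$format=json"
  ];
  return servicePath + "/SAF?" + options.join("&");
}

console.log(buildSafQuery("/sapcomrjr/service.xsodata", "200"));
```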


HANA Calculation View and SAP Lumira Setup

To analyse the data I had collected, I used SAP Lumira, as mentioned in the opening of this blog. I set up a couple of calculation views based on a collection table. The date range is from 10 Oct 2016 to 30 Apr 2017.


From the SAP Search API data I added calculated columns for QUESTIONS and AUTHORS (counter based), plus the CALCULATED_TIMESTAMP adapted to DATE (YYYY-MM-DD) format, HOUR and finally DAYNAME. I used these columns to analyse the data as per the start of this blog.

***Accessing the SAPCP from Lumira works again after the fix noted in the question/answer below. My thanks to Jin for the answer to my question.

****The following section for Lumira was broken at the time of writing (18/5/2017). I had actually asked a question on the SAPCP tag myself regarding a broken tunnel connection, and I was not alone, as someone else commented that tunnel connections to the cloud were not working as expected. *It works again - see the link for the answer.


To connect Lumira to my trial SCP account I followed the approach I described in this blog, via an SCP tunnel. With the latest MDC SCP accounts, though, any database user can be used, so I use my neogeodb user, whose details I can save in Lumira. However, the "SAP Cloud Platform Console Client" tunnel needs to be open to successfully connect.


Below is an example chart I was able to produce analysing the SAP Community, with Lumira connected to my SAPCP trial account.



That completes my blog and thanks for reading.

Best Regards

Robert Russell

The following links are to reference pages that I found useful for the complete setup.

SAP HANA XS JavaScript API Reference
XSc Security API
XSc Namespace: $.security.crypto
Batch Insert
SQLCC config
