In this blog post, I will cover how we can move analytical data available through the SAP Ariba APIs to SAP HANA Cloud. To achieve this, I will use the trial versions of the Integration Suite and SAP HANA Cloud available in SAP Cloud Platform. I will build an integration flow in SAP Cloud Platform Integration and use the recently released JDBC adapter for SAP Cloud Platform Integration (Cloud Foundry environment) to send data to SAP HANA Cloud.

In a previous blog post, I covered how to use SAP Cloud Platform Integration to replicate SAP Ariba analytical data. This blog post is a continuation of it but will focus just on what is required to get it working with SAP HANA Cloud.

Update 1: If you are interested in a step-by-step guide to configuring a connection from SAP Cloud Integration to the SAP Ariba APIs, including sample code and a sample integration flow, please check out this blog post: https://blogs.sap.com/2021/03/22/step-by-step-how-to-configure-sap-cloud-integration-to-communicate-...

To complete the steps explained in this blog post, there are some prerequisites that we will need to complete first:

  • Access to the SAP Ariba APIs (the analytical reporting data covered in the previous blog post).

  • An SAP Cloud Platform Integration tenant, e.g. the trial version of the Integration Suite in SAP Cloud Platform.

  • An SAP HANA Cloud instance, e.g. the trial version available in SAP Cloud Platform.

  • The integration flow built in the previous blog post, which this post extends.

Now that we have access to the different systems, I will proceed to explain what we need to do to connect the SAP Cloud Platform Integration flow to SAP HANA Cloud.

  1. Deploy JDBC security material in SAP Cloud Platform Integration

  2. Update the integration flow components

  3. Modify Process Ariba response script

  4. Deploy the integration flow


The end result of the integration flow is shown in Fig. 1.


Fig. 1 - Integration flow to SAP HANA Cloud



Step 1 - Deploy JDBC security material in SAP Cloud Platform Integration


 

Go to your SAP Cloud Platform Integration instance and create/deploy 2 security materials (Monitor > Manage Security > JDBC Material). These will be used by the integration flow to communicate with SAP HANA Cloud.


Fig. 2 - JDBC material



I'm using DBADMIN for simplicity; this is not recommended for production. Sample JDBC URL: jdbc:sap://56acb719-b476-xxxx-bd11-xxxxxxxxx.hana.trial-eu10.hanacloud.ondemand.com:443/?encrypt=true. Note that the URL will vary depending on the region of your account, and that the port in the connection string is 443.
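
Before any data can be inserted, the target schema and table need to exist in SAP HANA Cloud. Below is a minimal DDL sketch; the schema, table and column names match the INSERT statements generated later in this post, but the column types and lengths are assumptions that you should adjust to your data.

-- Assumed target objects for the INSERT statements used in this blog post.
-- Column types/lengths are a guess; adjust them to your SAP Ariba data.
CREATE SCHEMA ARIBA_AR;

CREATE COLUMN TABLE ARIBA_AR.SOURCING_PROJECTS (
    REALM      NVARCHAR(50),
    PROJECT_ID NVARCHAR(50),
    TITLE      NVARCHAR(255)
);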

Step 2 - Update the integration flow components


 

Remove the Modify Content and Set Authorization components. Add a General Splitter and a JavaScript script component after the Process Ariba response script.

The integration flow should look like the one in Fig. 1. Below are the details of what each of the new components does:

  • Modify the Process Ariba response script: Instead of outputting JSON lines, the script will now output INSERT SQL statements.


importClass(com.sap.gateway.ip.core.customdev.util.Message);
importClass(java.util.HashMap);

function processData(message) {
    var messageLog = messageLogFactory.getMessageLog(message);

    // Parse the body to JSON
    var body = JSON.parse(message.getBody(new java.lang.String().getClass()));

    /* ===========
       Handle PageToken
       =========== */

    // Retrieve the PageToken from the payload if one exists
    if ("PageToken" in body) {
        messageLog.setStringProperty("PageToken", body['PageToken']);
        message.setHeader("pageToken", body["PageToken"]);
    } else {
        messageLog.setStringProperty("PageToken", "NONE!");
        message.setHeader("pageToken", "STOP");
    }

    /* ===========
       Create payload
       =========== */

    // First line is a header; the General Splitter and the
    // "Set first line as body" script handle it downstream.
    var sqlContents = "SQL Statement\n";

    var i = 0;
    var arr = body['Records'];

    for (var x = 0; x < arr.length; x++) {
        var record = arr[x];

        messageLog.setStringProperty("record", JSON.stringify(record));
        i += 1;

        // Create the INSERT statement for this record.
        // Note: values containing single quotes would need to be escaped
        // before being embedded in the SQL statement.
        sqlContents += "INSERT INTO ARIBA_AR.SOURCING_PROJECTS (REALM, PROJECT_ID, TITLE) VALUES ('myrealm-T', '" + record['InternalId'] + "', '" + record['Title'] + "');\n";
    }

    messageLog.setStringProperty("TotalRecordsProcessed", String(i));
    messageLog.setStringProperty("Query", sqlContents);

    message.setBody(sqlContents);

    return message;
}
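
As an example, if the SAP Ariba API returned two records, the script above would produce a body similar to the one below (the project IDs and titles are illustrative):

SQL Statement
INSERT INTO ARIBA_AR.SOURCING_PROJECTS (REALM, PROJECT_ID, TITLE) VALUES ('myrealm-T', 'WS12345', 'My project title');
INSERT INTO ARIBA_AR.SOURCING_PROJECTS (REALM, PROJECT_ID, TITLE) VALUES ('myrealm-T', 'WS12346', 'Another project title');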


  • General Splitter: The expression type is Line Break and its purpose is to process each INSERT SQL statement separately.



SQL Statement
INSERT INTO ARIBA_AR.SOURCING_PROJECTS (REALM, PROJECT_ID, TITLE) VALUES ('myrealm-T', 'WS12345', 'My project title');



  • Set first line as body (JS script): The splitter will output a body like the one above. The script simply sets the INSERT SQL statement (the second line) as the message body.



importClass(com.sap.gateway.ip.core.customdev.util.Message);
importClass(java.util.HashMap);

function processData(message) {
    var messageLog = messageLogFactory.getMessageLog(message);

    // Get the body and split it into lines
    var body = message.getBody(new java.lang.String().getClass());
    var lines = body.split("\n");

    // The second line contains the INSERT statement
    var sqlStatement = lines[1];

    messageLog.setStringProperty("INSERTStatement", sqlStatement);

    message.setBody(sqlStatement);

    return message;
}



  • Set the target system adapter and name: Remove the connection to the target system, create a new one and set the adapter type to JDBC. In the connection details, set the JDBC Data Source Alias, which is the name of the JDBC material deployed in Step 1, e.g. HANACloudTrial_JDBC (see Fig. 2). Set the name of the target system to something meaningful.


Now that all the steps are completed, we can deploy the integration flow and check the records created in SAP HANA Cloud.
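
For example, a simple query run from the SAP HANA Database Explorer shows the replicated records (it assumes the ARIBA_AR.SOURCING_PROJECTS table used throughout this post):

-- Check the records replicated from SAP Ariba
SELECT REALM, PROJECT_ID, TITLE
  FROM ARIBA_AR.SOURCING_PROJECTS
 ORDER BY PROJECT_ID;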


Fig. 10 - SAP HANA Database Explorer


As we can see, we have replicated the SAP Ariba data to SAP HANA Cloud. We can now use this dataset to create a report/dashboard in reporting tools that connect to SAP HANA Cloud, e.g. SAP Analytics Cloud.