When I was a student, life was simpler. Most of the data was sent to me in an Excel or a CSV file by a PhD student. Of course, we then explored the data, did a lot of statistics, and created a machine learning model. To show the results, the normal plot function was enough, and I didn’t have to think about deployment or process integration. If that sounds familiar to you, let me know in the comments 🙂 Now, to deploy my work into the business I face a broad range of challenges. First, the results need to be consumable for different users and fit into the business process. Further, additional processes need to be triggered or even changed based on the results. In this blog post I want to show you how life can still be simple when we have the right tools at hand.

Therefore, we will use two very powerful solutions available in the SAP Business Technology Platform: the Kyma Runtime and the SAP Integration Suite. The Kyma Runtime gives us a lot of flexibility to create extensions or deploy Docker images containing, for example, machine learning models or custom Python and R functions. Through the SAP Integration Suite, specific processes can be created which make use of the functions developed in the Kyma Runtime. Sounds complex? Let me guide you through it.

Imagine a use case in which we want to find fraudulent transactions through machine learning. A standalone machine learning model is useless if it is not integrated into business processes. Our goal is to embed this machine learning logic directly into a process, so that the execution depends on the prediction. For instance, we might want to stop a transaction if the probability of a fraudulent case is above a certain threshold. Further, an employee should get a notification with a collection of the predicted fraud cases. Of course, we want to automate this as much as possible so that the false positives don’t blow up the mailbox. In conclusion, we need to orchestrate the usage of this machine learning model. Hence, let’s take a look at a simple example of how a data scientist can hand over their machine learning model deployed in the Kyma Runtime to a developer in the SAP Integration Suite.

What will you learn in this Hands-On tutorial?

  1. Set up the REST API and Authentication in the Kyma Runtime

  2. Set up the Authentication in the SAP Integration Suite

  3. Create a first process in the SAP Integration Suite


What are the requirements?


  1. Set up the REST API and Authentication in the Kyma Runtime


Let’s start in the Kyma Runtime, where the deployed machine learning model can be consumed through a REST API. In the following, you will set up the REST API protected through an OAuth2 authentication.

First, move to the namespace in which you deployed your Docker image or function and set up the authentication for the API rule.


On the left choose “OAuth Clients” and create a new one.


Give it a name and choose “ID token” as well as “Client credentials”. Further, enter “read” as the value for the Scope. Then click “Create”.

Please take note of the decoded secret and Client ID. You will need these credentials in the second step 🙂


Now, move to “API Rules” and create a new one. The API rule will allow us to get predictions from the deployed machine learning model on the fly.


Provide a name and a hostname for your API rule. In addition, choose OAuth2 as the access strategy and find your service. Then choose “Create”.


As the last step in the Kyma Runtime, save the certificate of the website as a file. On Windows in Google Chrome, click the lock icon next to the URL, choose “Certificate” and then “Copy to File”.
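If you prefer to script this step instead of clicking through the browser, a minimal Groovy sketch like the following exports the same DER-encoded certificate. The hostname is a placeholder for the host of your API rule:

import javax.net.ssl.SSLSocketFactory

// Placeholder host; replace with the hostname of your API rule
def host = "<your REST API host>"
def socket = SSLSocketFactory.getDefault().createSocket(host, 443)
socket.startHandshake()
// The first entry in the chain is the server certificate
def cert = socket.session.peerCertificates[0]
// Write it DER-encoded, just like “Copy to File” would
new File("kyma.cer").bytes = cert.encoded
socket.close()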


Perfect! You already finished the first step of this Hands-On tutorial in the Kyma Runtime.

  2. Set up the Authentication in the SAP Integration Suite


In the second step you will set up the authentication in the SAP Integration Suite, so that you can use the API rule created in the Kyma Runtime in an Integration Flow. Hence, move to your SAP Integration Suite and choose the Cloud Integration Scenario.



First, move to the Monitor area, where you can add the certificate and the authentication credentials. Under “Manage Security”, go to “Keystore”.


Choose “Certificate” under “Add” on the right.


Provide a name and browse for the Kyma certificate. Then press “Add”.


Next, add the OAuth Credentials. Move to “Manage Security Material”.


Click “Create” and choose “OAuth 2 Client Credentials”.



Provide a name and a description. Then add the Token Service URL, which has the following structure:
https://oauth2.<Domain>/oauth2/token

Further, copy the decoded Client ID as well as the secret from the Kyma Runtime into the corresponding fields. Change the Client Authentication to “Send as Request Header”. Check “Include Scope” and write “read” into the Scope. Set the Content Type to “application/x-www-form-urlencoded”. Then click “Deploy”.
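If you want to sanity-check the credentials before wiring them into an Integration Flow, you can request a token yourself. The following minimal Groovy sketch uses placeholder values for the domain, Client ID, and secret, and mirrors the settings above (client credentials grant, scope “read”, credentials sent as a request header):

import groovy.json.JsonSlurper

// Placeholder values; use your Kyma domain and the OAuth client from step 1
def tokenUrl = new URL("https://oauth2.<Domain>/oauth2/token")
def clientId = "<Client ID>"
def clientSecret = "<Secret>"

def conn = tokenUrl.openConnection()
conn.requestMethod = "POST"
conn.doOutput = true
conn.setRequestProperty("Content-Type", "application/x-www-form-urlencoded")
// Send the client credentials as a request header, matching the setting above
def basicAuth = (clientId + ":" + clientSecret).bytes.encodeBase64().toString()
conn.setRequestProperty("Authorization", "Basic ${basicAuth}")
conn.outputStream.withWriter { it << "grant_type=client_credentials&scope=read" }

// The JSON response contains the access token
def token = new JsonSlurper().parse(conn.inputStream).access_token
println "Received token: ${token}"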


Great! You finished the second step successfully 😃

  3. Create a first process in the SAP Integration Suite


Now, let’s tackle the final step and bring everything together in a first Integration Flow. To do so, move to the Design area and create a new package.



In the package move to Artifacts and add a new Integration Flow.


Give the Integration Flow a name & ID and click “OK”.


Click on your Integration Flow.


Remove the start message in your Integration Flow.



Further, add a Timer as the start for the Integration Flow.


Connect the “Start Timer” with the “End” event.


Under External Call choose the “Request Reply” operator and add it to the Integration Flow.


Further, drag & drop a “Receiver” under the Integration Flow.


Then connect the Request Reply with the Receiver.


Choose “HTTP” as the Adapter Type.


Open the HTTP configuration to add the connection details.


Add the “Address” as well as the “Query” separately into the corresponding fields.
Address: <your REST API>/predict

Query: c=0&a=4900&obo=10000&nbo=5100&obd=1000&nbd=5900&dl=1

The query contains a new observation, which will be sent to the machine learning model through the GET request. The result is a prediction for this new observation, which will be incorporated into the message payload as a string. You can change the values of the transaction in the query to get different predictions.
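Under the hood, the “Request Reply” step performs an ordinary authenticated GET request. A minimal Groovy sketch of the equivalent call, with a placeholder address and token, looks like this:

// Placeholder address; use the API rule host from step 1
def query = "c=0&a=4900&obo=10000&nbo=5100&obd=1000&nbd=5900&dl=1"
def apiUrl = new URL("https://<your REST API>/predict?" + query)

// Placeholder token; fetch it as in the sketch from step 2
def token = "<access token>"

def call = apiUrl.openConnection()
call.setRequestProperty("Authorization", "Bearer ${token}")

// The model answers with a sentence containing the prediction
println call.inputStream.text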

To finish the configuration, check “Send Body” and choose “OAuth2 Client Credentials” as the Authentication. Provide your “Credential Name”.


Now, add a “Groovy Script” after the “Request Reply” into your Integration Flow.





Click on the “Groovy Script” step and choose “Create”.


By default, a template script is provided to work with the process data.


Of course, you now have a lot of flexibility in working with the message. Recall that the application returns the following string after creating a prediction:
'The predicted result for the observation ' + str(observation) + ' is: ' + str(prediction)

In the following script, the prediction is extracted from this string by slicing the payload between index positions 93 and 95.
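A minimal sketch of this extraction, using the standard Cloud Integration script signature, could look as follows. Note that the index positions match the example sentence above and would shift for observations of a different length:

import com.sap.gateway.ip.core.customdev.util.Message

def Message processData(Message message) {
    // Read the string returned by the REST API
    def body = message.getBody(java.lang.String) as String
    // Extract the prediction between index positions 93 and 95
    // of the returned sentence
    def prediction = body.substring(93, 95)
    message.setBody(prediction)
    return message
}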


Save the script and then deploy the Integration Flow.


Move to the Monitoring area and enable tracing for the Integration Flow to get more information after the execution.


Choose “All” under “Manage Integration Content”. Click on your Integration Flow.


Change the Log Configuration to “Trace”.


Please deploy your Integration Flow again.


Then choose “All Integration Flows” under Monitor Message Processing.


Click on the newest deployment of your Integration Flow and choose “Trace”.



Choose the “End” of your Integration Flow and click on “Message Content”. Under “Payload” we find our prediction for the observation. A value of zero means that the machine learning model predicts the transaction to be non-fraudulent.


Congratulations! You successfully extracted the prediction incorporated in the message body. You now have the basis to extend the Integration Flow as needed for your business. For example, you could persist the prediction in SAP HANA Cloud through a JDBC connection, create custom notifications, or make the whole Integration Flow more dynamic. In addition, there are many prepared content packages available in the Discovery Center. If you want to explore more, the following blog posts and tutorials really helped me to get started:

I want to thank gabbi, svenhuberti and sarah.detzler for their support while writing this Hands-On tutorial.

Cheers!

Yannick Schaper

 