Technology Blogs by Members
Hello Folks,

Here is one more interesting blog to share knowledge and experience with you, based on test cases. Let's follow the topics and instructions as we go through the reading.


  1. Introduction

  2. Event Driven Architecture (EDA)

  3. Scenario and Integration Perspective

  4. Iflow - SAP CPI

  5. Important detail - Don't be afraid, GROOVY it!!!

    1. XSLT Details

What will I not cover in this blog?

  • The whole setup of S/4HANA (connections and ABAP code).

  • The configuration of SAP EM (queue, webhook, and further details).

  • The whole setup of the Iflow - just the important parts.


What I would like to share is how Event-Driven Architecture works and how it can be a game changer in the new perspective of integration definitions, with a new model focused on simple integrations based on microservices.

SAP Integration Suite (EM and CPI) can support your business in this perspective, and we will fully explore one scenario that I described in my book - SAP Enterprise Message - which I wrote together with co-author Former Member.

The focus of this blog is to continue my previous blog on using SAP Enterprise Message with the webhook mechanism (API PUSH), as you can see here - SAP EM – Webhook mechanism – SAP CPI.

Basically, whenever the material undergoes any change of state in its data, such as creation or modification (update/delete), an event is triggered automatically from S/4HANA using a Z Event. (This is not a custom event from the Event Enablement Add-On, because the VDI test system I use for development runs version 1809, which does not support that customization.)

Event Driven Architecture (EDA):

Put very simply, event-driven architecture (EDA) defines an event as a "significant change in state".

Event-driven architecture is a software design pattern in which decoupled applications can asynchronously publish and subscribe to events via an event broker (modern messaging-oriented middleware).

SAP provides the Event Enablement Add-On in ECC and S/4HANA (Cloud and On-Premise) to support you with that.

In case you want to explore and understand more about this architecture, I recommend the book Event-Driven Architecture with SAP.
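To make the publish/subscribe idea concrete, here is a minimal, hypothetical in-memory broker sketch in Groovy. This is not SAP EM - just an illustration of how publishers and subscribers stay decoupled (all names are made up):

```groovy
// Hypothetical in-memory broker: topics map to lists of subscriber closures.
def subscribers = [:].withDefault { [] }

def subscribe = { String topic, Closure handler -> subscribers[topic] << handler }
def publish   = { String topic, Map event -> subscribers[topic].each { it(event) } }

// Two decoupled consumers register interest in the same event type.
def auditLog = []
def cacheLog = []
subscribe('BO.Product.Changed') { evt -> auditLog << evt.Product }
subscribe('BO.Product.Changed') { evt -> cacheLog << evt.Status }

// The producer only knows the topic, never who is listening.
publish('BO.Product.Changed', [Product: '15', Status: 'M'])

assert auditLog == ['15']
assert cacheLog == ['M']
```

The producer and the consumers never reference each other directly - that decoupling is exactly what the event broker buys you.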

Scenario and Integration Perspective

Now that you understand the potential of SAP Integration Suite (EM and CPI), let's briefly recap both services.

SAP EM is a cloud event broker that handles the exchange of events and messages.

SAP CPI is an iPaaS tool offering a wide range of capabilities: security, connectivity, various messaging protocols, and rapid development and deployment through industry-standard prepackaged integration content.

Integration Perspective:

The scenario proposed in this blog: as mentioned above, whenever a "significant change in state" of the material happens in the backend S/4HANA system, the Z Event is triggered automatically to SAP EM, and via webhook (push) the message reaches SAP CPI. CPI is then responsible for routing based on the STATUS of the material (M or C): it reads the API_PRODUCT_SRV API, retrieves the values, applies a filter using XSLT, and sends the result to a single receiver or multicasts it to several, depending on the plant(s) the material belongs to.
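As a hedged sketch (the helper and field names are illustrative, not the actual Iflow code), the routing decision described above could look like this in Groovy:

```groovy
import groovy.json.JsonSlurper

// Hypothetical helper: derive routing details from the ZEvent payload.
// Status M/C drives the route; the Plants list decides single call vs. multicast.
def decideRoute(String eventJson) {
    def data = new JsonSlurper().parseText(eventJson)
    def plants = (data.Thirdparty?.Plants ?: '').tokenize(',')
    [status: data.Status, plants: plants, multicast: plants.size() > 1]
}

def route = decideRoute('{"Product":"000000000000000015","Status":"M","Thirdparty":{"Plants":"T191,S039"}}')
assert route.status == 'M'
assert route.multicast
assert route.plants == ['T191', 'S039']
```

A material registered in two plants produces a multicast route; a single plant would yield a single call.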

Iflow - SAP CPI

Basically, I will now present the Iflow for routing and explain some details of this perspective of sharing the material's change of state with other systems.

As you can see, there are many local processes. Yes, I decided to structure it like this to mitigate error handling in case of connection problems and to keep the Iflow cleaner.

  1. Process Direct

  2. Get the product from the JSON event and save it as a property

    1. The ID will be used for the API GET read

  3. Local process calling API_PRODUCT_SRV

  4. Exception handling, in case of errors

  5. Call the OData API

  6. OData exception handling, in case of errors

  7. XSLT to filter language and plant

  8. Groovy to set up the routing details

  9. Routing - single call or multicast delivery, in case the material is registered in more than one plant

  10. In case of single delivery or multicast

  11. In case of delivery to External System 1

    1. 14 - Call the local process for the token first

    2. 15 - Exception handling, in case of errors

    3. 16 - Retrieve from memory the result of the OData API call to build the final message, with the access token in the header, in the next local process

    4. 17 - Groovy mapping - XML to JSON

    5. 18 - API call

    6. 19 - Exception handling, in case of errors

  12. Local process - System 2

    1. 14 - Groovy mapping - XML to JSON

    2. 15 - Call the web service

    3. 16 - Exception handling, in case of errors

  13. Local process - System 3

    1. 14 - Groovy mapping - XML to JSON

    2. 15 - Call the web service

    3. 16 - Exception handling, in case of errors

  14. Multicast for process systems 2 and 3

    1. Because the material is registered in the same plant in the backend system
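The "Groovy mapping - XML to JSON" steps above can be sketched like this (the payload and element names are invented for illustration; a real Iflow script would read the body from the Message object):

```groovy
import groovy.json.JsonOutput
import groovy.xml.XmlSlurper   // on Groovy 2.x this class lives in groovy.util

// Map a small XML payload (e.g. the filtered OData result) to JSON.
def xml = '''<Product>
  <ProductId>15</ProductId>
  <Description>Trad.Good 14,PD,Bought-In,H14</Description>
</Product>'''

def root = new XmlSlurper().parseText(xml)
def json = JsonOutput.toJson([
    ProductId  : root.ProductId.text(),
    Description: root.Description.text()
])

assert json.contains('"ProductId":"15"')
assert json.contains('"Description":"Trad.Good 14,PD,Bought-In,H14"')
```

Building an explicit map before serializing keeps the outgoing JSON shape under your control instead of mirroring the XML structure blindly.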

The XML result from the API:

As you can see, the API does a good job, but not exactly what is needed. You could try to use $expand and $filter, but the filter value here is a list, and that is a problem: combining $expand over several OData entities with $filter in this case generates the famous error:

Because of that, I decided to go for an XSLT filter, which also uses the PLANT-related property saved by the first Groovy script of this Iflow.

Important detail - Don't be afraid, GROOVY it!!!

I highly recommend bringing GROOVY into your life, and for that you should order and read the SAP PRESS E-Bite from @engswee.yeoh and @Vadim.klimov - Developing Groovy Scripts for SAP Cloud Platform Integration.

As you can see, for all the mapping details and for reading properties and headers, I decided to use Groovy.

XSLT details and first Groovy script:

The first Groovy script in the flow is responsible for parsing the JSON, removing the leading zeros of PRODUCT, and creating a property CENTRO (plant):
import com.sap.gateway.ip.core.customdev.util.Message
import groovy.json.JsonSlurper

def Message processData(Message message) {
    Reader reader = message.getBody(Reader)
    def json = new JsonSlurper().parse(reader)

    String product = json.product
    String test1 = json.ThirdParty.test1
    String test2 = json.ThirdParty.test2
    String test3 = json.ThirdParty.test3

    // Remove leading zeros but keep a lone zero.
    message.setProperty("CodeProduct", product.replaceFirst(/^0+(?!$)/, ""))

    // Flag each third-party receiver that is marked in the event.
    if (test1 != null && !test1.isEmpty()) {
        message.setHeader("test1", "X")
    }
    if (test2 != null && !test2.isEmpty()) {
        message.setHeader("test2", "X")
    }
    if (test3 != null && !test3.isEmpty()) {
        message.setHeader("test3", "X")
    }

    // Save the plant list as property CENTRO for the XSLT filter.
    message.setProperty("CENTRO", json.Plant)

    return message
}

Standard Event and ZEvent

{
  "eventType": "BO.Product.Changed",
  "cloudEventsVersion": "0.1",
  "source": "https://sap.corp",
  "eventID": "Aop3xCdEHtuvrK3F3Izrug==",
  "eventTime": "2021-05-25T14:28:10Z",
  "schemaURL": "https://sap.corp/sap/opu/odata/IWXBE/BROWSER_SRV/",
  "contentType": "application/json",
  "data": {
    "KEY": [
      { "PRODUCT": "000000000000000015" }
    ],
    "Product": "000000000000000015",
    "Status": "M",
    "Thirdparty": {
      "Test1": "X",
      "Test2": "X",
      "Plants": "T191,S039"
    }
  }
}

The values inside KEY form an array; if you don't handle that, the values will be extracted as:

  • 000000000000000015

  • Status - M or C
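Since KEY is an array, you have to index into it before reading PRODUCT - a small sketch with JsonSlurper (field names follow the sample event above):

```groovy
import groovy.json.JsonSlurper

def event = '''{"data": {"KEY": [{"PRODUCT": "000000000000000015"}], "Status": "M"}}'''

def json = new JsonSlurper().parseText(event)
// KEY is an array of objects: take the first entry, then read PRODUCT.
def product = json.data.KEY[0].PRODUCT

assert product == '000000000000000015'
assert json.data.Status == 'M'
```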

Regex in Groovy to remove leading zeros:

  • No extra import is strictly needed here: replaceFirst is a method of java.lang.String. The line import java.util.regex.*; is only required if you work with Pattern/Matcher directly.

  • Code: .replaceFirst(/^0+(?!$)/, "")
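A quick check of the leading-zero regex - the (?!$) lookahead keeps a lone zero from being erased entirely:

```groovy
// replaceFirst is a java.lang.String method, so no extra import is required.
def strip = { String s -> s.replaceFirst(/^0+(?!$)/, '') }

assert strip('000000000000000015') == '15'
assert strip('0000') == '0'        // the last zero survives thanks to (?!$)
assert strip('15') == '15'         // nothing to strip
```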

The param Plants is read from the property saved earlier.

Now let's discuss the XSLT that trims the generic result of the API_PRODUCT_SRV call. The API returns every detail of the material, but the important part is to filter by language and by the plants the material belongs to, so that no wrong message is sent to a system that should not receive those details.

I decided to implement this logic on the ABAP side, as you can see in the ZEvent:

  • ThirdParty - who should receive this ZEvent.

  • Plants - used to filter and exclude entries from the API call result.

<?xml version="1.0" encoding="UTF-8" ?>
<xsl:transform xmlns:xsl="http://www.w3.org/1999/XSL/Transform" version="1.0">
  <xsl:output method="xml" omit-xml-declaration="yes" encoding="UTF-8" indent="yes" />
  <xsl:param name="Plants"/>
  <xsl:strip-space elements="*"/>

  <!-- Identity template: copy everything that is not filtered out below -->
  <xsl:template match="@*|node()">
    <xsl:copy>
      <xsl:apply-templates select="@*|node()"/>
    </xsl:copy>
  </xsl:template>

  <!-- Drop product descriptions in any language other than EN -->
  <xsl:template match="A_ProductDescriptionType[Language!='EN']"/>

  <!-- Drop plant entries not contained in the Plants property -->
  <xsl:template match="A_ProductPlantType[not(contains($Plants,Plant))]"/>
</xsl:transform>



Why did you add this checking logic in ABAP?

The point is: when you call API_PRODUCT_SRV based on the product code that comes in the JSON ZEvent, the API returns all values related to this material, and that is a problem - it comes with all languages installed in the system and all plants the material belongs to. THIS IS NOT WRONG, but it is not functional for SAP CPI to really determine which systems must receive this update of the material (create or change).

Because of that, I decided to push this logic to the ABAP side: whenever the material undergoes any change of state in its data, it provides me the third-party systems as a list that is used for routing.

Take a look at a sample of the result from the API:
<?xml version="1.0" encoding="UTF-8"?>
<ProductDescription>Handelsware 14, PD, Zukauf, H14</ProductDescription>
<ProductDescription>Trad.Good 14,PD,Bought-In,H14</ProductDescription>
<ProductDescription>TESTE OPERADOR 2021</ProductDescription>
<ProductDescription>Mercadería 14, PD, comprado, H14</ProductDescription>

I believe that, seeing this result, you are able to understand the problem.

With the XSLT filter described above, we select only language EN and the specific plants from the ZEvent - M016 and T191 - so, as you can see in the result, plants S039 and T161 are out.
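You can try this filter outside CPI with plain JVM tooling - a hedged sketch using javax.xml.transform with a tiny made-up payload (the real OData result is much larger):

```groovy
import javax.xml.transform.TransformerFactory
import javax.xml.transform.stream.StreamSource
import javax.xml.transform.stream.StreamResult

// Same filtering idea as the Iflow XSLT: keep EN descriptions and listed plants.
def xslt = '''<?xml version="1.0" encoding="UTF-8"?>
<xsl:transform xmlns:xsl="http://www.w3.org/1999/XSL/Transform" version="1.0">
  <xsl:output method="xml" omit-xml-declaration="yes"/>
  <xsl:param name="Plants"/>
  <xsl:template match="@*|node()">
    <xsl:copy><xsl:apply-templates select="@*|node()"/></xsl:copy>
  </xsl:template>
  <xsl:template match="A_ProductDescriptionType[Language!='EN']"/>
  <xsl:template match="A_ProductPlantType[not(contains($Plants,Plant))]"/>
</xsl:transform>'''

def xml = '''<Product>
  <A_ProductDescriptionType><Language>EN</Language></A_ProductDescriptionType>
  <A_ProductDescriptionType><Language>DE</Language></A_ProductDescriptionType>
  <A_ProductPlantType><Plant>T191</Plant></A_ProductPlantType>
  <A_ProductPlantType><Plant>S039</Plant></A_ProductPlantType>
</Product>'''

def transformer = TransformerFactory.newInstance()
        .newTransformer(new StreamSource(new StringReader(xslt)))
transformer.setParameter('Plants', 'M016,T191')   // same role as the CPI property

def out = new StringWriter()
transformer.transform(new StreamSource(new StringReader(xml)),
                      new StreamResult(out))
def result = out.toString()

assert result.contains('T191')                      // kept: plant is in the list
assert !result.contains('S039')                     // removed by the plant filter
assert !result.contains('<Language>DE</Language>')  // removed by the language filter
```

In CPI the Plants parameter would come from the property saved by the first Groovy script; here it is passed via setParameter.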

To solve this issue, we created the Z custom event in S/4HANA.

The solution was, before sending the ZEvent, to check the tables MARA, MARM, and MARC, regardless of whether the material only had a change in its description. This logic guarantees that SAP CPI will always know which third-party systems must receive the data.

Regardless of which material master view was affected.

I really hope that you enjoyed the read and that you start thinking forward about event-driven architecture with SAP products, together with SAP OData APIs from S/4, to support you better and change the classic integration approach.


Kind regards,

