Events aren't exactly new, but they still require a bit of a mindset change. Instead of point-to-point integrations and interactions, event-driven architecture works differently and needs a slightly different approach. In this blog I showcase a full end-to-end scenario for both local and remote consumption, in the hope of explaining the event framework and giving you some ideas on how to adopt it and design new, smart solutions.
An event is a significant change in state. Notification or data events can be sent from the event source to an event broker to inform about this change. Event consumers can register at the event broker to be informed about the events they are interested in.
Events aren't new in SAP either; they used to be handled (mostly) via BOR. In this blog we will look at the new event framework for RAP (ABAP RESTful Application Programming Model) Business Objects (BOs).
Let's start with the simpler example: local consumption. In my experience, not everyone seems to be aware that this is a thing for RAP BOs, but it is. Please refer to the SAP Help page for further information: Event Consumption | SAP Help.
With local consumption you don't need an event broker, because you stay within the same system. You can directly register your local consumer for a certain RAP BO event, and when the event is fired your code is called. Here's a full example.
First of all, we of course need the event we want to react to. For this we can check the SAP Business Accelerator Hub by first selecting our product, for example SAP S/4HANA Cloud, private edition. Then open the category Events - Event Objects. You should see something like this:
Here we can search for our business context, for example Sales Order (tip: always search for the main object, not, for example, for sales order item). Once we have found it, we can click on the appropriate tile:
Read the information carefully and check whether the event you actually need is present; this is important. For simplicity's sake I will use the "Changed" event (note that I am not picking the one called "Item changed"; I will get back to this later). Sadly, from here there is no direct link to the corresponding RAP BO, only the name of the object, i.e. "Sales Order" for "Sales Order Events". For me personally that is enough to figure out the RAP BO, but what if you don't know? Simple: we do one more search on the Hub. Instead of the Events tab (with your chosen product still selected), click on On-Stack Extensibility - Business Object Interfaces and search for your event object, in my case "Sales Order":
Clicking on the appropriate tile takes you to the RAP BO interface, which gives you its technical name (and shows the release state, so we can be sure we are clean core compliant). In our case that is I_SalesOrderTP. Now, when working with events, we often need the BO itself, not the interface, because interfaces do not always include events. For my demo, the RAP BO name is therefore R_SalesOrderTP (if you are confused here, I recommend learning the RAP and VDM naming conventions 😉, as I do not have the space to get into that here too).
To register a local consumer, all we need to do is create an ABAP class that specifies the RAP BO whose events it handles, in my case:
CLASS zcl_salesord_local_event_handl DEFINITION
  PUBLIC
  ABSTRACT
  FINAL
  FOR EVENTS OF R_SalesOrderTP.

  PUBLIC SECTION.
  PROTECTED SECTION.
  PRIVATE SECTION.
ENDCLASS.

CLASS zcl_salesord_local_event_handl IMPLEMENTATION.
ENDCLASS.

The most important part above is FOR EVENTS OF <name of the behavior definition of the RAP BO>. I am also going to be honest here and say that R_SalesOrderTP does not have a C1 release contract in my S/4HANA demo system, so I am forced to create this class as Standard ABAP instead of ABAP for Cloud Development, which would be the goal. Always try ABAP Cloud first.
Next we need to create a local handler class under Local Types which inherits from CL_ABAP_BEHAVIOR_EVENT_HANDLER:
CLASS lcl_so_event_cons DEFINITION INHERITING FROM cl_abap_behavior_event_handler.
  PRIVATE SECTION.
    CONSTANTS:
      " this is a local name and does not come from the RAP BO
      co_changed_event TYPE string VALUE 'SLSORDCHN' ##NO_TEXT.
    METHODS:
      consume_changed FOR ENTITY EVENT changed_instances FOR salesorder~changed.
ENDCLASS.

CLASS lcl_so_event_cons IMPLEMENTATION.
  METHOD consume_changed.
  ENDMETHOD.
ENDCLASS.

Here we define one method per event we want to listen to. In our case that is salesorder~changed, which is stated after the FOR keyword; you must specify an entity and event that actually exist in the previously mentioned behavior definition. We also give a name to the importing payload, which I called changed_instances (it could have any other name).
And here we are already. You now have a method, in my case consume_changed, that is called when the corresponding event is fired. We receive the payload of the RAP BO event and can do whatever we need to do. Important to note: events are fired only after all commits have completed successfully; this ensures that the state change really did occur. So you do not need to worry about being in the middle of the save sequence at this point, and you are free to write your code as you need.
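If the information in the event payload is not enough, you could, for example, read further data of the changed instances via EML right inside the handler. The following is only a minimal sketch and not part of the demo that follows; the mapping via CORRESPONDING and the entity alias SalesOrder are my assumptions here:

METHOD consume_changed.
  " minimal sketch (assumption): read additional header data for the
  " changed sales orders via EML, using the keys from the event payload
  READ ENTITIES OF R_SalesOrderTP
    ENTITY SalesOrder
      ALL FIELDS WITH CORRESPONDING #( changed_instances )
    RESULT DATA(order_headers)
    FAILED DATA(read_failed).

  LOOP AT order_headers INTO DATA(order_header).
    " work with the full header data here, e.g. derive follow-up actions
  ENDLOOP.
ENDMETHOD.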
For simplicity's sake in this showcase, I created a Z-table which I simply update here, to see that my local consumer is running. Let me mention right away that this is purely for demo purposes: for real scenarios the code should be written with ABAP Cloud, and direct DB access would not be allowed in such a case. Bear that in mind! So here is the really bad, but simple, demo code 😁:
METHOD consume_changed.
  DATA: ls_event_cons TYPE zevent_tst,
        lt_event_cons TYPE TABLE OF zevent_tst.

  DATA(today) = cl_abap_context_info=>get_system_date( ).

  " set data for db update
  ls_event_cons-payload    = |Local Event Handler Called|.
  ls_event_cons-eventtype  = co_changed_event.
  ls_event_cons-recorddate = today.
  ls_event_cons-recordtime = cl_abap_context_info=>get_system_time( ).
  APPEND ls_event_cons TO lt_event_cons.

  LOOP AT changed_instances INTO DATA(instance).
    ls_event_cons-payload    = |{ instance-salesorder } { instance-salesordertype } { instance-salesorganization } { instance-soldtoparty }|.
    ls_event_cons-recordtime = cl_abap_context_info=>get_system_time( ) + 5.
    APPEND ls_event_cons TO lt_event_cons.
  ENDLOOP.

  " cleanse the test data table
  DELETE FROM zevent_tst WHERE eventtype  = @co_changed_event
                           AND recorddate < @today.

  " insert new payload for testing
  INSERT zevent_tst FROM TABLE @lt_event_cons.
ENDMETHOD.

With the above code, we write at least two lines per event trigger: one with a dummy payload simply containing a string, and then one more line per changed object. In the case of the Sales Order Changed event (remember, I picked Changed, for the header, not Item Changed), this will always be just one. If we had listened to Item Changed instead, we could receive several lines in changed_instances.
Let's see if this works. Without taking tons of screenshots: I open the Manage Sales Orders Fiori app and change my sales order 402. I simply update the Customer Reference field with a new text and save. Let's check my table:
It has been updated (and you can see today's date and time - if I manage to post this blog today -- spoiler: I did not 😆), so this confirms my local consumer was called.
Since event triggering happens via background units, you won't be able to debug this directly. But you can check the monitoring transaction SBGRFCMON (bgRFC Monitor). There you should look at the inbound queue BGPF; and this is the perfect point to mention that your system admin needs to have fully set up BGPF for local event consumption to work. If something went wrong, for example an error in your code, a unit will be listed here with a warning or error flag; you can analyze the error and even start the debugger from here. If no unit is listed, everything went through.
And that's it, local consumption complete. This should be used instead of the old BTEs, where we used to write code to react to a BOR event.
For remote consumption you need an event broker. At SAP, we have SAP Event Mesh and SAP Advanced Event Mesh on SAP BTP for that purpose. For this showcase I assume you have already subscribed to SAP Event Mesh; from there on I will guide you.
As before, we can search the Business Accelerator Hub to find our event. This time let's take something less common (all demos seem to be about sales orders 😆): let's say I want to react when the status of a WBS element changes. So on the Hub I will not search for "WBS", but for "Project", because that is the encapsulating main business context:
I will go with the first one, Project Events. This time I switch to the Events Reference tab to get the full path of my event. Let's see if we can find a status changed event for the WBS element:
Perfect! That's all we need from the HUB.
I am doing the on-premise / private cloud version here, but of course you can do the same for public cloud: the following blog explains the setup for use with SAP Build Process Automation, and even without Process Automation the initial setup steps are the same: Event Mesh and SAP S/4HANA Public Cloud Setup.
Continuing with on-premise / private cloud. As mentioned before, I assume your subaccount already has an Event Mesh subscription. Navigate to your subaccount in the BTP cockpit and open Instances and Subscriptions. Filter for Event Mesh, and if you do not yet have an instance as well as a subscription, be sure to create them (if you need help with that, please follow Create Event Mesh Instance | Developer Tutorial).
Next, click on your instance and, if there is no service key yet, create one. Finally, download or view the service key through the "..." button; we will need it soon:
Now log on to your SAP S/4HANA system via SAP GUI and go to transaction /n/IWXBE/CONFIG.
Press the create button (via Service Key) to create a new channel:
In the popup, give your channel a name and paste the service key JSON that you downloaded or still have open in BTP:
The channel now appears in the list, but it is not yet active. Select your channel and click the Activate button. Wait a moment, refresh, and the status icon in the first column should turn green:
You can also click "Check connection" to be sure your system is successfully connected to SAP BTP Event Mesh.
Now that our system is connected to an event broker, we need to configure which events we want to forward to the channel we created. For this, go back to /n/IWXBE/CONFIG and select your active, green channel, then click the Outbound Bindings button in the menu. From there, click the create icon to add a new binding for our WBS element status changed event. The easiest way to do this is to come back to the path we found on the Hub: SUB ce/sap/s4/beh/project/v1/Project/WBSElmntStsChanged/v1. As the "SUB" suggests, this is the topic for subscribing to the event. Within the system we can disregard the leading "ce/" (consume event) and search for the rest:
You can also search more generally with an asterisk (*) if needed. Once the outbound binding is created, it should look something like this:
The outbound binding we just created has no filter, meaning that every time the event triggers, it is forwarded to Event Mesh. Depending on your integration patterns and strategy, maybe that is exactly what you want, and your event consumers receive their appropriate events via filtering in middleware or in the consumers themselves. But if you do not want to send everything to SAP Event Mesh, you can filter right here in the event channel configuration, before the event goes out. From the same location (your SAP Event Mesh channel - Outbound Bindings), click the Filters button in the menu. Expand the Outbound Bindings folder and select your event in the tree on the left side. You will then see the three standard filters available:
Not enough? Probably not. But fear not: you can create custom filters with derived events. Because this blog post is already long, I will not go into detail here, but with a derived event you can define your own custom payload for an SAP standard event. In your payload CDS you add the annotation @event.context.attribute: 'xsap<FIELDNAME>' to the fields you want to become filters; that is step one. The derived event is then created by extending the corresponding behavior definition (this requires a released C0 contract!) and supplying your CDS entity as the new payload. Lastly we need an event binding for our derived event, and then the configuration in /n/IWXBE/CONFIG as described before, for our custom event binding. If you now navigate to Filters, you will also see the fields you annotated and can use them as filters. I know this was very short, so you can also refer to Maintaining Filters for Outbound Event Topics | SAP Help.
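To make the payload CDS step a bit more tangible, here is a minimal, purely illustrative sketch. The abstract entity name and its fields are my own assumptions for this example; only the annotation pattern is the one named above:

@EndUserText.label: 'Illustrative payload for a derived WBS event'
define abstract entity zdme_wbs_status_changed
{
  // field names and types are assumptions for this sketch;
  // the annotation value follows the xsap<FIELDNAME> pattern described above
  @event.context.attribute: 'xsapWBSELEMENT'
  wbs_element : abap.char(24);
  @event.context.attribute: 'xsapPROJECT'
  project     : abap.char(24);
}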
With the above, our S/4HANA system is ready to send events. Now we need to receive them in SAP Event Mesh. Go back to your BTP subaccount, open Instances and Subscriptions, and this time open the Event Mesh subscription, which takes you to the Event Mesh UI. Under Message Clients you should find the instance you previously created:
Click on it, navigate to Queues and click Create Queue. Simply give the queue a name and check whether you want to change any of the default properties. I chose "project" as my name (note that the namespace you picked when creating the Event Mesh instance, nordic/em/dev in my case, is added as a prefix):
Once the queue is created, open the Actions menu and click Queue Subscriptions. Here we paste the full SUB path of our event from the Hub: ce/sap/s4/beh/project/v1/Project/WBSElmntStsChanged/v1. Before you click Add, you must prepend the namespace yourself, so in the end the topic is: nordic/em/dev/ce/sap/s4/beh/project/v1/Project/WBSElmntStsChanged/v1. And don't worry, you can't get this wrong: when you created the instance you also had to enter topic rules, and those rules make sure you won't forget the namespace, because you get an error if you try to add a topic without it. In the end, your queue subscriptions should look something like this:
I recommend testing your setup at this point. Back in your SAP S/4HANA system, do whatever you need to do to fire your event; in our case I change the status of a WBS element. In SAP S/4HANA you also have transaction /n/IWXBE/EVENT_MONITOR for monitoring events; there you will find undelivered or acknowledged events if something doesn't go as expected.
If you then go back to your Event Mesh message client and the queue you created, the event you fired should be sitting here waiting (you can see I tested a lot: right now I have 12 triggered events in my queue that no one has consumed yet):
On the Test tab you can test both event consumption and event publishing directly in SAP Event Mesh. You can use this, for example, if you don't want to go back to your SAP S/4HANA system every time you want to do a quick check; you can instead publish a message directly on Event Mesh and thus simulate your event-driven integration.
If everything is working, our event-driven architecture is ready.
To finish our end-to-end scenario, we need a consumer. For this showcase I chose SAP Build Process Automation (SBPA for short). We will receive the WBS element status changed event from SAP Event Mesh, and then a process starts in which I read more detailed information about that WBS element and then potentially create a sales order (if certain rules are met) with an item that uses this WBS element (please bear in mind this is a demo scenario, trying to be a bit out of the box here; basically you could have any kind of process or approval workflow at this point).
I won't go into details on how to subscribe to SBPA or how to create an instance with a service key; I assume this is ready. But all three are prerequisites: a subscription to SBPA, an instance, and a service key on the instance.
First and foremost, SAP Build needs to be able to react to the event. In your SAP Build lobby, open the Connectors menu and click on Events, then click Create. Personally I find it easiest to use the SAP Business Accelerator Hub again to find the same event, or the Unified Customer Landscape if you are using that. Alternatively, you can upload the event specification (for example if you have a derived event or another custom event). For this showcase I go with the Hub, so I click on that. A search appears, and from our previous search we know we want "Project", so I enter that (I do not recommend including the string "events" in your search, since many objects have that string in their name and the result list may become too big to find the one you actually want):
In the lower right corner is our event object; click on it. On the next screen you can see all the events offered in that object. If everything looks good, click Add. Open your new events project and you will see once more all the events we can subscribe to. The last step is to release and publish this events project so it can be used across SAP Build.
Because notification events are very small, we usually need APIs to retrieve some extra information. In our case we also need an API to finally create the sales order. In SAP Build we do this by creating action projects. Once more, I won't go into great detail here; this has been explained many times over, and an example can be found here: Create Sales Order Action Project in SAP Build.
For my specific case I would need 2 action projects:
To find the appropriate APIs, I can again use the Hub, among other options; I have a full blog post about this: How to find/create suitable APIs.
Now that the event I want to react to and all my needed APIs are published in SAP Build, I can create my process.
Go back to the lobby and create a new project: choose Build an Automated Process and then Process. Give your project a name and then also name the process. In my case I chose "Sales from PS Project" as my project name and "Start from WBS Element Status Changed" as my process name (my general recommendation is to choose good names that clearly state what the artifact does).
In my process, I now add an event trigger by clicking the "Add a Trigger" link on the trigger element and then selecting Wait for an Event:
In the next popup I select my events project and, lastly, the exact event (in my case the WBS element status changed event) that should trigger this process. After this I created my process steps, as shown below (explanation also below):
Let's go through the above step by step:
As the very last step, we need to get back to SAP Event Mesh. Right now, SAP Build is aware of the Project event object and we have created a trigger for one of its events, but so far we have not connected SAP Build with SAP Event Mesh, meaning nothing yet calls our trigger. For this we need a webhook: an API for SAP Event Mesh to call when a certain topic is received in the event broker. This is the actual trigger of SBPA and our process.
Before we open Event Mesh, we need the webhook URL, i.e. the API that Event Mesh must call to reach your SBPA instance. Similar to how we retrieved the service key for SAP Event Mesh, we now need to do the same for SBPA. Go to your subaccount and Instances and Subscriptions; this time filter for SAP Build Process Automation and find your instance:
Click on your instance and view your service key (create one first if none is available). Keep this info open as we proceed to Event Mesh.
In another tab or window, go to your SAP Event Mesh message client, click on the Webhooks tab and choose Create Webhook. Give it a name and select your queue; leave the other fields as they are. Now, for the webhook URL, go back to your SBPA service key. One of the very first parts is the endpoints-api:
Note: your URL may look different! Copy the URL from your own service key.
Go back to the Create Webhook dialog and paste the URL into the Webhook URL field, then add the following at the end of the URL: /internal/be/v1/events
Next, for Authentication, choose OAuth2ClientCredentials and enter the following info, again taken from the SBPA service key:
Save the webhook. Now you can trigger the handshake between SAP Event Mesh and SBPA to check that you entered everything correctly. Finally, make sure your subscription status is active and everything is running. It should look something like this:
And now we are done: we have an event-driven automation that takes care of creating sales orders for us.
Events can be used in many ways, allowing us to react when something has happened, and then we can do almost anything: run some code locally in our system, have a bot working for us (as in my case here, though this bot could take many different forms), trigger other integrations to inform them of the change, trigger workflows, and much, much more. So next time you want "something to happen after something happened", check out the new event framework.
Hope this helped and let me know if you have any questions.