Technology Blogs by Members
In Part-1 of this Splunk blog series, we saw how to use the Splunk HTTP Event Collector (HEC) and the standard JSON source type to log SAP API interactions to Splunk Cloud. You can follow the same steps to implement real-time JSON HTTP event logging from SAP CPI during interface message exchange (for example, from an Exception Subprocess). I won't cover this here since it follows directly from Part-1: simply replace the SAP APIM step with an HTTP Request/Reply call in SAP CPI.

In this Part-2, we will see how to bring the standard SAP CPI Message Processing Log (MPL), which in most cases holds enough information about runtime execution history, into Splunk Cloud and benefit from it.

Power of MPL

The SAP CPI Message Processing Log contains structured information on the processing of a message. Read here about the MPL properties and their descriptions. Certain properties are not set by default and need to be handled from the interface, as listed below.

Property               | How to Set                    | Purpose
CustomHeaderProperties | From a Groovy script          | Save values from the payload in addition to the Application ID
Id                     | From SAP_ApplicationID header | Save values like the IDoc number or Customer / BP number exchanged in that interface
MessageType            | From SAP_MessageType header   | Business object of the interface
ReceiverId             | From SAP_Receiver header      | Receiver application of the interface
SenderId               | From SAP_Sender header        | Sender application of the interface

These are optional but having an MPL enriched with these properties will make it complete.
//add a custom header property from a Groovy script step
def Message processData(Message message) {
    def messageLog = messageLogFactory.getMessageLog(message)
    // the name/value pair appears under Custom Header Properties in the MPL
    messageLog.addCustomHeaderProperty("name", "value")
    return message
}


Logging MPLs to Splunk involves the following steps:

  1. Create a new Source Type in Splunk for SAP CPI MPL

  2. Create an Index in Splunk for SAP CPI

  3. Create an HEC in Splunk for SAP CPI

  4. Implement a Scheduled IFlow to extract MPL and log to Splunk

1 Splunk - Create Source Type

The source type controls how Splunk formats incoming data and indexes it with appropriate timestamps and event breaks. This facilitates easier searching of the data later. Splunk comes with a large number of predefined source types. JSON is one such predefined source type and was used to log API interactions as shown in the previous blog. JSON has a fixed structure, and the event timestamp is taken from the "time" name/value pair.

However, for the MPL, we will create a new Source Type so we can send the MPL's JSON representation as-is into Splunk and use the MPL LogStart value as the event timestamp in Splunk.

To create a new Source Type, open Settings --> Data --> Source Types --> New Source Type.
Create a Source Type scpi_mpl as shown below:
Indexed Extractions : json
Timestamp format : %Y-%m-%dT%H:%M:%S.%3Q
Timestamp fields : LogStart
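If you prefer configuration files over the UI, the same source type can be expressed as a props.conf stanza. A minimal sketch, with the stanza name and attribute values taken from the settings above:

```ini
# props.conf -- source type for SAP CPI Message Processing Logs
[scpi_mpl]
INDEXED_EXTRACTIONS = json
TIME_FORMAT = %Y-%m-%dT%H:%M:%S.%3Q
TIMESTAMP_FIELDS = LogStart
```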

2 Splunk - Create Index

Create an Index called scpi_dev as shown below. For detailed steps read Part-1 of this blog.

3 Splunk - Create HTTP Event Collector

Create an HEC for SAP CPI as shown below. Enable and retrieve the HEC token after creation.
Again, for a detailed step-by-step guide read my previous blog.

4 SAP CPI - IFlow to Extract MPL and Index to Splunk

Create an IFlow with the below steps:

Scheduler to run the IFlow periodically
Read the last run timestamp from a local variable and set it as the start time for extraction
Set the current timestamp as the end time for extraction

Make a looping Process Call. Server-side pagination limits results to 1000 entries per API call, hence we loop through the result:

  • Read the MPL from the OData API.
    URL : https://<tmn_host>/itspaces/odata/api/v1
  • Convert XML to JSON
  • Extract the MPL array list from the OData response and set it as the message body (Groovy script given below)
  • Set the Splunk HEC token as the Authorization header
  • Call the Splunk raw event API with a channel identifier and event metadata as query parameters

Overwrite the local variable with the current run timestamp from the property variable
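The extraction-and-forward loop above can be sketched outside CPI as well. Below is a minimal Python sketch using only the standard library; the hostnames, the token, the `MessageProcessingLogs` entity query with `$filter`/`$top`/`$skip`, and the helper function names are illustrative assumptions, not taken verbatim from the IFlow:

```python
import json
import urllib.parse
import urllib.request

PAGE_SIZE = 1000  # server-side pagination limit per API call


def build_mpl_url(tmn_host: str, start: str, end: str, skip: int) -> str:
    """OData query for MPLs with LogStart between the last run and now (assumed filter)."""
    query = urllib.parse.urlencode({
        "$filter": f"LogStart ge datetime'{start}' and LogStart lt datetime'{end}'",
        "$top": PAGE_SIZE,
        "$skip": skip,
        "$format": "json",
    })
    return f"https://{tmn_host}/itspaces/odata/api/v1/MessageProcessingLogs?{query}"


def build_hec_request(splunk_host: str, token: str, channel: str,
                      events: list) -> urllib.request.Request:
    """Raw-event HEC call with channel id and event metadata as query parameters."""
    query = urllib.parse.urlencode({
        "channel": channel,
        "sourcetype": "scpi_mpl",
        "index": "scpi_dev",
    })
    return urllib.request.Request(
        f"https://{splunk_host}:8088/services/collector/raw?{query}",
        data=json.dumps(events).encode("utf-8"),
        headers={"Authorization": f"Splunk {token}"},
        method="POST",
    )
```

A caller would fetch pages with `skip` increasing by `PAGE_SIZE` (posting each batch via `urllib.request.urlopen`) until a page returns fewer than 1000 entries, then persist the run timestamp, mirroring the local-variable handling in the IFlow.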

import com.sap.gateway.ip.core.customdev.util.Message
import groovy.json.*

def Message processData(Message message) {
    // parse the output of the XML-to-JSON converter
    def body = message.getBody(java.lang.String)
    def inputJSON = new JsonSlurper().parseText(body)
    // keep only the MPL array and set it as the new message body
    def builder = new JsonBuilder(inputJSON.MessageProcessingLog)
    message.setBody(builder.toString())
    return message
}


The MPL entries are sent periodically as raw data to Splunk and indexed. Use these events to:

  • Write reports and produce dashboards of interface runtime status.

  • Search MPLs in Splunk for analysis. View as a table and download.

  • Send alerts for events that meet a certain condition, like failed messages or no message for a certain interface today. Alerts can be sent as an email, a webhook call, a JIRA ticket, etc.
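As an example of the search side, here is a sketch of an SPL query that counts failed messages per interface over the last day. The field names Status and IntegrationFlowName are assumptions based on the MPL OData properties; adjust them to match the JSON actually ingested:

```
index=scpi_dev sourcetype=scpi_mpl Status=FAILED earliest=-24h
| stats count by IntegrationFlowName
```

A saved search like this can back both a dashboard panel and an alert condition.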

Once there is data, the result is purely dependent on the capability of the platform and our creativity in utilizing it 🙂