Technology Blogs by Members
This blog is part of a blog series, so you can find the first page here. This is the agenda we're following:

Automated Testing

Coming from a Java development background, automated testing was always part of development for me. I was very surprised to realize that there are no standard tools for automated testing of your iflows. In the scope of integration, I'm sure there are many types of tests you can run, but three come to mind immediately:

  1. E2E testing of your integration using mocked scenarios

  2. Performance Tests

  3. Unit testing (main focus on this page)

E2E Testing

General Tools

Regarding E2E testing, there are several general testing products (e.g. Tricentis Tosca, HP ALM) that do a good job of letting you supply your interfaces with predetermined data and assert the outputs of your API. These tools work perfectly fine if your interface is isolated, with no dependencies on external adapters, meaning it is totally under your control. However, if your interface calls an external API internally and you need to mock it in your test, you run into a problem: these products can't interfere with the logic inside your iflows by mocking the adapters you're using.

Figaf Testing Tool

While doing some investigation, I found the Figaf testing tool for CPI, which seems to be an awesome alternative that allows you to mock your adapters. If I reflect on it, this is likely a copy of your original iflow but with the adapter endpoints replaced by endpoints to a fixed, controlled URL where you can inject a mocked body simulating a response from that adapter, but I hope someone from Figaf can comment on that, daniel.graversen. It has a UI5 cockpit that lets you manage your test cases, allowing you to record them by leveraging trace mode on your iflow. This looks very interesting, as you don't need to create your test case data manually. You can then run the tests later and assert, for each step, that no regressions were introduced. More details in the following link. I never tried it, but it looks promising. I invite daniel.graversen or anyone else to provide more details about it in the comments.

INT4 Shield

Int4 also has a similar SAP-certified solution, Int4 Shield (part of their Int4 Suite), to test CPI packages, but as part of a broader product that also lets you test SAP PI/PO as well as the integration with the backend systems (ESR proxies, IDocs, etc.). It also offers eCATT, Tricentis and Worksoft integrations, among others. The cockpit is installed and configured on the SAP S/4 backend (as an SAP-certified add-on) and, in the end, also leverages trace mode to capture payloads between the start and end points configured on your automation object. You can then run the test cases from the cockpit and evaluate the results there. More details in week 3 unit 5 of this openSAP course. I also invite @Michal.Krawczyk, @Andrzej.Halicki or anyone else that has tried the tool to provide more details about it in the comments.

We're currently evaluating Tosca, since we already have licenses that were acquired in the scope of other products, but we haven't focused on E2E testing so far in the integration team.

Performance Tests

I have no experience with performance testing specifically for integration. I've used HP LoadRunner and JMeter in the past for general web testing, and I think they are suitable to simulate multiple simultaneous requests. So far, we haven't had the chance to run performance tests on our interfaces. If you have done something in this area, please share your insights in the comments.

Unit Testing - Groovy and XSpec

I found nothing on this topic, but this was the one I thought could bring the most value to the team, since it's fully controlled by the developers: with unit testing you shouldn't have false positives or any external dependencies. Our per-package pipeline was already doing some checks (like cpilint checks), so it seemed like the perfect pipeline to enhance with automated unit testing as well. More details about the steps below. One key aspect I wanted to guarantee is that the unit tests reside inside the iflow artifacts: if you move/copy/deploy/git your iflows, it makes perfect sense that your tests travel with them as a unit. Below you can find the diagram of the steps we follow in our pipeline.

Unit testing steps

1. Extract and 4. Execute groovy unit tests

The pipeline starts by downloading all your groovy files. For unit tests, we decided to have dedicated groovy test classes, and as a naming convention we use the UnitTest.groovy suffix, e.g. for a StringUtils.groovy you could have a StringUtilsUnitTest.groovy file. This was imposed only so that we can clearly see which class is testing what; from a technical perspective it is not mandatory, since JUnit is clever enough to go through a path and execute all classes with methods annotated with @Test.
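As an illustration, the discovery part of this step can be sketched as follows (a Python sketch of the pipeline logic, not the actual Jenkins code; the UnitTest.groovy suffix is purely our own convention):

```python
from pathlib import Path

def collect_groovy_test_classes(resource_dir):
    """Collect groovy files following the *UnitTest.groovy naming convention.

    The suffix is only a team convention for readability; JUnit itself
    discovers any class containing methods annotated with @Test,
    regardless of the file name.
    """
    return sorted(
        p.name
        for p in Path(resource_dir).rglob("*.groovy")
        if p.name.endswith("UnitTest.groovy")
    )
```

The pipeline then hands the collected classes to a JUnit run, so any non-conventional test class would still execute; the filter above only drives reporting clarity.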

Value added

Allows you to add JUnit tests for your groovy classes, making sure they are executed upon pipeline builds. If you have a mapping that relies on a groovy script, you can unit test it and make sure it returns the expected values.

2. Extract and 3. Execute XSpec

A colleague mentioned that it would be pretty cool to also execute unit tests for XSLT, since we also use it in our mappings. After some investigation, I found XSpec and incorporated it into our pipeline. Unfortunately, you cannot add any file you want to your iflow via the Cloud Integration UI (only some extensions are allowed), so I had to make the workaround of looking for files ending in either *.xspec or *.xspec.xsl, so that you can add them via the UI and they are stored in your project.
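The workaround boils down to a simple filter over the iflow resources, which could look like this (illustrative Python sketch of the pipeline step):

```python
def collect_xspec_files(resource_names):
    """Keep resources that are XSpec test suites.

    Because the Cloud Integration UI only accepts certain file extensions,
    XSpec files may have been uploaded with an extra .xsl suffix appended;
    we therefore accept both *.xspec and *.xspec.xsl.
    """
    return [name for name in resource_names
            if name.endswith(".xspec") or name.endswith(".xspec.xsl")]
```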

We execute all XSpec unit tests inside the iflows of the package we're evaluating and report the results via JUnit reporting in Jenkins.

Value added

Allows you to unit test your XSLTs and report failures in your pipeline, per project, in Jenkins.

5. Generate Jenkins reports

If any test fails, we report it in Jenkins and the build fails.

Value added

Allows you to track your unit test executions, see how many tests were executed and how many failed, and drill down into the details from the Jenkins UI.

6. Email notifications for the responsible developer

If any test fails, we also notify the responsible developer via email, with the details in the email body.

Unit Testing - Message Mappings

In our integrations, message mapping is our preferred mapping mechanism. It allows you to map in a graphical UI editor, and it is a mature technology that, AFAIK, dates back to the PI development days. Moreover, it lets you plug in custom groovy scripts for more sophisticated mappings, or use standard functions for the most common scenarios. Your interface can fail at many points, but mapping is definitely one of the areas where we can have the most regressions. Therefore, we had to think about testing the message mappings.

1st approach - Use offline testing

With the help of the blogs from vadim.klimov, I downloaded the Karaf libraries from our tenant. You can download the files from your tenant either using a groovy script that downloads them, as Vadim shows in his blogs, or via a reverse shell, connecting your local PC to your tenant with netcat. The idea was to identify the SAP jar in your tenant container responsible for executing a message mapping and invoke it from a custom program. This idea was great, since you could run the mapping offline without needing CPI; on the other hand, it was quite dangerous, since this is not an official approach from SAP, so SAP could change the jar or the approach later on and your unit tests would stop working. Also, I believe java code is generated for every mapping you design, and this java generator class is not part of our tenant's list of jars, but if someone has investigated that, please let me know in the comments.

2nd approach - Use Cloud Integration runtime to test your message mappings

The idea was to have a dedicated iflow deployed containing only your message mappings, as well as all their dependencies, and to execute this dedicated iflow to assert the test results. Here's how it works:

Message mapping unit tests steps

1. Extract message mappings

The package pipeline loops over each iflow and extracts the list of all the message mappings contained inside it, as well as all the resources they potentially use.
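A sketch of this extraction step, assuming message mappings are stored as *.mmap entries inside the downloaded iflow archive (the path layout and extension are assumptions based on our tenant; adjust the filter to your own artifacts):

```python
import io
import zipfile

def list_message_mappings(iflow_zip_bytes):
    """Return the message mapping resources found inside an iflow archive.

    Assumes the iflow artifact was downloaded as a zip and that mappings
    are stored as *.mmap entries; this mirrors what our pipeline does when
    looping over the iflows of a package.
    """
    with zipfile.ZipFile(io.BytesIO(iflow_zip_bytes)) as archive:
        return sorted(name for name in archive.namelist()
                      if name.endswith(".mmap"))
```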

2. Deploy dynamic iflow

We have a template iflow, containing only a message mapping example (which is replaced upon pipeline execution) and a step to translate headers back into properties.

Dynamic iflow

The idea is to replace the message mapping and its dependencies, invoke this dynamically created iflow with the headers, properties and body, and get a result from it. Since HTTP doesn't support the concept of "properties", we need to translate them from the supplied headers when doing the invocation. The name of the generated iflow contains a prefix, the name of the original iflow and the name of the message mapping, so that we guarantee there are no name clashes.
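The naming scheme can be sketched as follows (the "UNITTEST" prefix is a hypothetical value for illustration; the point is that prefix + iflow + mapping name together are unique):

```python
def generated_iflow_name(original_iflow, mapping_name, prefix="UNITTEST"):
    """Build a collision-free name for the dynamically generated test iflow.

    Combining a fixed prefix with the original iflow name and the message
    mapping name guarantees that two different mappings (even with the
    same mapping name in different iflows) never clash.
    """
    return "_".join([prefix, original_iflow, mapping_name])
```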

3. Change endpoint

Again, to avoid name clashes, we change the endpoint to a unique name on the iflow design object. After this, we deploy the iflow.

4. Call the dynamically generated endpoint

We call the dynamically generated endpoint with the properties, headers and body supplied by the developer in the groovy unit test.
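Because an HTTP call carries headers but not exchange properties, the invocation merges both into one header map; a content modifier step in the test iflow then turns the marked entries back into properties. A sketch, where the "prop-" prefix is an invented convention for illustration:

```python
def build_request_headers(headers, properties, prop_prefix="prop-"):
    """Merge message headers and exchange properties into one HTTP header map.

    Properties are sent as headers with a marker prefix (our own
    convention, not a CPI feature); the first step of the dynamically
    generated iflow strips the prefix and restores them as exchange
    properties before the message mapping runs.
    """
    merged = dict(headers)
    for key, value in properties.items():
        merged[prop_prefix + key] = value
    return merged
```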

5. and 6. Evaluate results

We use XMLUnit to make sure the XML comparison is not jeopardized by whitespace, and we finally compare the expected XML with the actual XML result. A limitation here is that we don't support JSON in your message mappings; we only support XML so far.
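The effect of the whitespace-insensitive comparison (which XMLUnit handles for us on the JVM side) can be illustrated with a small Python sketch: whitespace-only text nodes are dropped before the serialized trees are compared.

```python
import xml.etree.ElementTree as ET

def xml_equal_ignoring_whitespace(expected, actual):
    """Compare two XML documents while ignoring insignificant whitespace.

    Mimics the behavior we rely on from XMLUnit: indentation and
    whitespace-only text nodes must not cause a test failure, while real
    content differences still do.
    """
    def normalize(xml_text):
        root = ET.fromstring(xml_text)
        for elem in root.iter():
            if elem.text is not None and not elem.text.strip():
                elem.text = None  # drop whitespace-only text
            if elem.tail is not None and not elem.tail.strip():
                elem.tail = None  # drop whitespace-only tails
        return ET.tostring(root)
    return normalize(expected) == normalize(actual)
```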

7. and 8. Cleanup

After execution, we can undeploy and remove the dynamically created iflow. We later optimized this step so that the prepare and teardown phases are not executed unless the iflow under test has changed: we have around 160 iflows containing message mapping unit tests, and running this setup/teardown process every day doesn't make sense if the iflows under test don't change.
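One way to detect "unchanged" is to fingerprint the mapping resources and skip the deploy/undeploy cycle when the fingerprint matches the previous run (a sketch of the idea, not our exact implementation):

```python
import hashlib

def resources_fingerprint(resources):
    """Compute a stable fingerprint over an iflow's mapping resources.

    resources is a dict mapping resource name to its raw bytes. If the
    fingerprint equals the one recorded in the previous pipeline run,
    the prepare/teardown phases for that iflow can be skipped.
    """
    digest = hashlib.sha256()
    for name in sorted(resources):  # sort for a deterministic result
        digest.update(name.encode("utf-8"))
        digest.update(resources[name])
    return digest.hexdigest()
```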

Example of a groovy test, testing message mappings

All the steps described above are part of the package pipeline that is executed automatically, so all this logic is really hidden from the developer. From the developer's perspective, the only thing they need to know is that if they place unit tests in the resources of the iflows, the tests will be executed automatically on a daily basis. For message mappings, for instance, the only thing the developer needs is a class such as the one below inside the iflow resources.
import FormatNumber;
import org.junit.Test;
import static org.junit.Assert.assertEquals;

class Test001_SystemSubscribeMM_MessageMappingUnitTest implements Serializable {

    public String getBody() {
        def body = """<?xml version="1.0" encoding="utf-8"?>
<Created>20211103 10:20</Created>"""
        return body
    }

    public Map getProperties() {
        def result = [:];
        return result;
    }

    public Map getHeaders() {
        def headers = [SQL_Action_Command: "INSERT", SQL_Table: "FER_Table_Target"]
        return headers;
    }

    public Map getExpectedHeaders() {
        def headers = [SQL_Action_Command: "INSERT", SQL_Table: "FER_Table_Target"]
        return headers;
    }

    @Test
    public void testFormatNullableNumberNotNull() {
        def numberTestFormat = new FormatNumber();
        def result = numberTestFormat.FormatNullableNumber("4.321");
        assertEquals("Result in this case should not be empty since argument is a valid number", "4.321", result);
    }

    @Test
    public void testFormatNullableNumberNull() {
        def numberTestFormat = new FormatNumber();
        def result = numberTestFormat.FormatNullableNumber(null);
        assertEquals("Result in this case should be 0 since argument is null", "0.000", result);
    }

    public String getExpectedBody() {
        def expectedBody = """<?xml version="1.0" encoding="UTF-8"?>
<root><StatementName1><dbTableName action="INSERT"><table>FER_Table_Target</table><access><DUMMYCATEGORY>123</DUMMYCATEGORY><FAKE_CATEGORY>111</FAKE_CATEGORY><CUSTOMER>FakeCustomer</CUSTOMER><ENTERED_DATE>2021-11-03 00:00:00.000</ENTERED_DATE><QUANTITY>3.0</QUANTITY><CREATION_DATE>2021-11-03 10:20:00.000</CREATION_DATE><NOTES>FakeNote</NOTES><CREATION_SOURCE>FakeCreator</CREATION_SOURCE></access></dbTableName></StatementName1></root>""";
        return expectedBody;
    }
}

  • getBody contains the body payload that will be injected into the dynamically created iflow

  • getProperties contains a map of key-value properties that might be expected by your message mapping and that will be provided when calling the message mapping

  • getHeaders contains a map of key-value headers that might be expected by your message mapping and that will be provided when calling the message mapping

  • getExpectedHeaders contains a map of expected headers that shall be present in the end. If this list is empty, no check on headers is applied

  • getExpectedBody contains the expected payload body resulting from calling the dynamic iflow

  • @Test methods are just there to illustrate that you can also do other regular unit tests as part of the same class if you want.


In this post, I've introduced the topic of automated testing, presenting some of the tools we researched, as well as the solution we chose to perform automated unit testing via Jenkins.

I would also invite you to share feedback or thoughts in the comments section. I'm sure there are still improvements or new ideas that would benefit the whole community. You can always get more information about Cloud Integration on the topic page for the product.