CRM and CX Blogs by SAP
Stay up-to-date on the latest developments and product news about intelligent customer experience and CRM technologies through blog posts from SAP experts.
Efficient pfunctions are key to offering users an optimal user experience in a solution configurator. How efficient are yours? If you find it hard to answer this question, this post will help you gain insights into your pfunctions' performance and track them as you operate your solution configurator in SAP Commerce.

Track performance in Solution Modeling Environment

You should ideally start measuring the performance of your pfunctions during their development. The Profiler, built into the Solution Modeling Environment testing perspective, measures the total and average time taken by the engine to execute rules and helps to easily identify pfunctions with a long execution time:

  1. Open your Knowledge Base (e.g. via menu item File > Open Knowledge Base)

  2. Select the Profiler view usually located in the bottom right panel and start profiling

  3. Update your configuration to trigger the rules and associated pfunctions you want to verify

  4. Stop profiling and look for rules with a high execution time

In the following example, the rule BAD_PERF_RULE was invoked twice during the profiling session with an average execution time of 1 second. In the last column, you can find the name of the pfunction associated with the rule, BAD_PERF_FUNCTION in this case.

Example of profiler result with an inefficient pfunction

While this pfunction clearly suffers from a performance problem, the number of invocations can often be responsible for performance problems as well. You should therefore carefully review the objects and condition clauses in your rule to reduce the number of invocations to the strict minimum.

Once you have found the pfunctions suffering from performance issues, you might need more insights to identify the code causing the high execution time. For that, you will need a Java profiler. While other profilers can be used in various setups, I will focus on the SAP JVM Profiler installed directly in SME.

For those who are not familiar with it, the SAP JVM Profiler is an Eclipse plug-in that profiles any Java code running on SAP JVM 8.1. You can install it side by side with SME. You can download the profiler from the SAP Software Download Center by searching for the SAP JVM SUPPORTABILITY TOOLS package. Note that you might only be offered a download for Windows on x64. If you run on a different platform like me, don't worry: it worked for me on Mac OS X. Once you have downloaded the ZIP, go to Install New Software, add the ZIP as a local archive and install the plug-in. For reference, I downloaded and installed patch 47 on Eclipse 2019-12 with SME 3.2.24. Once installed, you should have a new perspective called Profiling.

Since the SAP JVM Profiler can only profile Java code running on SAP JVM 8.1, SAP JVM 8.1 needs to be installed as well. You can download it from the SAP Software Download Center by searching for the SAP JVM 8.1 package. Ensure you select the right platform, as it matters this time! The SAP JVM Profiler itself does not strictly need to run on SAP JVM 8.1, but it is recommended. If you want to configure Eclipse to run on SAP JVM 8.1, add the following lines to your eclipse.ini:
-vm
<path to SAP JVM 8.1>/bin/java

Do not forget to set the environment variable runtimeEnvironment to standalone, as you will otherwise face runtime errors when running SME! Then ensure that SAP JVM 8.1 is added to the list of installed JREs in the Eclipse preferences. Finally, create a new debug configuration for an Eclipse Application with the following parameters:

  • Java Runtime Environment: select the SAP JVM 8.1 installed JRE

  • VM arguments: add -DruntimeEnvironment=standalone

Switch to the Profiling perspective and launch the new debug configuration. You should then see the newly started Eclipse in the VM Explorer view. If you don't, ensure you configured the runtime environment properly in your debug configuration, as the SAP JVM Profiler can only profile programs running on SAP JVM 8.1. Connect to the Eclipse process and select Performance Hotspot Analysis. I recommend including sleeping times.

Profiling perspective once connected to Eclipse application started from the debug configuration

The profiler is now running and profiles any code executed in the newly started Eclipse. Switch to the SME Testing perspective, open your knowledge base and re-execute the configuration steps to get the pfunctions invoked. Finally, stop profiling and open the snapshot to analyze the results. I recommend starting with the Method (Flat) report and filtering by the name of your pfunctions.

Example of Method (Flat) report

The Method (Flat) report generally provides very good insights into inefficient methods, like the badLogic() one in my example. However, you might sometimes need to dig deeper, especially when the code is not only located in your pfunction classes. In that case, right-click the execute() method of your inefficient pfunction and select Show Methods (Hierarchical). While pressing the Shift key, expand the call tree to see the most time-consuming method sub-invocations.

Example of Methods (Hierarchical) report limited to the pfunction execute() method

This report should ultimately give you the details you need to find the inefficiencies in your pfunction implementation and address them. Once you have identified the code requiring optimization, I recommend you start by writing a unit test for the pfunction. First, the unit test will help you ensure that the code changes you make to optimize your implementation won't introduce functional regressions. Second, you can quickly check whether the performance problem is addressed by comparing the execution times of your unit test before and after the change. Third, by executing the unit test on a regular basis within your pipeline, you can track the performance and ensure performance regressions are not introduced into your pfunction. It's definitely an investment, especially the first time you go through this process, but it pays off.
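As a starting point, such a test can be as simple as the following sketch (the class, method names and budget are hypothetical and must be adapted to your actual pfunction): it executes the logic many times and fails when the average execution time exceeds a budget, so a performance regression breaks the build.

```java
import java.util.concurrent.TimeUnit;

// Hypothetical sketch of a performance-guarding unit test for a
// pfunction-like transformation; adapt names and budget to your code.
public class PfunctionPerfTest {

    // Stand-in for the pfunction logic under test
    static String toUppercase(final String value) {
        return value.toUpperCase();
    }

    // Returns the average execution time in microseconds over n iterations
    static long averageMicros(final int iterations) {
        final long start = System.nanoTime();
        for (int i = 0; i < iterations; i++) {
            toUppercase("some characteristic value");
        }
        return TimeUnit.NANOSECONDS.toMicros(System.nanoTime() - start) / iterations;
    }

    public static void main(final String[] args) {
        final long budgetMicros = 500; // assumed per-invocation budget
        final long avg = averageMicros(10_000);
        if (avg > budgetMicros) {
            throw new AssertionError("Average " + avg + " µs exceeds budget of " + budgetMicros + " µs");
        }
        System.out.println("Average execution time: " + avg + " µs (budget " + budgetMicros + " µs)");
    }
}
```

In a real pipeline you would invoke your actual pfunction's execute() method and derive the budget from measured baselines; keep the budget generous enough that the test does not become flaky on slower build agents.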

Track performance in SAP Commerce

You might wonder why it is even necessary to measure and track pfunction performance in SAP Commerce after analyzing their execution so thoroughly in the previous section. Well, we analyzed the pfunction performance with a single user, in an environment where CPU and memory resources were available and the pfunction implementation could run at its best. When the pfunction runs in SAP Commerce, it will be a different story, and this could impact the performance of your pfunction implementation. I say could because it comes down to your pfunction implementation. For example, simple pfunctions performing a basic transformation, like changing the value of a characteristic to uppercase, will not suffer from more users and consequently more parallel executions. But a more complex pfunction featuring a cache with synchronization could suffer from longer execution times as the number of parallel invocations increases. Therefore, tracking the performance of your pfunctions in SAP Commerce is important and can reveal performance problems you would not have been able to detect previously.
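To make the caching pitfall concrete, here is a minimal sketch (class and method names are hypothetical, not from any SAP API) contrasting a coarse synchronized cache, where every parallel pfunction invocation queues on one lock, with a ConcurrentHashMap-based variant that keeps execution times stable under load:

```java
import java.util.HashMap;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Hypothetical sketch: two cache variants a pfunction implementation might use.
public class CharacteristicCache {

    private final Map<String, String> syncCache = new HashMap<>();
    private final Map<String, String> concurrentCache = new ConcurrentHashMap<>();

    // Coarse lock: all parallel invocations serialize on this method,
    // so average execution time degrades as concurrency rises,
    // even though a single-user profile looks perfectly fine.
    public synchronized String getSynchronized(final String key) {
        return syncCache.computeIfAbsent(key, this::expensiveLookup);
    }

    // Lock-striped alternative: invocations for different keys proceed
    // concurrently, so throughput scales with the number of users.
    public String getConcurrent(final String key) {
        return concurrentCache.computeIfAbsent(key, this::expensiveLookup);
    }

    private String expensiveLookup(final String key) {
        return key.toUpperCase(); // placeholder for costly work
    }
}
```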

This time, SAP does not offer an out-of-the-box solution, so we need to build our own, ideally generic enough that it can be reused across projects and does not need to be reconfigured every time a new pfunction is developed. That's possible using AspectJ, since pfunction classes must reside in a dedicated package and must implement the sce_user_fn interface. Here is an example of an aspect implementation you can use:
import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.Logger;
import org.aspectj.lang.ProceedingJoinPoint;
import org.aspectj.lang.annotation.Around;
import org.aspectj.lang.annotation.Aspect;
import org.aspectj.lang.annotation.Pointcut;

@Aspect
public class MyPfunctionPerformanceAspect {

    private static final Logger LOG = LogManager.getLogger(MyPfunctionPerformanceAspect.class);

    // Matches objects implementing the pfunction interface sce_user_fn
    // (use the fully qualified interface name of your installation)
    @Pointcut("this(com.sap.sce.user.sce_user_fn)")
    public void pfunctionImplementation() {}

    // Restricts weaving to the mandatory pfunction package
    // (adjust to the package of your installation)
    @Pointcut("within(com.sap.sce.user..*)")
    public void pfunctionPackage() {}

    // Matches the execute() method every pfunction implements
    @Pointcut("execution(* execute(..))")
    public void pfunctionExecuteMethods() {}

    @Around("pfunctionPackage() && pfunctionImplementation() && pfunctionExecuteMethods()")
    public Object aroundPfunctionExecution(final ProceedingJoinPoint pjp) throws Throwable {
        final long startTimestamp = System.nanoTime();
        final Object result = pjp.proceed();
        final long endTimestamp = System.nanoTime();
        final long durationInMicroseconds = (long) ((endTimestamp - startTimestamp) / 1000.0D);
        if (durationInMicroseconds > 0L) {
            LOG.info("Pfunction {} took {} µs", getPfunctionName(pjp), durationInMicroseconds);
        }
        return result;
    }

    protected String getPfunctionName(final ProceedingJoinPoint pjp) {
        if (pjp.getTarget() != null) {
            return pjp.getTarget().getClass().getSimpleName();
        } else if (pjp.getThis() != null) {
            return pjp.getThis().getClass().getSimpleName();
        } else {
            return "N/A";
        }
    }
}
The AspectJ configuration (i.e. the META-INF/aop.xml file to place under the resources folder of your SAP Commerce extension) should look like this:
<!DOCTYPE aspectj PUBLIC "-//AspectJ//DTD//EN" "">
<aspectj>
    <weaver>
        <include within="*"/>
        <include within="mypackage.MyPfunctionPerformanceAspect"/>
    </weaver>
    <aspects>
        <aspect name="mypackage.MyPfunctionPerformanceAspect"/>
    </aspects>
</aspectj>

Once in place, you should see messages like this one in your console logs:
INFO  [hybrisHTTP2] [MyPfunctionPerformanceAspect] Pfunction MY_PFUNCTION took 40 µs

If you are running SAP Commerce in the Public Cloud, or you have Kibana or Sumo Logic connected to SAP Commerce, build your own query to extract the pfunction name together with the execution time, and set up a dashboard to track the minimum, maximum and average execution times. It will be very easy to detect anomalies and perform root cause analysis.
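If you prefer a quick offline check before building a dashboard, the same statistics can be computed directly from the raw log lines. The sketch below (class and method names are mine, not from any SAP API) parses the message format produced by the aspect above and prints the minimum, maximum and average:

```java
import java.util.List;
import java.util.LongSummaryStatistics;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Hypothetical sketch: computes min/max/average execution times from the
// aspect's log lines, mirroring what a log-analytics dashboard would show.
public class PfunctionLogStats {

    // Matches the message format produced by MyPfunctionPerformanceAspect
    private static final Pattern LINE = Pattern.compile("Pfunction (\\S+) took (\\d+) µs");

    static LongSummaryStatistics statsOf(final List<String> logLines) {
        return logLines.stream()
            .map(LINE::matcher)
            .filter(Matcher::find)
            .mapToLong(m -> Long.parseLong(m.group(2)))
            .summaryStatistics();
    }

    public static void main(final String[] args) {
        final List<String> logLines = List.of(
            "INFO  [hybrisHTTP2] [MyPfunctionPerformanceAspect] Pfunction MY_PFUNCTION took 40 µs",
            "INFO  [hybrisHTTP5] [MyPfunctionPerformanceAspect] Pfunction MY_PFUNCTION took 120 µs",
            "INFO  [hybrisHTTP1] [MyPfunctionPerformanceAspect] Pfunction MY_PFUNCTION took 80 µs");
        final LongSummaryStatistics stats = statsOf(logLines);
        System.out.println("min=" + stats.getMin() + " µs, max=" + stats.getMax()
            + " µs, avg=" + stats.getAverage() + " µs");
        // → min=40 µs, max=120 µs, avg=80.0 µs
    }
}
```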


Knowing how well or badly your pfunctions impact your configurator isn't really hard, but it is crucial to keep a handle on performance and offer the best user experience to your customers. Hopefully you now have everything you need to deliver efficient pfunctions. Do not forget to keep an eye on your knowledge base to ensure rules are not invoked more often than they should be, as this is another important performance factor.