Technology Blogs by SAP
Learn how to extend and personalize SAP applications. Follow the SAP technology blog for insights into SAP BTP, ABAP, SAP Analytics Cloud, SAP HANA, and more.

Hitting the benchmark

In the fast-paced and ever-changing landscape of software development, SAP Fiori tools helps streamline the development of SAP Fiori applications and makes it more efficient, giving developers the right support at the right time in a user-friendly way. However, without regular benchmark usability testing, it is difficult to know where the product stands and where to take it next.

What is UX benchmarking? How does it help?

The term benchmarking can mean many things to different people. For developers, benchmarking usually relates to software performance. This could, for example, be how many transactions per second are performed or how much memory the software uses. UX benchmarking isn’t much different, except that we look at performance from the user’s perspective. For this, we use three criteria: effectiveness, efficiency, and satisfaction.

  • Effectiveness is usually measured by looking at things such as task completion or how often a user needs help. Software that doesn’t allow the user to complete their task isn’t considered particularly effective.

  • Efficiency typically relates to how quickly a user is able to complete a given task. The faster a user completes a task, the more efficient we consider the software to be.

  • Satisfaction relates to how satisfied a user is with using the software to complete a given task. If users are not satisfied with the software, it is unlikely they will want to use it again. Or, even worse, they may switch to a competitor’s product.

So, with these criteria in mind, how does benchmark usability testing help? It helps us primarily in three ways:

  1. It allows us to measure and track outcomes of our earlier findings.

  2. It helps us identify new requirements for our software and identify pain points.

  3. It helps us understand the needs of our developers.

To date, we have measured the user experience of three different SAP Fiori elements integrated development environments (IDEs). These include the SAPUI5 Visual Editor and, of course, SAP Fiori tools. The purpose of evaluating these different products was to compare and contrast their user experiences and to ensure that, as we develop SAP Fiori tools, we continually provide a superior user experience. Any dip in the KPIs for the criteria above means we can quickly react and correct the issue.

How do we benchmark?

During a typical session, we will ask our participants—experienced SAPUI5 developers—to complete a series of common tasks required to build a basic SAP Fiori elements application. This includes tasks such as generating a project and adding a variety of features to the application, for example, adding a column or a smart bullet micro chart to a table.

(Image: the benchmarking process)

These tasks are given to the participant one at a time. While the participant works on a task, we look at things such as the time it takes to complete, how often the participant needs help from the moderator, and where the participant struggles most with the tooling. We also record whether the participant completes the task successfully. Collecting this data allows us to determine how effective and efficient the tooling is.
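To make the effectiveness and efficiency criteria concrete, here is a minimal sketch of how per-task observations like these could be aggregated. This is not SAP's actual tooling; the data structure, field names, and sample numbers are invented for illustration.

```python
# Hypothetical sketch: turning per-task usability observations into the
# effectiveness and efficiency measures described above. All names and
# sample data are invented for the example.
from dataclasses import dataclass
from statistics import mean


@dataclass
class TaskObservation:
    task: str
    completed: bool   # did the participant finish the task successfully?
    seconds: float    # time on task
    assists: int      # times the moderator had to help


def effectiveness(obs: list[TaskObservation]) -> float:
    """Task completion rate as a percentage (0-100)."""
    return 100 * sum(o.completed for o in obs) / len(obs)


def efficiency(obs: list[TaskObservation]) -> float:
    """Mean time on task (seconds), counting completed tasks only."""
    return mean(o.seconds for o in obs if o.completed)


# Invented sample session, loosely mirroring the tasks named in the text
sessions = [
    TaskObservation("generate project", True, 210.0, 0),
    TaskObservation("add table column", True, 95.0, 1),
    TaskObservation("add micro chart", False, 600.0, 2),
]

rate = effectiveness(sessions)  # completion rate across the three tasks
avg = efficiency(sessions)      # mean time of the two completed tasks
```

In practice, moderator assists would also feed into effectiveness (an assisted completion is often scored lower than an unassisted one), but the simple completion rate above is enough to track trends between benchmark rounds.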

Is it usable?

To judge overall usability, we ask each participant to complete the UMUX-Lite, a two-question, industry-standard survey. For each task, the participant answers questions about the ease of use of the tooling and whether the tooling supported them in completing the task. Finally, we ask the participant to rate the tooling overall, based on the same survey. This allows us to rate both the individual tasks and the product as a whole.
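For readers unfamiliar with the UMUX-Lite, its two items are rated on a 7-point scale and the responses are commonly rescaled to 0-100. The sketch below shows that standard rescaling; the sample responses are invented, and the item wording is the published UMUX-Lite phrasing rather than anything specific to SAP's sessions.

```python
# Hypothetical sketch of UMUX-Lite scoring. The 0-100 rescaling is the
# standard published formula; the example responses are invented.

def umux_lite(capability: int, ease: int) -> float:
    """Convert two 1-7 Likert responses to a 0-100 UMUX-Lite score.

    capability: "This system's capabilities meet my requirements."
    ease:       "This system is easy to use."
    """
    if not (1 <= capability <= 7 and 1 <= ease <= 7):
        raise ValueError("Responses must be on a 1-7 scale")
    # Each item contributes 0-6 points; rescale the 0-12 sum to 0-100.
    return (capability - 1 + ease - 1) / 12 * 100


# Example: a participant who answers 6 ("agree") and 7 ("strongly agree")
score = umux_lite(6, 7)
```

Averaging these scores per task and per product over repeated benchmark rounds is what makes dips in satisfaction visible early.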

So, after all this testing, what have we learned so far? The key takeaway is that SAP Fiori tools provides a good user experience. Even though participants took a bit longer to complete many of the tasks, they made fewer mistakes while implementing features in the application they were building. This translates into shorter overall development times, because less time is spent on bug fixes and customer tickets.

We have also seen that SAP Fiori tools excels at particularly complex tasks. One of the tasks we ask participants to do is implement a smart bullet micro chart, which is not the easiest feature to implement. Based on our usability testing, we estimate that an inexperienced developer implementing this feature manually could take as long as 5 hours. Using the Guided Development extension, that time dropped to under 10 minutes!

Moving forward with SAP Fiori tools

As mentioned earlier, we also use this testing to find usability issues in our tooling. For instance, our initial usability test of SAP Fiori tools identified several major usability issues. One thing we noticed during testing was that users found it challenging to navigate through Guided Development guides that spanned multiple pages. Users would often skip mandatory fields on one step before moving on to the next page, with the result that they couldn’t complete the guide correctly. After observing this issue with many users, we worked with the designers and developers on a new design that better supports developers. When we retested the same guide with the new design in July of this year, users breezed through it with ease.

(Screenshot: if no entity type is selected, the corresponding fields below are disabled)

Where to next?

We plan to continue our regular benchmark and usability testing for SAP Fiori tools, as it is a key part of our commitment to increasing developer efficiency and the overall usability of the tooling. If you are interested in providing your feedback or participating in one of our usability test sessions, look out for announcements in one of our roundtable sessions or in the release notes.