Technology Blogs by SAP
Learn how to extend and personalize SAP applications. Follow the SAP technology blog for insights into SAP BTP, ABAP, SAP Analytics Cloud, SAP HANA, and more.
MartinKolb
Advisor

A particular focus of Design Studio 1.5 development was to improve performance.

This document describes many aspects of performance improvements, which are divided into:

  • The Free Lunch
    New performance improvements, which you get automatically when upgrading to Design Studio 1.5

  • The Exciting New Features
    New performance-improving capabilities, which you can profit from when creating new and modifying existing applications

  • Best Practices
    New and updated guidelines, which you should apply to your new and existing applications

  • Getting It Even Faster
    New tools, which you can use to analyze performance problems and make your application faster

Hint: You will find sections starting with "Before Design Studio 1.5", which point out how things behaved up to Design Studio version 1.4. The new behavior is introduced with "With Design Studio 1.5".

The Free Lunch



Improved Browser Caching Strategy


Before Design Studio 1.5, browser caches were tied to a specific server node. Thus, in system landscapes with multiple nodes the browser cache had to be built up several times – once for each node. With Design Studio 1.5 the caching strategy has been improved to support a single cache, which is shared among several server nodes. This results in an improved startup time of the application, because the browser cache filled by the first node is reused when other nodes are accessed.

Before Design Studio 1.5, browser caches were invalidated whenever a server node was restarted, because a server update might have been installed during the downtime. With Design Studio 1.5 the system can distinguish between a simple server restart and a server update, and invalidates the browser cache only if a server update has actually been applied. This results in an improved startup time of the application, because the browser cache does not need to be refilled after a simple server restart.



Reduced HTTP Requests


Before Design Studio 1.5 the associated JavaScript module for each component type used in the application was loaded in a separate HTTP request. With Design Studio 1.5 the most relevant JavaScript modules of the components were combined into a single module.

Before Design Studio 1.5 the authentication method at application startup required multiple HTTP requests. With Design Studio 1.5 the application startup sequence has been redesigned and the number of HTTP requests has been significantly reduced.

Both improvements result in an improved startup time of the application, especially in high-latency scenarios like WAN, because of the reduced number of HTTP requests.



Improved Startup Time of Application When Executed on Server


When an application is started on the server (that is, not with "Execute Locally"), the startup time is significantly influenced by the number of Java archives (JAR files) on the server that need to be searched during application startup: the more JAR files are present, the longer the startup takes. With Design Studio 1.5 the lookup strategy has been improved, resulting in a faster application startup.



The Exciting New Features



Parallel Query Execution


Before Design Studio 1.5 the data sources of an application were executed in sequence. With Design Studio 1.5 the application developer can decide which data sources are executed in parallel. Note that executing queries in parallel comes at a price. Executing data sources in parallel requires multiple sessions. Each session consumes resources on the server that stores the actual data, for example the BW system. This is the reason why queries are not executed in parallel by default.

With Design Studio 1.5 the application developer can define groups of data sources ("processing groups"). Each of these groups can be executed in parallel. Each group is associated with a session. Note that variables of separate sessions cannot be merged. If the application needs both parallel query execution and variable merging then there are new Design Studio script methods that can emulate variable merging behavior.
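As a sketch, such an emulation can copy a value entered for one data source to a data source in another processing group via the script methods getVariableValueExt and setVariableValueExt. The data source and variable names below are purely illustrative; DS_1 and DS_2 are assumed to belong to different processing groups:

```javascript
// Copy the variable value entered for DS_1 (e.g. in the Prompts dialog)
// to DS_2, which runs in a different processing group and hence session.
var year = DS_1.getVariableValueExt("VAR_YEAR");
DS_2.setVariableValueExt("VAR_YEAR", year);
```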

For example, before Design Studio 1.5 an application with 5 data sources, each taking 1 second to initialize, needed about 5 seconds for data source initialization. With Design Studio 1.5, by placing each data source into a separate processing group, one would expect data source initialization to take only about 1 second; in practice, some overhead for managing the separate sessions and synchronizing the parallel execution must be added to arrive at the actual figure.

The most significant performance improvement of parallel query execution is during application startup. In addition, the performance of the application is improved whenever the result sets of data sources are retrieved, for example during rendering, after applying filters, after changing the drill-down, and so on.



Unmerge Variables / Unmerge Prompts


Before Design Studio 1.5, the variables of all data sources were merged in a single "variable container".

This has the following advantages:

  • Setting the value of a variable that is shared among data sources can be done with a single Design Studio script method call instead of setting the variable value for each data source separately.
  • The shared variable appears only once in the Prompts dialog instead of several times.

However, these advantages come at a price in performance:

  • Setting a variable value invalidates all data sources associated with the variable container. It even invalidates data sources that do not contain the variable that was set.
  • Initializing a data source with variables during the flow of application use invalidates all data sources associated with the variable container.

When data is retrieved from an invalidated data source, for example during rendering or during a Design Studio script method call, the data needs to be reloaded from the backend. This obviously reduces the performance of the application: the larger the number of data sources with variables, the greater the impact.

With Design Studio 1.5 the application developer can disable the variable merge behavior described above with the application property "Merge Prompts". There are two typical reasons to do so: the application developer deliberately wants to set different values for variables of the same name in different data sources, or performance, because setting a variable value in one data source or initializing a data source then no longer invalidates other data sources.
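With "Merge Prompts" switched off, each data source keeps its own copy of a variable, so different values can be set per data source. A minimal Design Studio script sketch (data source and variable names are illustrative):

```javascript
// DS_1 and DS_2 both have a variable VAR_YEAR, but with unmerged prompts
// the assignments are independent: neither call invalidates the other
// data source.
DS_1.setVariableValueExt("VAR_YEAR", "2014");
DS_2.setVariableValueExt("VAR_YEAR", "2015");
```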



Best Practices



Using "setVariableValue" or "setVariableValueExt"


When setting several variable values with the "setVariableValue" or "setVariableValueExt" Design Studio script methods, write these calls in one direct sequence, one after the other, without any other Design Studio script methods in between. Such a sequence is folded into a single backend call that submits all variable values instead of multiple calls, improving application performance.
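The difference can be sketched as follows in Design Studio script (component, data source, and variable names are illustrative):

```javascript
// Folded into a single backend call: nothing between the three assignments.
DS_1.setVariableValueExt("VAR_COUNTRY", "DE");
DS_1.setVariableValueExt("VAR_YEAR", "2015");
DS_1.setVariableValueExt("VAR_REGION", "EMEA");

// Not folded: the unrelated call in between splits the sequence into
// two separate backend calls.
DS_1.setVariableValueExt("VAR_COUNTRY", "DE");
TEXT_1.setText("Loading...");
DS_1.setVariableValueExt("VAR_YEAR", "2015");
```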



Using "setVariableValue" or "setVariableValueExt" at Application Startup


When setting variable values at application startup, prefer to set the variable values in the "On Variable Initialization" script instead of the "On Startup" script.

During the "On Startup" script, the variable values have already been initialized with their default values or with values entered in the Prompts dialog. Setting a new variable value at that point invalidates the associated data source, resulting in a reload of the data. Setting the values in the "On Variable Initialization" script, which is executed before the "On Startup" script, avoids this issue.
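As a sketch, the same one-line assignment behaves differently depending on which script it is placed in (variable name is illustrative):

```javascript
// In "On Variable Initialization": runs before the variable values are
// submitted, so this assignment causes no extra invalidation of DS_1.
DS_1.setVariableValueExt("VAR_YEAR", "2015");

// In "On Startup": the same assignment would invalidate DS_1 and force
// its data to be reloaded from the backend.
```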

Do Fewer Data Sources Always Improve Performance?

In general it is true that the more data sources are initialized at startup, the slower the application starts. Note, however, that data sources that are intentionally not initialized at startup (data source property "Load in Script = true") do not decrease performance. For example, 10 data sources with "Load in Script = true" will not decrease the startup performance of the application; even removing them would not improve it.
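Such a data source is initialized only when the script explicitly loads it, for example on demand in an event script. A sketch (all names are illustrative; setDataSource is assumed to be available on the data-bound component):

```javascript
// DS_10 has "Load in Script = true" and therefore costs nothing at startup.
// It is loaded on demand, e.g. in the On Select script of a tab strip.
DS_10.loadDataSource();
CROSSTAB_2.setDataSource(DS_10);
```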

Performance of Universe Access

While BW and HANA data sources provide direct multi-dimensional data access, Universe data sources are organized relationally. When accessing a Universe in a multi-dimensional manner, there is an overhead to transform the relational data into a multi-dimensional form: the larger the amount of data processed, the larger the overhead. In general, to achieve the best performance, it is recommended to load frequently accessed relational data into a BW or HANA system.

Background Processing Is Not For Free

Keep in mind that background processing is not a replacement for parallel query execution. Background processing is intended to improve the perceived performance at the price of slightly decreased overall performance.

Every background processing step adds a performance overhead, so aim for a minimum number of background processing steps. For example, with an application of 10 data sources do not use 10 separate background processing steps – each one initializing a single data source. Better pick and group, for example, the 3 most important data sources in one background processing step and the 7 remaining ones in another background processing step.
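With the common pattern of re-triggering the "On Background Processing" event, the grouping above can be sketched like this (a global script variable gStep of type int is assumed to track the current step; all data source names are illustrative):

```javascript
// "On Background Processing" event script.
if (gStep == 0) {
  // Step 1: the three most important data sources of the initial screen.
  DS_1.loadDataSource();
  DS_2.loadDataSource();
  DS_3.loadDataSource();
  gStep = 1;
  APPLICATION.doBackgroundProcessing();  // trigger the next step
} else {
  // Step 2: the remaining data sources, all in one step.
  DS_4.loadDataSource();
  DS_5.loadDataSource();
  // ... DS_6 to DS_10 accordingly
}
```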

In particular, do not consider using background processing if the application has no performance problem at all. For example, if you use background processing with multiple data sources just for the visual effect of loading animations disappearing one by one, the performance price you pay for the effect is not worth it. In such cases, consider dropping background processing altogether, so that your data is loaded more quickly.

Getting It Even Faster

Crosstab and Pixel-Based Scrolling

By default the Crosstab component scrolls only in units of entire cells (as in Microsoft Excel). Additionally, to achieve the best performance, only the data for the displayed cells is sent to the browser. However, you can activate "pixel-based scrolling" for a Crosstab component, mainly aimed at touch devices, which offers more fine-grained, pixel-wise scrolling. To achieve this kind of scrolling, however, it is necessary to send the data of all available cells to the browser. Obviously this works for a limited number of cells only.

With the growing number of cells this feature decreases performance in several ways:

  • The number of cells that need to be processed at the server increases.
  • The amount of data that needs to be sent over the network to the browser increases.
  • The amount of data that needs to be processed in the JavaScript engine of the browser increases. This is especially relevant for Web browsers of mobile devices which have lower hardware capabilities than desktop computers.
  • The number of cells in the browser's Document Object Model (DOM) increases memory consumption and decreases rendering performance of the browser. This is especially relevant for Web browsers of mobile devices which have less memory than desktop computers.

In case of performance problems, it is recommended to check the following guidelines:

  • The number of data cells (“Row Limit” x “Column Limit”) for Crosstabs using "Pixel-Based Scrolling" on mobile devices (for example iPad) should not exceed 500.
  • The number of data cells (“Row Limit” x “Column Limit”) for Crosstabs using "Pixel-Based Scrolling" on desktop computers should not exceed 5000.

Improvements in the Profiling Dialog

Better Code Coverage, More Detailed Messages

With Design Studio 1.5, code coverage has been improved and more detailed messages were added to the output of the Profiling dialog. For example, during Design Studio script execution, the script name ("BUTTON_1.OnClick()") is listed in the Profiling dialog.

Higher Measurement Accuracy

With Design Studio 1.5 the measurement accuracy has been increased – on the Windows platform, for example, from 16 milliseconds to 1 millisecond.

Remote Times for HANA Systems

With Design Studio 1.5 remote times for HANA systems are explicitly listed in the Profiling dialog. Before Design Studio 1.5 only the times for BW systems were listed.

New Tab "General Information"

With Design Studio 1.5 the tab "General Information" was added to the Profiling dialog. It provides the following information:

  • Timestamp of application execution
  • Name and description of application
  • Details about the data sources of the application. For each data source the data source alias, the name of the object backing the data source (for example the query name in BW systems or the view name in HANA systems), the processing group (when parallel query execution is used), the connection type, and the initialization state are listed.

Streamlined Content of "Download as Text"

With Design Studio 1.5 the content of "Download as Text" has been streamlined by omitting arcane information and all entries with an execution time of 1 millisecond and less. Before Design Studio 1.5 those entries cluttered up the content of "Download as Text" making it hard to spot relevant items with long execution times.

Display of Processing Group Execution

With Design Studio 1.5, for applications that use parallel query execution, the Profiling dialog displays the execution steps of each processing group separately. Whenever a parallel execution starts, "Execute Processing Groups asynchronously" is displayed in the Profiling dialog, followed by separate lines showing the execution of each processing group.

The separation into processing groups is also reflected in the downloaded content of the Profiling dialog.



What's Next?


As you have seen from this long list of topics this Design Studio release truly was about performance.

As soon as Design Studio 1.5 hits the market (end of Q2/2015), you will have lots of new possibilities for improving application performance.

By the way, expect more posts on the heavy-weight Design Studio 1.5 performance topics “Parallel query execution” and “Unmerge variables”, including details, tutorials, and more.

Have fun with Design Studio 1.5!



Further Reading

Design Studio Performance Best-Practices

Design Studio: Performance Implications of Variables

Design Studio Tips and Tricks: Measuring Performance

Design Studio 1.5: View on Parallel Data Source Execution


54 Comments
Karol-K
Product and Topic Expert

Hi Toon,

 

In general, it would help to upgrade hardware, but let us first check what kind of events are taking the time.

 

Can you upload or send me via private chat the "Statistics Download" as Text AND CSV from the Profiling dialog?

 

Karol

andreas_appelt
Explorer

Hi Martin,

we applied all tips and tricks to our DS application but still have an initialisation time of about 7 seconds.

We have one command in the startup script that takes about 500 ms: DS_1.reloadData().

We have to reload DS_1 because otherwise the next code line in the startup script, which is "G_Verbund=DS_29.getVariableValueExt("CO2_VERBUND_EW_CE1");", would not work. DS_1 is defined as "Load in Script: false" and I thought that the data source DS_1 is already initialised and available in the startup script without reloading it.

At which point in time are data sources that are defined with "Load in Script: false" really initialised?

Thanks for your support.

Regards, Andreas

Stefan_Backhaus
Participant

Hi Andreas,

 

are you running Tomcat on BI Platform, possibly with AD SSO?

We did this and lost about 5 seconds during the authentication process.

We discovered that the BI Platform does a reverse DNS lookup to log the hostname of the calling client in the audit database. However, this did not work correctly due to the setup of our DNS infrastructure (I'm not the expert in this area), which caused a wait of 5 seconds until timeout.

Solution:

By adding this line to the Tomcat Java Startup Options, we saved 5 seconds per each initial report call.

-Dbobje.disable_audit_ip_lookup

 

Best regards

Stefan

andreas_appelt
Explorer

Hi Stefan,

I learned that some exit variables need the result set of the query. So even if the query is set to "Load in Script: false", the data is not available. That's why we have to reload the query. The rest of the high time is caused by the number of data sources (7). Unfortunately we need these for the initial screen. We report on BIP and not NW.

Thanks.

Regards, Andreas