Technology Blogs by Members
Explore a vibrant mix of technical expertise, industry insights, and tech buzz in member blogs covering SAP products, technology, and events. Get in the mix!
Ayman_Sourki

Processing data poses great challenges for system performance, especially as data volumes rise. SAP Datasphere therefore offers the ability to analyze performance and locate problems. In this blog we take a closer look at this capability.

SAP Datasphere runs on an SAP HANA Cloud database. Therefore, the tools for HANA performance analysis can also be used for performance analysis in Datasphere.
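Because the underlying database is SAP HANA, its monitoring views can complement the Datasphere tools. As a hedged sketch, the query below, built only as a string here, lists the longest-running statements from HANA's M_EXPENSIVE_STATEMENTS view; it assumes the expensive statements trace is enabled and that your database user is allowed to read monitoring views:

```python
# Sketch only: the statement is built as a string. Executing it requires a
# database user with read access to HANA monitoring views, and the
# expensive statements trace must be switched on in the system.
EXPENSIVE_STATEMENTS_SQL = (
    "SELECT TOP 20 START_TIME, DURATION_MICROSEC, STATEMENT_STRING "
    "FROM M_EXPENSIVE_STATEMENTS "
    "ORDER BY DURATION_MICROSEC DESC"
)
```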

The Data Integration Monitor


The Data Integration Monitor supports monitoring and managing the replication of tables from sources outside SAP Datasphere and shows how frequently the data is updated.

It is easy to see how many tables there are, how much memory they require and what actions have recently been performed. Data can also be replicated here, persistent views can be created, remote connections can be analyzed and details on these actions can be queried.

 


Figure 1 – The Integration Monitor

Query runs, by which Datasphere means all database queries, can also be monitored in the Data Integration Monitor.

For detailed analyses, PlanViz can be used just as in SAP HANA on-premise systems. The following sections describe the HANA PlanViz analysis step by step.

Generating the SQL code for the Datasphere data model



    • Log into the Datasphere and open the data model you would like to analyze.

 

    • Under Edit, click on “Preview SQL”.




Figure 2 – Steps in generating the SQL Code

The code is displayed and can be used in the next step for generating the PlanViz file.


Figure 3 – The generated SQL code

Generating the Plan Viz output file


For this step, a Database Analysis User is required for the Datasphere. The user is created with the following steps:

    • In Datasphere click on System – Configuration.




Figure 4 – Creating a Database Analysis User

    • Under Database Analysis the DB Analysis User can be created. First, assign the technical user name and select the option “Enable Space Schema Access”. Optionally, you can also specify a time period during which the access is allowed.

 


Figure 5 – Possible options in creating a Database Analysis User

    • Now a password is generated. Copy and save the password, because it will be needed later.




Figure 6 – Password when creating a Database Analysis User

    • In the same “Database Access” tab, select the user and then click on “Open Database Explorer”.




Figure 7 – Open Database Explorer

A new window opens for logging in to the SAP Business Technology Platform. You can log in using “Sign in with default identity provider”. The DB Analysis User and the generated password are used for this.


Figure 8 – Logging in to SAP Business Technology Platform

After a successful login, the Datasphere objects are visible on the left side of the window. The SQL console can be opened by right-clicking the instance and selecting “Open SQL Console” in the context menu.
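The login and query steps can also be done programmatically. A minimal sketch, assuming SAP's Python driver hdbcli (installed via `pip install hdbcli`) and placeholder host and credentials for the DB Analysis User:

```python
def run_statement(host: str, user: str, password: str, sql: str):
    """Run one SQL statement against the Datasphere HANA Cloud database."""
    # hdbcli is SAP's HANA Python driver; imported lazily so the sketch
    # can be read without the package installed.
    from hdbcli import dbapi

    # HANA Cloud listens on port 443 and requires encrypted connections.
    conn = dbapi.connect(address=host, port=443,
                         user=user, password=password, encrypt=True)
    try:
        cur = conn.cursor()
        cur.execute(sql)
        return cur.fetchall()
    finally:
        conn.close()
```

The host is the SQL endpoint shown in the Database Access tab; user and password are those of the DB Analysis User created above.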


Figure 9 – The SQL Console

    • The generated SQL code can be inserted here. The schema, in our example “Einkauf_Demo”, must be added in front of the table name.

 

    • Click Run to check the SQL code.

 


Figure 10 – Testing the generated SQL Code
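The schema-qualification step above can be sketched as a small helper. The view name “PurchaseOrders” is a made-up example; “Einkauf_Demo” is the schema from the screenshots:

```python
def qualified_select(schema: str, view: str) -> str:
    """Prefix the space schema to the view name, as required in the SQL console."""
    return f'SELECT * FROM "{schema}"."{view}"'

# "PurchaseOrders" is a hypothetical view name used only for illustration.
stmt = qualified_select("Einkauf_Demo", "PurchaseOrders")
```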

Next, the PlanViz file is generated with “Generate SQL Analyzer Plan File” under “Analyze”.


Figure 11 – Generating the PlanViz file
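As a lightweight, hedged alternative to the plan file, HANA's EXPLAIN PLAN statement produces a textual plan that can then be read from SYS.EXPLAIN_PLAN_TABLE. The statement name and query below are placeholders:

```python
def explain_plan_statements(name: str, select_sql: str) -> tuple:
    """Build the EXPLAIN PLAN statement and the query that reads its result."""
    explain = f"EXPLAIN PLAN SET STATEMENT_NAME = '{name}' FOR {select_sql}"
    fetch = (
        "SELECT OPERATOR_NAME, OPERATOR_DETAILS "
        "FROM SYS.EXPLAIN_PLAN_TABLE "
        f"WHERE STATEMENT_NAME = '{name}'"
    )
    return explain, fetch

# Placeholder usage with a trivial query:
explain_sql, fetch_sql = explain_plan_statements("ds_check", "SELECT 1 FROM DUMMY")
```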

Analyzing the visualized HANA plan


After saving the Plan Viz file on our local system, the processing can be analyzed more closely, as follows:

    • Start Eclipse/HANA Studio

 

    • In the menu, click on File and select the generated file with “Open File”.




Figure 12 – Opening the PlanViz file

    • Now the file opens and you can analyze the individual operations (e.g. joins, calculation times).



Under “Overview” you can find general information on processing times and memory use.


Figure 13 – View of the PlanViz file

Under “Executed Plan” you will find detailed information on the processing times of the individual operations, along with the relevant number of data records, the servers used and any network transfers.


Figure 14 – Executed Plan general display

Each box in the PlanViz output represents a plan operator (POP). In the PlanViz display, data flows from bottom to top: the query result is at the top of the view, and the actual data retrieval from the database tables is at the bottom. The plan operators in between represent the transformations. In each box, you can click the arrow at the top right to navigate through further operators down to the source.


Figure 15 – Executed Plan detailed display

The exclusive time of a node is the time needed to process that individual operation. The inclusive time is the time required to process the entire operation, including the time of lower-level operators but excluding compile time.
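This relationship can be illustrated with a toy calculation (the numbers are invented, not actual PlanViz output):

```python
def exclusive_time(inclusive: float, child_inclusives: list) -> float:
    """Exclusive time = a node's inclusive time minus its children's inclusive times."""
    return inclusive - sum(child_inclusives)

# Toy numbers: a join with 120 ms inclusive time whose two child scans took
# 40 ms and 55 ms spends 25 ms in the join itself.
exclusive_time(120.0, [40.0, 55.0])
```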

Load steps with long processing times and large data quantities are potentially critical areas, and for these a detailed analysis of the underlying part of the data model is recommended. In our example, the left load step, with 29,953 data records and a processing time of 3 seconds, is one of these.

Summary


As can be seen, with just a few simple steps a PlanViz file can be created from the SQL code of a Datasphere object and analyzed. This is extremely helpful for identifying and correcting potential performance bottlenecks in Datasphere.

 

 

This post was first published at Nagarro (nagarro-es.com)
