Application Development and Automation Discussions
Join the discussions or start your own on all things application development, including tools and APIs, programming models, and keeping your skills sharp.

Select Query taking too much time

Former Member

Hi All,

I have written a select query, for which I also created a secondary index on MSEG.

The fields are non-key fields, and during my analysis in ST05 I could see that the index is being hit when the select query is executed.

Now, in production we have huge data in MSEG: approximately 50 million records.

When we run our report in background, this select query consumes almost 10-12 hours. The select query is pulling data from MSEG for the past year (budat_mkpf).

Below is the Select Query:

SELECT mblnr mjahr zeile bwart matnr werks sobkz lifnr
       shkzg waers menge meins ebeln ebelp budat_mkpf
  FROM mseg
  INTO TABLE li_quant
  FOR ALL ENTRIES IN li_contract1
  WHERE matnr      = li_contract1-matnr
    AND werks      = li_contract1-werks
    AND lifnr      = li_contract1-lifnr
    AND budat_mkpf >= l_datum_v
    AND sobkz      IN ('K', ' ')
    AND bwart      IN li_bwart[].
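One general precaution with FOR ALL ENTRIES, independent of the index question: if the driver table li_contract1 is ever empty, the whole WHERE clause is dropped and all of MSEG is read; duplicates in the driver table also cause redundant database work. A minimal guard around the query above might look like this (a sketch, not the report's actual code):

```abap
* Guard the FOR ALL ENTRIES driver table before the select.
* An empty driver table would make the database read ALL of MSEG.
IF li_contract1 IS NOT INITIAL.
  " Remove duplicate key combinations so the database is not
  " asked for the same rows more than once.
  SORT li_contract1 BY matnr werks lifnr.
  DELETE ADJACENT DUPLICATES FROM li_contract1
         COMPARING matnr werks lifnr.

  SELECT mblnr mjahr zeile bwart matnr werks sobkz lifnr
         shkzg waers menge meins ebeln ebelp budat_mkpf
    FROM mseg
    INTO TABLE li_quant
    FOR ALL ENTRIES IN li_contract1
    WHERE matnr      = li_contract1-matnr
      AND werks      = li_contract1-werks
      AND lifnr      = li_contract1-lifnr
      AND budat_mkpf >= l_datum_v
      AND sobkz      IN ('K', ' ')
      AND bwart      IN li_bwart[].
ENDIF.
```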

Also, if I reduce my select query as below

SELECT mblnr mjahr zeile bwart matnr werks sobkz lifnr
       shkzg waers menge meins ebeln ebelp budat_mkpf
  FROM mseg
  INTO TABLE li_quant
  WHERE budat_mkpf >= l_datum_v.

and change the secondary index accordingly, will it be useful?

In our development system we don't have that much data for testing.

Also, is there any other way we could increase the performance?

We cannot use the primary keys, as the requirement is such.

Thanks in Advance,

Rahul Koul

14 REPLIES

wol
Active Participant

Hello Rahul,


How many entries are in the internal table li_contract1?


Another point: try to figure out which selection criteria reduce the result set the most, then use only those, provided you have indexes for them.


Regards

Stefan

Former Member

Thanks for replying, Stefan.

The li_contract1 table contains around 60,000 entries.

So my second thought is to select the data from MSEG based on the date only, and then delete the rows that have no match in the li_contract1 table.

But my doubt is: by decreasing the number of conditions in the WHERE clause, will it take more time to execute? I will change the secondary index accordingly and place only one field, BUDAT_MKPF, in the index.

I have also considered using a CURSOR statement. Will that be helpful?

Last time the Basis team raised the issue of production hanging because of this report's consumption of the secondary logs.

Former Member

Hi Rahul,

This is a common problem with the MSEG table.

The query time depends on the number of records you expect from the select query. If you still want to go deeper with this, check the execution plan with your Basis consultants; based on the cost shown in the execution plan, you can get an overview of the number of records and the time required as well.

Creating a secondary index for the above fields can help you.

The best solution is to narrow down the selection criteria, though I doubt whether this is possible.

There are a couple of SAP Notes that address this same issue; you can find and implement those.

Thanks and regards,

Bhaskar



Former Member

Hi Rahul,

The changes to the secondary index will be useful. Try to modify the secondary index to match the selection criteria, and restrict the secondary index to four fields. The fields in the secondary index should appear in the same order as in the selection criteria.

Thanks

Abhi


Hi Abhi,

Please see the screenshot of the select query and the associated index for that select query.

But it is still not helping.


Great.

Will it be possible for you to restrict the number of fields and check? Usually a secondary index is most helpful when only 3-4 fields are defined.

Along with that, you can run an SQL trace as well, to check where this query is spending the most time.

Thanks

Abhi


Hi Abhi, I am now trying a select query with only one condition in the WHERE clause, on BUDAT_MKPF.

And I will create a secondary index for the same.

But I cannot conclude whether this is going to work better, as we don't have that much data in development.

This issue comes up only in the production system.

SELECT mblnr mjahr zeile bwart matnr werks sobkz lifnr
       shkzg waers menge meins ebeln ebelp budat_mkpf
  FROM mseg
  INTO TABLE li_quant
  WHERE budat_mkpf >= l_datum_v.

Former Member

Hi Rahul,

As you are using the MSEG table to fetch data, it will take a long time if you are not using the primary key fields (MBLNR), and you are using the MATNR field in your FOR ALL ENTRIES.

I suggest you get the MBLNR values from a material-related table where MATNR is a key field, link that table with MSEG, and fetch using MBLNR.
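Rounak's two-step idea could be sketched as below. The sketch assumes the contract items in li_contract1 carry the purchasing document number (ebeln/ebelp), so that the purchase order history table EKBE, which is keyed by the purchasing document, can supply the material document numbers; the helper table li_docs is hypothetical, and whether EKBE fits depends on how li_contract1 is actually built.

```abap
* Step 1: read material document numbers from EKBE, which is
* keyed by the purchasing document (assumption: li_contract1
* carries EBELN/EBELP).
SELECT ebeln ebelp belnr gjahr buzei
  FROM ekbe
  INTO TABLE li_docs
  FOR ALL ENTRIES IN li_contract1
  WHERE ebeln = li_contract1-ebeln
    AND ebelp = li_contract1-ebelp.

* Step 2: read MSEG via its primary key (MBLNR/MJAHR/ZEILE),
* avoiding the expensive non-key access on MSEG entirely.
IF li_docs IS NOT INITIAL.
  SELECT mblnr mjahr zeile bwart matnr werks sobkz lifnr
         shkzg waers menge meins ebeln ebelp budat_mkpf
    FROM mseg
    INTO TABLE li_quant
    FOR ALL ENTRIES IN li_docs
    WHERE mblnr = li_docs-belnr
      AND mjahr = li_docs-gjahr
      AND zeile = li_docs-buzei.
ENDIF.
```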

Hope this solves your problem.

Regards

Rounak

Former Member

Hi Rahul,

Go to transaction ST05 and check the performance of both select statements. That way you will get an idea of which query to use and how long each statement takes.

Thanks & Regards,

Goutam Kolluru.

Former Member

Try to use the key fields of the MSEG table in the WHERE condition.

Former Member

Hi,

First check whether the secondary index is really used in the execution plan for the query. You can also use a HINT clause to make sure that the secondary index is used:

SELECT [..] FROM [..]
  WHERE [..] GROUP BY [..] HAVING [..]
  ORDER BY [..]
  %_HINTS <selektor> '<text>' <selektor> '<text>' [..].

You can find more about hints in OSS Note 12935. The order of fields in the WHERE clause is important, and all fields used in the WHERE clause should be included in the index.

Hope it will help.

BR

Paul

Former Member

Please run the SAT transaction using the aggregation option 'call per transaction' for a performance analysis.

Analyze the performance killers, which is the SELECT query in your case. Since the production environment holds about 50 million records, use an INNER JOIN instead of FOR ALL ENTRIES. Moreover, I would suggest avoiding the expensive IN operator where possible. Also use the secondary index, as you mentioned.
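A join can only replace FOR ALL ENTRIES if the driver data lives in a database table rather than an internal table. Assuming li_contract1 was itself selected from the purchasing item table EKPO, the join might be sketched like this; the join table and conditions must be adjusted to wherever the contract data really comes from:

```abap
* Sketch of the INNER JOIN alternative to FOR ALL ENTRIES
* (assumption: the contract items originate from EKPO; the
* vendor restriction would need EKKO or another source).
SELECT m~mblnr m~mjahr m~zeile m~bwart m~matnr m~werks
       m~sobkz m~lifnr m~shkzg m~waers m~menge m~meins
       m~ebeln m~ebelp m~budat_mkpf
  FROM mseg AS m
  INNER JOIN ekpo AS p
          ON m~matnr = p~matnr
         AND m~werks = p~werks
  INTO TABLE li_quant
  WHERE m~budat_mkpf >= l_datum_v
    AND m~sobkz      IN ('K', ' ')
    AND m~bwart      IN li_bwart[].
```

With a join the database can decide the access path for both tables in one statement, instead of receiving the 60,000 driver rows as bind-variable packages.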

Keep us updated on this post; it's an interesting one.


Regards,

Tanmoy




naveen_inuganti2
Active Contributor

Hi Rahul,

One of the major problems that I see with your scenario is BIG DATA!

As your production system holds a lot of data, the following two areas consume additional time:

1. Memory allocation for the program

2. Filtering data while selecting from MSEG (or any other table)

We need to work on these issues as top priority.

As you know, you can reduce the time taken for memory allocation (or for extending program memory) with the help of the Open/Fetch Cursor commands. Use a work area as the target (if there are control-break events while processing the data) or an internal table with a certain package size.
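The Open/Fetch Cursor approach with a package size could be sketched as below, so the program never holds all 50 million rows in memory at once. The row type ty_quant and the chunk size of 10,000 are assumptions for illustration:

```abap
* Sketch: stream MSEG in fixed-size packages instead of one
* giant INTO TABLE, keeping the memory footprint bounded.
DATA: l_cursor TYPE cursor,
      li_chunk TYPE STANDARD TABLE OF ty_quant. " ty_quant: structure with the listed fields

OPEN CURSOR WITH HOLD l_cursor FOR
  SELECT mblnr mjahr zeile bwart matnr werks sobkz lifnr
         shkzg waers menge meins ebeln ebelp budat_mkpf
    FROM mseg
    WHERE budat_mkpf >= l_datum_v.

DO.
  FETCH NEXT CURSOR l_cursor
        INTO TABLE li_chunk
        PACKAGE SIZE 10000.
  IF sy-subrc <> 0.
    EXIT.  " no more rows
  ENDIF.
  " Process/filter the chunk here (e.g. against li_contract1),
  " then let the next FETCH overwrite it.
ENDDO.

CLOSE CURSOR l_cursor.
```

WITH HOLD keeps the cursor open across implicit database commits, which matters for long-running background jobs.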

Using an index (or an index scan) will definitely help in achieving better performance. However, searching with duplicate values, with too many input values, or with LE or GE comparisons will kill the performance.

Somehow you need to check your changes against a good volume of data (perhaps you can use your sandbox system for that and load some dummy data).

You can also incorporate Oracle hints in the SELECT query that uses FOR ALL ENTRIES. This has a significant effect when the FOR ALL ENTRIES list has too many inputs.

&min_in_blocking_factor nn&
&max_in_blocking_factor nn&
&min_blocking_factor nn&
&max_blocking_factor nn&

Typically, I would recommend trying 20 for the blocking factor; the default is 5. The main reason for keeping the number low is that once the total number of bind variables in the WHERE clause reaches about 100, the SQL Tuning Advisor fails to run. The SQL Tuning Advisor is an Oracle tool that can execute tests using previous bind variables and, if there appears to be a better plan than the default database plan, provides a profile that can be implemented to override it. This has proven very handy in emergency production situations, and on tables that already have too many indexes to add another.
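Combining these hint tokens with the %_HINTS syntax shown earlier in the thread, a blocking factor of 20 might be applied like this; the exact hint text should be verified against the relevant OSS note for your database release before use:

```abap
* Sketch: raise the FOR ALL ENTRIES blocking factor to 20
* via a database hint (Oracle); value per the recommendation
* above, not a tested production setting.
SELECT mblnr mjahr zeile bwart matnr werks sobkz lifnr
       shkzg waers menge meins ebeln ebelp budat_mkpf
  FROM mseg
  INTO TABLE li_quant
  FOR ALL ENTRIES IN li_contract1
  WHERE matnr      = li_contract1-matnr
    AND werks      = li_contract1-werks
    AND lifnr      = li_contract1-lifnr
    AND budat_mkpf >= l_datum_v
  %_HINTS ORACLE '&max_in_blocking_factor 20&'.
```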

Regards,

Naveen Inuganti