2015 Jan 15 10:12 AM
Hi guys,
I have a requirement: a GR/IR (goods receipt and invoice data) ALV report based on transaction MB5S. Many tables are involved, and I am retrieving the data with FOR ALL ENTRIES queries. The database share of the runtime shows 65%. After I created a secondary index on the non-key fields, it shows 55%. I want to bring it down to 30%. If anyone knows how that is possible, please reply.
Thanks & Regards,
Mahesh K
2015 Jan 15 10:25 AM
Avoid FOR ALL ENTRIES and try to use an inner join instead. You can also check the "parallel cursor" technique.
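For reference, a minimal sketch of the parallel cursor technique for a nested loop over two internal tables; the structures and the key field `vbeln` are illustrative assumptions, not taken from the original report:

```abap
" Assumption: illustrative header/item tables sharing the key vbeln.
TYPES: BEGIN OF ty_doc,
         vbeln TYPE vbeln,
         " ... further fields ...
       END OF ty_doc.

DATA: lt_header TYPE STANDARD TABLE OF ty_doc,
      lt_item   TYPE STANDARD TABLE OF ty_doc,
      lv_index  TYPE sy-tabix.

FIELD-SYMBOLS: <ls_header> TYPE ty_doc,
               <ls_item>   TYPE ty_doc.

SORT: lt_header BY vbeln,
      lt_item   BY vbeln.

LOOP AT lt_header ASSIGNING <ls_header>.
  " Position the inner cursor once per header via a binary search,
  " instead of scanning lt_item from the start with LOOP ... WHERE.
  READ TABLE lt_item TRANSPORTING NO FIELDS
       WITH KEY vbeln = <ls_header>-vbeln
       BINARY SEARCH.
  IF sy-subrc <> 0.
    CONTINUE.                     " header has no items
  ENDIF.
  lv_index = sy-tabix.
  LOOP AT lt_item ASSIGNING <ls_item> FROM lv_index.
    IF <ls_item>-vbeln <> <ls_header>-vbeln.
      EXIT.                       " past the matching block
    ENDIF.
    " ... process the matching header/item pair ...
  ENDLOOP.
ENDLOOP.
```

Because both tables are sorted and the inner loop only visits the matching block of rows, the nested loop drops from roughly O(n*m) to O(n log m + matches).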
2015 Jan 15 10:35 AM
Dear Thiago,
Thanks for the reply. I am already using the parallel cursor technique, i.e. for the ABAP runtime performance, but I need to improve the database performance. How can I do that?
Thanks & Regards,
Mahesh K
2015 Jan 15 10:26 AM
Hi Mahesh,
What exactly do you want to do? Increase the performance of standard MB5S, or do you have a similar custom-developed Z report that you want optimization techniques for?
Thanks.
Rudra
2015 Jan 15 10:39 AM
Dear Rudra,
I have developed a custom report and need to improve its database performance. What techniques are there for that? I am already using FOR ALL ENTRIES in the SELECT queries and a secondary index on the non-key fields, but the database share still shows 55%. I need it down to 30%. How can I do this?
Thanks & Regards,
Mahesh K
2015 Jan 15 10:56 AM
Hi,
Your requirement is really strange.
If you want the database share to be 30%, then simply increase the ABAP processing.
Also keep in mind that ABAP runtime performance and database access can behave differently in development than in production.
Kind regards, Rob Dielemans
2015 Jan 15 1:59 PM
Mahesh,
I believe you must be dealing with extensive data; that is inevitable in some cases like yours.
The basics explained by Thomas are an absolute must to consider.
For better performance I would suggest package-wise reading in the SELECT query, handling one package at a time for the further calculation / final internal table build.
You can have a look at OPEN CURSOR together with BYPASSING BUFFER.
Also have a look at:
1. SELECT statements where you do not need *
2. The WHERE condition plays a crucial role; include as many KEY fields as possible
3. Keep an eye on ST05, the SQL trace
I think you can do it; this has been discussed many times.
Regards.
Rudra
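A minimal sketch of the package-wise reading described above, using OPEN CURSOR with FETCH ... PACKAGE SIZE; the table (MSEG, mentioned later in this thread), the selection condition and the package size are illustrative assumptions:

```abap
DATA: lv_cursor TYPE cursor,
      lt_mseg   TYPE STANDARD TABLE OF mseg.

" BYPASSING BUFFER forces a direct database read, skipping the SAP
" table buffer where buffering would get in the way.
OPEN CURSOR WITH HOLD lv_cursor FOR
  SELECT * FROM mseg
    BYPASSING BUFFER
    WHERE werks = '1000'.          " illustrative selection

DO.
  " Read at most 10,000 rows per round trip; process each package
  " immediately so memory consumption stays bounded.
  FETCH NEXT CURSOR lv_cursor
        INTO TABLE lt_mseg
        PACKAGE SIZE 10000.
  IF sy-subrc <> 0.
    EXIT.                          " no more data
  ENDIF.
  " ... aggregate this package into the final internal table ...
ENDDO.

CLOSE CURSOR lv_cursor.
```

This pattern trades one big result set for several smaller ones, which keeps memory flat for large volumes without changing the total amount of data read.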
2015 Jan 15 11:16 AM
Forget the percentages and rather look at absolute runtimes.
Run an ABAP trace with transaction SAT and sort the result list descending by "net time", the top few items in the list will be your performance bottlenecks to be investigated further (e.g. inefficient DB access, exponential growth of internal tables, inefficient loop inside loop constructs, too many RFCs, etc.).
This has been discussed many times, please search for available information.
Thomas
2015 Jan 15 11:29 AM
Dear Thomas,
I have run SE30 (the old transaction). How do I run SAT, and how do I find those issues (e.g. inefficient DB access, exponential growth of internal tables, inefficient loop-inside-loop constructs, too many RFCs, etc.)?
2015 Jan 15 11:40 AM
2015 Jan 15 11:46 AM
Dear Thomas,
I have attached a screenshot of the Hit List. Please take a look.
2015 Jan 15 1:32 PM
Almost there: the "Hit list" tab sorted descending by the "Net" column would be better.
We can already (barely) see that DB access to RSEG, MSEG and LFA1 takes some time, so investigate the SQL statements and make sure that you use primary or secondary index fields in your WHERE conditions.
Please also search for previous discussions around how to solve specific performance issues.
Thomas
2015 Jan 15 11:18 AM
Hi Mahesh,
It also depends on the amount of data you want to fetch, along with the select-options and WHERE clauses you use. Make sure you use as many key fields as possible so that the DB time is reduced.
Regards
Gaurav
2015 Jan 15 11:23 AM
Dear Gaurav,
Thanks for the reply. I am using the key fields, and I have created a secondary index for the non-key fields, but it still shows 55%.
2015 Jan 15 11:27 AM
Hi Mahesh,
Follow the methods below, which we follow while developing code:
1. Fetch data from tables with an explicit column/field list; try to avoid * in the query.
2. Always use FIELD-SYMBOLS to read or loop over data in internal tables.
3. Avoid loop-inside-loop statements; try index-based looping instead.
4. Check that the driver internal table is not initial before a FOR ALL ENTRIES.
5. Try to reuse components; if possible, make the logic global.
6. If possible, use parallel processing techniques.
7. Sort the internal table before READ TABLE and use BINARY SEARCH.
8. Sort the data before deleting from an internal table, and use COMPARING with the field names.
9. Before a FOR ALL ENTRIES, sort the driver table and, if possible, pass only unique entries (duplicates may cost extra time when fetching records from the tables).
Guys, if you have more points to consider, please add them.
Regards.
Praveer.
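To illustrate points 2 and 7 above, a minimal sketch; the tables (an LFA1 lookup for RSEG rows) are borrowed from the trace discussed earlier in the thread, and the buffer structures are illustrative assumptions:

```abap
" Assumption: lt_rseg and lt_lfa1 were filled by earlier selects.
DATA: lt_rseg TYPE STANDARD TABLE OF rseg,
      lt_lfa1 TYPE STANDARD TABLE OF lfa1.

FIELD-SYMBOLS: <ls_rseg> TYPE rseg,
               <ls_lfa1> TYPE lfa1.

" Point 7: sort once so each READ TABLE ... BINARY SEARCH is a
" logarithmic lookup instead of a linear scan.
SORT lt_lfa1 BY lifnr.

" Point 2: ASSIGNING avoids copying every row into a work area.
LOOP AT lt_rseg ASSIGNING <ls_rseg>.
  READ TABLE lt_lfa1 ASSIGNING <ls_lfa1>
       WITH KEY lifnr = <ls_rseg>-lifnr
       BINARY SEARCH.
  IF sy-subrc = 0.
    " ... use <ls_lfa1>-name1 etc. for the output row ...
  ENDIF.
ENDLOOP.
```

The same effect can also be achieved by declaring lt_lfa1 as a SORTED or HASHED table with `lifnr` as key, which makes the fast lookup implicit.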
2015 Jan 19 5:15 AM
2015 Jan 15 11:29 AM
Dear Mahesh,
DB performance depends on a lot of things, not only on the ABAP code you write: server load, indexing at the DB level, system parameters, etc. So optimize your source code as much as possible; that is the only thing you can do at the ABAP level.
Regards
Gaurav
2015 Jan 15 12:05 PM
Hi Mahesh,
Try to use joins instead of FOR ALL ENTRIES, and join on key fields.
This will surely improve the performance.
Regards,
Pankaj
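A minimal sketch of such a join, using tables already named in this thread (MSEG and LFA1); the selected fields and the WHERE condition are illustrative assumptions, not from the original report:

```abap
TYPES: BEGIN OF ty_result,
         mblnr TYPE mseg-mblnr,
         mjahr TYPE mseg-mjahr,
         zeile TYPE mseg-zeile,
         matnr TYPE mseg-matnr,
         name1 TYPE lfa1-name1,
       END OF ty_result.

DATA lt_result TYPE STANDARD TABLE OF ty_result.

" One database round trip instead of a FOR ALL ENTRIES follow-up
" select; the join condition uses the vendor number key of LFA1.
SELECT m~mblnr m~mjahr m~zeile m~matnr l~name1
  FROM mseg AS m
  INNER JOIN lfa1 AS l
    ON m~lifnr = l~lifnr
  INTO TABLE lt_result
  WHERE m~bwart = '101'.           " illustrative movement type
```

The join lets the database combine the tables using its indexes, avoiding the intermediate internal table and the duplicate-row issues that FOR ALL ENTRIES brings.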
2015 Jan 16 8:46 AM
Hi Mahesh,
Please send us your program's source code and screenshots of the secondary indexes you created.
Thank you,
John
2015 Jan 16 9:04 AM
Hello,
Your problem is that you want to reduce the number of hits to the database. To improve the performance, please follow the steps below:
1. Before using FOR ALL ENTRIES, copy the internal table data into a temporary internal table with the same structure, delete the duplicate entries from it, and then use it.
Example: A is an internal table with 2000 entries, and A_Copy is a temporary internal table.
A_Copy = A. "Copy the values from internal table A to A_Copy
SORT A_Copy ASCENDING BY field-name1 field-name2. "Sort by the fields used in the WHERE condition
DELETE ADJACENT DUPLICATES FROM A_Copy COMPARING field-name1 field-name2.
2. In the WHERE condition, list the fields in the order of the table's indexes, if any.
Avoiding duplicate entries can improve the performance by more than 20%.
Hope this is helpful for you.
If you still face the same issue, share the complete SELECT query and I will give you an appropriate solution.
Regards
Balanand S
2015 Jan 16 11:17 AM
Hi Mahesh,
Please provide the fields you want to display in the output and the tables you used in your program.
Regards
Gaurav
2015 Jan 16 11:29 AM
You can also check whether the driver table is initial or not, and skip the query if it is initial, because FOR ALL ENTRIES with an initial driver table degrades performance: it selects all entries of the database table.
For example:
IF entry_tab IS NOT INITIAL.
  SELECT carrid connid fldate
         FROM sflight
         INTO CORRESPONDING FIELDS OF TABLE sflight_tab
         FOR ALL ENTRIES IN entry_tab
         WHERE carrid = entry_tab-carrid
           AND connid = entry_tab-connid.
ENDIF.
2015 Jan 17 5:25 AM
Dear Sneha,
I am using the initial check with FOR ALL ENTRIES. Anyhow, thank you very much; I got the output.
Thanks & Regards,
Mahesh K
2015 Jan 18 12:07 PM
Hi Mahesh - if you have achieved what you were looking for, then I assume you should close this thread as answered and give proper rewards to the experts who have shared their valuable thoughts!
- Thanks, Somnath