Application Development Discussions
Join the discussions or start your own on all things application development, including tools and APIs, programming models, and keeping your skills sharp.

Code Optimization for handling large volume of data

Former Member
0 Kudos

Hi All,

We are facing a problem when executing a report: it takes a very long time to run, and many times the program terminates with the dump "Timeout: Program terminated because of endless loop".

The internal table that has to be looped over has more than 8.5 lakh (850,000) records,

and each pass of the loop executes two READ statements and one SELECT statement (unavoidable).

(We have already applied almost all of the usual optimization techniques.)

Please suggest what can be done in such a situation.

Thanks and Regards,

Sushil Hadge.


24 REPLIES

Former Member
0 Kudos

Try running the report in the background, since you say the SELECT inside the loop is unavoidable.

Regards

Shiva

Former Member
0 Kudos

Hi,

Since you have said that some statements and loops are unavoidable, try running your program in the background. You shouldn't get a timeout dump then. Try this.

Thanks

Nayan

Former Member
0 Kudos

Hi,

You can avoid the SELECT inside the loop by using SELECT ... FOR ALL ENTRIES.
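
For example, something along these lines (a sketch only; the table and field names are borrowed from the code posted further down the thread):

IF it_dfkkop IS NOT INITIAL.
  SELECT gpart hkont waers betrw
    FROM dfkkop
    INTO TABLE it_subtot
    FOR ALL ENTRIES IN it_dfkkop
    WHERE hkont = it_dfkkop-hkont
      AND gpart = it_dfkkop-gpart.
ENDIF.

Two caveats: if the driver table is empty, FOR ALL ENTRIES drops the conditions that refer to it and reads far more than intended, hence the IS NOT INITIAL guard; and the result set is de-duplicated, so totals over BETRW can come out wrong unless enough key fields are in the field list.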

Regards,

Bujji

Former Member
0 Kudos

I have already tried running it in the background.

It still takes a lot of time. I scheduled it to run last night; it has now reached an execution time of 45,000 seconds and is still running (Active).

Edited by: Sushil Hadge on Jun 4, 2008 2:55 PM

Former Member
0 Kudos

Why is the SELECT unavoidable? I would like to see the code. There are always alternatives.

0 Kudos

Hi Martin,

Following is the piece of code:



SELECT bukrs gpart hkont waers
  FROM dfkkop
  INTO TABLE it_dfkkop
  WHERE bukrs = p_bukrs
    AND bldat IN so_bldat
    AND hkont IN so_hkont.

SORT it_dfkkop BY gpart.

LOOP AT it_dfkkop INTO wa_dfkkop.
  ...
  <Read statement>
  <Read statement>
  ...
  ON CHANGE OF wa_dfkkop-gpart.
    ...
    SELECT gpart hkont waers betrw
      FROM dfkkop
      INTO TABLE it_subtot
      WHERE hkont = wa_dfkkop-hkont
        AND gpart = wa_dfkkop-gpart.

    IF it_subtot IS NOT INITIAL.
      LOOP AT it_subtot INTO wa_subtot.
        v_sum = v_sum + wa_subtot-betrw.
      ENDLOOP.
    ENDIF.
    ...
  ENDON.
  ...
ENDLOOP.


Please suggest how this can be improved.

Thanks ,

Sushil

Edited by: Sushil Hadge on Jun 4, 2008 3:12 PM

0 Kudos

Why are you reading table DFKKOP twice?

How many records are actually in this table?

0 Kudos

Table DFKKOP has a large number of records.

After the first SELECT query, the internal table it_dfkkop has about 850,000 records.

For each unique GPART value there is a subtotal to be calculated across the table, which includes reading records that do not form part of the output.

It is a little complicated; I hope that is clear.

Best Regards,

Sushil

ThomasZloch
Active Contributor
0 Kudos

8.5 lac = 850,000?

Nothing is really "unavoidable" in my humble experience, but you could start by posting the READ and SELECT statements here for further analysis. Also include how you declared the internal tables that are being read.

Greetings

Thomas

0 Kudos

Yes, Thomas, around 850,000 records.

0 Kudos

Do the READ statements use BINARY SEARCH?

Are you using an index on the database table?

0 Kudos

Hi Martin,

No, I am not using an index on the database table;

I am not aware of that concept.

Thanks ,

Sushil

0 Kudos

The fields you are using in the WHERE clause: do they map to an existing index on the table? Ask your Basis team to look. If there is not one, this will certainly slow down the processing.

Also, are you using the addition BINARY SEARCH for the READ statements? If not, you should be.

What key is the READ statement using? Do you need to do it for each iteration of the LOOP?
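
For example, with hypothetical names it_tab, it_main and keyfield (the table must be sorted by the search fields, otherwise BINARY SEARCH returns unpredictable results):

" Sort once, before the loop, by exactly the fields used in the READ
SORT it_tab BY keyfield.

LOOP AT it_main INTO wa_main.
  READ TABLE it_tab INTO wa_tab
       WITH KEY keyfield = wa_main-keyfield
       BINARY SEARCH.
  IF sy-subrc = 0.
    " entry found; wa_tab is valid here
  ENDIF.
ENDLOOP.

This turns each lookup from a linear scan into a binary search, which matters a great deal over 850,000 loop passes.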

0 Kudos

> Hi Martin,
>
> No, I am not using an index on the database table; I am not aware of that concept.
>
> Thanks,
>
> Sushil

But you must know about database indexes if, as you say, you have used all the optimization techniques. We don't seem to have this table in our 4.7 R/3 system, so I can't comment on its indexes. But I'd get rid of that second SELECT and make sure that your first SELECT contains all the data needed to calculate the totals, and work with that instead.

Even with 850,000 records, a program like this should not take 12-plus hours to run, and I bet it is that second SELECT that is the problem. Is DFKKOP indexed on HKONT or GPART? If it is indexed on GPART, does BUKRS (which you're not using in the second SELECT) come before it in the index?
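
In outline, that could look like this (a sketch only: BETRW is added to the first SELECT, so it_dfkkop's line type would need the extra field, and the totals come from a control break. Note it totals over the same BUKRS/BLDAT/HKONT selection as the output; if the subtotal really must include rows outside that selection, the first SELECT would have to be widened instead):

SELECT bukrs gpart hkont waers betrw
  FROM dfkkop
  INTO TABLE it_dfkkop
  WHERE bukrs = p_bukrs
    AND bldat IN so_bldat
    AND hkont IN so_hkont.

SORT it_dfkkop BY gpart.

LOOP AT it_dfkkop INTO wa_dfkkop.
  v_sum = v_sum + wa_dfkkop-betrw.
  AT END OF gpart.
    " v_sum is the subtotal for this GPART; use it, then reset
    CLEAR v_sum.
  ENDAT.
ENDLOOP.

One database round trip instead of one per partner, and no repeated scans of a second internal table.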

0 Kudos

Hi Martin,

No, currently I am not using a binary search; I will change that now. Thanks for suggesting it.

Yes, the READ statement is required for each iteration.

Thanks ,,,

Sushil.

0 Kudos

Can you post the code for the READ statements as well? Even though you say they are needed for each iteration: do you just mean the data is needed, or do the key fields you are using change on each pass of the loop?

0 Kudos

Hi Christine ,

I also feel that the second SELECT is the problem.

I will remove the second SELECT, get all the data from DFKKOP that is necessary in one go,

and do a loop within a loop.
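
Something like this, perhaps (a sketch; it_all and wa_all are hypothetical names for the second internal table holding the wider data set the subtotals need):

" Filled once, before the main loop: no BLDAT restriction,
" so rows outside the report selection are included in the totals
SELECT gpart hkont betrw
  FROM dfkkop
  INTO TABLE it_all
  WHERE bukrs = p_bukrs
    AND hkont IN so_hkont.
SORT it_all BY gpart hkont.

" Inside the main loop, once per new GPART:
CLEAR v_sum.
LOOP AT it_all INTO wa_all
     WHERE gpart = wa_dfkkop-gpart
       AND hkont = wa_dfkkop-hkont.
  v_sum = v_sum + wa_all-betrw.
ENDLOOP.

On a plain standard table, LOOP ... WHERE still scans the whole table on every pass; declaring it_all as a SORTED TABLE WITH NON-UNIQUE KEY gpart hkont lets the kernel narrow the scan to the matching block of rows.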

--Sushil

0 Kudos

Hi Martin,

Following are the read statements...



LOOP AT it_dfkkop INTO wa_dfkkop.

  READ TABLE it_but000 WITH KEY partner = wa_dfkkop-gpart INTO wa_but000.
  IF sy-subrc EQ 0.
    IF wa_but000-type = 'X'.
      READ TABLE it_dfkkbp WITH KEY partner = wa_dfkkop-gpart
                                    taxtype = 'XXX'
                           INTO wa_dfkkbp.
      ...
    ENDIF.
    ...
  ENDIF.

ENDLOOP.

--Sushil

0 Kudos

Are you telling me that the field wa_dfkkop-gpart is different for each iteration, i.e. that this field value is not duplicated anywhere in the internal table?

0 Kudos

The wa_dfkkop-gpart value can be the same for many iterations,

so it would be good if I move the READ into the

ON CHANGE OF ... ENDON block for wa_dfkkop-gpart. Thanks for suggesting it.
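
Something like this (a sketch; AT NEW gpart is shown instead of ON ... ENDON, because AT NEW is tied to the loop's sort order while ON CHANGE OF compares against its own global helper field):

LOOP AT it_dfkkop INTO wa_dfkkop.
  AT NEW gpart.
    " Read the BUT000 data only once per partner
    CLEAR wa_but000.
    READ TABLE it_but000 INTO wa_but000
         WITH KEY partner = wa_dfkkop-gpart
         BINARY SEARCH.
  ENDAT.
  " wa_but000 keeps its value for all rows of the same GPART
ENDLOOP.

One caveat: inside AT NEW ... ENDAT, the components of wa_dfkkop to the right of GPART are masked with asterisks, so keep only the GPART-dependent logic in that block.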

I would like to close this thread for now. Thanks to all those who have provided valuable inputs; I will try to implement these techniques.

Best regards,

Sushil Hadge

Edited by: Sushil Hadge on Jun 4, 2008 4:57 PM

christine_evans
Active Contributor
0 Kudos

I would also be concerned about any report that needs to process 850,000 records (I take 8.5 lac to mean 850,000) on each run and takes over 12 hours to do it. What sort of data are you reporting on that you need to do this? You can't be displaying all 850,000 records in your output, so you must be doing some kind of consolidation. Is there any way you can record the results of that consolidation in a custom table, so that you can reuse some of the output of a previous run in the next run and not have to start from scratch each time? Or is there some way you can break the report down so that it doesn't have to do everything in a single loop?

It's a bit difficult to suggest anything without knowing your data. I'd go and have another look at the report and see if there is a way to reorganise what it is doing; there are always options to do things differently.

0 Kudos

Hi Christine ,

The number of records is huge because the report is a yearly report.

The report will be executed in the background, and then a file will be created from the spool and placed on the application server.

The program works fine for 60,000 to 80,000 records.

I am also trying to find a way either to optimize it where possible or to break down the report/loop.

Thanks and Regards,

Sushil.

Former Member
0 Kudos

Hi,

i. Make all your internal tables hashed tables wherever the lookup key is unique. This can cut the runtime of your program considerably; see the sketch after this list.

ii. Use READ TABLE on a prefilled internal table inside the loop instead of a SELECT query.
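
A sketch of point (i), assuming it_but000 is only ever read with a unique PARTNER (ty_but000 is a hypothetical stand-in for the existing line type):

DATA: it_but000 TYPE HASHED TABLE OF ty_but000
                WITH UNIQUE KEY partner,
      wa_but000 TYPE ty_but000.

" Hash access: cost is independent of table size. Note WITH TABLE KEY,
" not BINARY SEARCH; hashed tables are neither sorted nor index-based.
READ TABLE it_but000 INTO wa_but000
     WITH TABLE KEY partner = wa_dfkkop-gpart.

The unique-key requirement is the catch: if the same PARTNER can occur more than once, a hashed table will reject the duplicates, and a sorted table with a non-unique key (where key reads are binary automatically) is the safer choice.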

Reward points if helpful.

With Thanx,

Durai murugan.

Former Member
0 Kudos

I would like to implement the discussed techniques.