Application Development and Automation Discussions
Join the discussions or start your own on all things application development, including tools and APIs, programming models, and keeping your skills sharp.

Speed improvement at Loop-Statement

michael_fallenbchel
Active Participant

Hi experts,

I've got a problem with a short loop statement:

First, my program fills an internal table (with about 50,000 entries). Then I have to loop over this internal table to find all entries that I can sum up (based on 4 criteria).

This is my code:

FORM sum_gleiche_positionen.

  DATA: ls_output  TYPE STANDARD TABLE OF gs_out WITH HEADER LINE,
        ls_table   TYPE STANDARD TABLE OF gs_out WITH HEADER LINE,
        lt_table   TYPE STANDARD TABLE OF gs_out WITH HEADER LINE,
        lt_results TYPE match_result_tab,
        l_netwr    TYPE netwr,
        l_kwmeng   TYPE p DECIMALS 0,
        l_counter  TYPE i.


  SORT gt_output BY vbeln matnr zzwldat abgru ASCENDING.

  LOOP AT gt_output INTO gs_output.

    CLEAR: l_netwr, l_kwmeng, l_counter.


    READ TABLE gt_sumtable  INTO lt_table
                            WITH KEY vbeln   = gs_output-vbeln
                                     matnr   = gs_output-matnr
                                     zzwldat = gs_output-zzwldat
                                     abgru   = gs_output-abgru.
    IF sy-subrc = 0. "document already processed
      CONTINUE.
    ENDIF.



    LOOP AT gt_output INTO ls_output WHERE vbeln   = gs_output-vbeln   AND
                                           matnr   = gs_output-matnr   AND
                                           zzwldat = gs_output-zzwldat AND
                                           abgru   = gs_output-abgru.

      l_netwr  = l_netwr + ls_output-netwr.
      l_kwmeng = l_kwmeng + ls_output-kwmeng.

      ADD 1 TO l_counter.


    ENDLOOP.
    ls_table = ls_output.
    IF l_counter > 1.
      CLEAR: ls_table-pstyv, ls_table-posnr.
      ls_table-netwr = l_netwr.
      ls_table-kwmeng = l_kwmeng.
    ENDIF.
    APPEND ls_table TO gt_sumtable.

  ENDLOOP.

ENDFORM.                    "sum_gleiche_positionen

But this takes too much time -> timeout!

What can I do to improve speed in this statement?

1 ACCEPTED SOLUTION

29 REPLIES
Former Member

Hi Michael,

Do not use loop within a loop. Here is what you can do:

First, remove the unnecessary entries, then do the computation you want:


LOOP AT gt_output.
  READ TABLE gt_sumtable.
  IF sy-subrc EQ 0.
    gs_output-flag = 'X'.
    MODIFY gt_output FROM gs_output TRANSPORTING flag.
  ENDIF.
ENDLOOP.
DELETE gt_output WHERE flag = 'X'.

IF gt_output[] IS NOT INITIAL.
     LOOP AT gt_output.
" Do your calculation logic
     ENDLOOP.
ENDIF.

Regards,

Mawi

Edited by: Mawi C. Ng on May 4, 2009 10:04 AM



Hi mawi,

first - thanks for your reply.

But I'm not sure if I understand you right:

In your code, you do a loop over my itab (gt_output) and then delete all entries that match my statement.

After that, I make a "new loop" over my itab and do the calculation. But this cannot work, because all entries I want to calculate with are deleted...

Am I right?

ThomasZloch
Active Contributor

At quick glance it seems that you should look into using the COLLECT statement to aggregate NETWR and KWMENG by VBELN, MATNR, ZZWLDAT and ABGRU. Maybe you can even do it already during data selection with the aggregate function SUM ().
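For illustration, aggregating during selection could look roughly like this. This is only a sketch: the source table (VBAP) and the presence of the customer field ZZWLDAT on it are assumptions, since the thread never shows the actual SELECT.

```abap
* Sketch only: let the database aggregate instead of looping in ABAP.
* Assumptions: data comes from VBAP, ZZWLDAT is a customer field on it.
SELECT vbeln matnr zzwldat abgru
       SUM( netwr )  AS netwr
       SUM( kwmeng ) AS kwmeng
  FROM vbap
  INTO CORRESPONDING FIELDS OF TABLE gt_sumtable
  GROUP BY vbeln matnr zzwldat abgru.
```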

Do you need GT_OUTPUT for further processing or is GT_SUMTABLE your final result?

Thomas


Hi Thomas,

gt_sumtable is my final result.

Unfortunately, I cannot sum NETWR and KWMENG during data selection, because the user wants to choose whether he needs the data summed or not.

I will have a look at the COLLECT statement - maybe this can help

Thanks a lot!


And if COLLECT should not work, at least make sure that you access internal table data inside an outer loop always with binary search, either explicit (READ TABLE ... BINARY SEARCH) or implicit (READ or LOOP via key fields of tables declared as SORTED).
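The SORTED-table variant of this advice can be sketched as follows (assuming the line type gs_out from the code posted above; a NON-UNIQUE key, since duplicates exist before summing):

```abap
* Sketch: with a SORTED key, READ and LOOP via the key fields
* use binary search implicitly - no BINARY SEARCH addition needed.
DATA gt_sumtable TYPE SORTED TABLE OF gs_out
     WITH NON-UNIQUE KEY vbeln matnr zzwldat abgru.

READ TABLE gt_sumtable TRANSPORTING NO FIELDS
     WITH KEY vbeln   = gs_output-vbeln
              matnr   = gs_output-matnr
              zzwldat = gs_output-zzwldat
              abgru   = gs_output-abgru.
```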

Thomas

P.S.

> the user want to select if he needs the data summed or not

this could also be handled via ALV-list layouts.

Rui_Dantas
Active Contributor

You have 2 performance problems, and they can both be solved by using binary search.

Since the table is not declared as SORTED, the easiest way to adapt your code would be:

1) For the LOOP inside the LOOP, make a READ first with BINARY SEARCH, and then LOOP FROM sy-tabix (with no WHERE, and EXIT when you find a record that does not match your criteria).

2) For the READ inside the LOOP, add BINARY SEARCH (your results table is also sorted from what I see, so that should be ok).

Hope it is clear enough.

Former Member

Hallo Michael,

your coding is a perfect example for quadratic coding and long long runtimes ....

Please read here:

Nonlinearity: The problem and background

/people/siegfried.boes/blog/2007/02/12/performance-problems-caused-by-nonlinear-coding

Measurements on internal tables: Reads and Loops:

/people/siegfried.boes/blog/2007/09/12/runtimes-of-reads-and-loops-on-internal-tables

The first will explain the problem, the second the main reason for nonlinearity.

Use SORTED tables whenever you do nested processing!

Or the more complicated standard tables, with BINARY SEARCH and LOOP ... see the second blog.

Sort the inner table, not the outer table (that is a waste of time!)

But your program has other problems:

The READ has no consequence, or? And it is INTO lt_table, not ls_table:


READ TABLE gt_sumtable  INTO lt_table
                            WITH KEY vbeln   = gs_output-vbeln
                                 matnr   = gs_output-matnr
                                 zzwldat = gs_output-zzwldat
                                 abgru   = gs_output-abgru.

The loop can stop if the counter is larger than 1; there are no other consequences.

I don't think that your LOOP inside the LOOP makes sense; it is on the same table.

If the inner loop finds two records or more, then it is also executed identically

two times or more, because the outer loop has the same records.

What is the actual task?

Siegfried


Hi all,

hope I can respond to all your replies

@Thomas: Sure, I can manage it with ALV - but not all requirements. I will check the COLLECT statement asap.

@Rui: Yes, I understand your idea! I'm not sure if this can help me improve speed (is it faster to make a loop with "loop at xy into i_xy from 1" than with "loop at xy into i_xy where..."?)

@Siegfried: You are right - it's not nice coding. I will read your links asap! And sure, my READ statement has a consequence: if the actual data has already been read (and thus written into table gt_sumtable), the next record will be read.

Code I have at the moment:

FORM sum_gleiche_positionen.

  DATA: ls_output  TYPE STANDARD TABLE OF gs_out WITH HEADER LINE,
        ls_table   TYPE STANDARD TABLE OF gs_out WITH HEADER LINE,
        lt_table   TYPE STANDARD TABLE OF gs_out WITH HEADER LINE,
        lt_results TYPE match_result_tab,
        l_netwr    TYPE netwr,
        l_kwmeng   TYPE p DECIMALS 0,
        l_counter  TYPE i,
        l_tabix    TYPE i.


  SORT gt_output BY vbeln matnr zzwldat abgru ASCENDING.

  LOOP AT gt_output INTO gs_output.
    l_tabix = sy-tabix.

    CLEAR: l_netwr, l_kwmeng, l_counter.


    READ TABLE gt_sumtable  INTO lt_table
                            WITH KEY vbeln   = gs_output-vbeln
                                     matnr   = gs_output-matnr
                                     zzwldat = gs_output-zzwldat
                                     abgru   = gs_output-abgru.
    IF sy-subrc = 0. "document already processed
      CONTINUE.
    ENDIF.

    LOOP AT gt_output INTO ls_output FROM l_tabix.

      IF ls_output-vbeln   NE gs_output-vbeln   OR
         ls_output-matnr   NE gs_output-matnr   OR
         ls_output-zzwldat NE gs_output-zzwldat OR
         ls_output-abgru   NE gs_output-abgru.
        EXIT.
      ENDIF.

      l_netwr  = l_netwr + ls_output-netwr.
      l_kwmeng = l_kwmeng + ls_output-kwmeng.

      ADD 1 TO l_counter.

    ENDLOOP.


    ls_table = gs_output.
    IF l_counter > 1.
      CLEAR: ls_table-pstyv, ls_table-posnr.
      ls_table-netwr = l_netwr.
      ls_table-kwmeng = l_kwmeng.
    ENDIF.
    APPEND ls_table TO gt_sumtable.

  ENDLOOP.

ENDFORM.                    "sum_gleiche_positionen


Hi Michael,

Yes, you can believe it will be MUCH faster. Just try it and you will be amazed at the wonders of binary search.

What you have now is:

LOOP.
   LOOP WHERE.

The second LOOP will go through all 50,000 entries to find the ones that match (and you do this second LOOP 50,000 times!).

You can easily change that to:

LOOP.
      "you can use binary search because the table was sorted in advance
      "it will be quite fast
      READ ... WITH KEY... BINARY SEARCH. 

      "now, no need to go through all entries; just start by the first one found by READ
      "this access is pretty much immediate
      LOOP ... FROM sy-tabix.       

           "you have to make sure you quit the LOOP when a non-matching entry is found
           IF   ... <> ... OR .... <> .... OR ... .
                  EXIT.
           ENDIF.
    
           ......
       ENDLOOP.
ENDLOOP.

Edited by: Rui Pedro Dantas on May 4, 2009 12:07 PM



Hi Rui,

OK, when you say it will be faster, then it will be faster

But I don't need the READ statement!

LOOP.

"you can use binary search because the table was sorted in advance

"it will be quite fast

READ ... WITH KEY... BINARY SEARCH.

I already know from which line I have to start (the line in which my outer loop is):

LOOP AT gt_output INTO gs_output.
    l_tabix = sy-tabix.

So I can do the inner loop this way:

LOOP AT gt_output INTO ls_output FROM l_tabix.

I will test this in our Test system - hope it will improve speed a little bit

Thank you Rui!

Former Member

but you need the BINARY SEARCH .... in your first READ!

Also sort that table; actually no, I think it is created sorted in the loop, so just add BINARY SEARCH.

Then the APPEND is also o.k.

The second one is o.k.

Overall it is still hard to understand; there is still improvement possible.

Also start the first LOOP from a tabix, and use the tabix where the second loop was when the EXIT happened; then the READ is not necessary either.

But I must say, Thomas was right with his statement: your program is no more than a complicated way to program a COLLECT, please do that:


*  no sort !! SORT gt_output BY vbeln matnr zzwldat abgru ASCENDING.

  LOOP AT gt_output INTO gs_output.
    COLLECT gs_output INTO TABLE gt_sumtable.
  ENDLOOP.

gt_sumtable is a hashed table with

UNIQUE KEY vbeln matnr zzwldat abgru

plus these 2 summed fields:

netwr and kwmeng.

Better use a hashed table, but standard is also possible.
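Put together, the hashed-table COLLECT could look like this. This is a sketch: the six-field structure ty_sum and the type of ZZWLDAT are assumptions, since the real structure is never shown in the thread.

```abap
TYPES: BEGIN OF ty_sum,          " hypothetical six-field structure
         vbeln   TYPE vbeln,
         matnr   TYPE matnr,
         zzwldat TYPE datum,     " assumption: customer date field
         abgru   TYPE abgru,
         netwr   TYPE netwr,
         kwmeng  TYPE p DECIMALS 0,
       END OF ty_sum.

DATA: gt_sum TYPE HASHED TABLE OF ty_sum
             WITH UNIQUE KEY vbeln matnr zzwldat abgru,
      gs_sum TYPE ty_sum.

LOOP AT gt_output INTO gs_output.
  MOVE-CORRESPONDING gs_output TO gs_sum.
  COLLECT gs_sum INTO gt_sum.   " sums netwr and kwmeng per key
ENDLOOP.
```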

Siegfried


Hi Siegfried,

thank you very much for your reply!

Just one question:

LOOP AT gt_output INTO gs_output.
     COLLECT gs_output  INTO TABLE gt_sumtable.
ENDLOOP.

When I understand the SAP Help right, a COLLECT can be used to sum all fields that are not key fields. The problem is that gt_output also has other fields. Maybe I have to make a new itab with just these special fields I want to sum... I have to test it...

Former Member

just move the six fields explicitly inside the loop before you do the COLLECT.

The key definition refers to gt_sumtable.

Siegfried


You won't believe it - I made the statement "completely" new - now it runs in about 5 seconds!!

Made it this way:

SORT gt_output BY vbeln matnr zzwldat abgru ASCENDING.

  l_tabix = 1.
  DESCRIBE TABLE gt_output LINES l_rows.
  DO.
    IF l_rows = l_tabix. l_ende = 'X'. ENDIF.
    CLEAR: l_netwr, l_kwmeng, l_counter.
    READ TABLE gt_output INTO gs_output INDEX l_tabix.
    LOOP AT gt_output INTO ls_output FROM l_tabix.

      IF ls_output-vbeln   NE gs_output-vbeln   OR
         ls_output-matnr   NE gs_output-matnr   OR
         ls_output-zzwldat NE gs_output-zzwldat OR
         ls_output-abgru   NE gs_output-abgru.
        l_tabix = sy-tabix.
        EXIT.
      ENDIF.

      l_netwr  = l_netwr + ls_output-netwr.
      l_kwmeng = l_kwmeng + ls_output-kwmeng.

      ADD 1 TO l_counter.

    ENDLOOP.
    ls_table = gs_output.
    IF l_counter > 1.
      CLEAR: ls_table-pstyv, ls_table-posnr.
      ls_table-netwr = l_netwr.
      ls_table-kwmeng = l_kwmeng.
    ENDIF.
    APPEND ls_table TO gt_sumtable.
    IF l_ende = 'X'. EXIT. ENDIF.
  ENDDO.

Thank you all for your input!


Might be fast, but will you understand this code part yourself four weeks later, let alone a third person?

Former Member

It is much faster, but it is not really fast.

The recommended solution was the COLLECT, which would be faster by factors and much shorter coding:


LOOP

move 6 fields

COLLECT

ENDLOOP.

Collecting 50,000 records cannot take seconds; with good hardware, about 1 microsecond per record gives 50 ms. There is a factor of 100 still left in your solution!

Siegfried


Ok, tomorrow I will do a test with the collect, but for today it's enough


Hi Siegfried,

sorry for the new question - but I'm not sure how to build the whole statement:

LOOP
 
move 6 fields
 
COLLECT
 
ENDLOOP.

How do I do the move? Do I need a new table?

HermannGahm
Product and Topic Expert

Hi Michael,

the attached code should be enough. For gt_sumtable, use a hashed table with key vbeln, matnr, zzwldat, abgru.

If you use a standard table, don't make any other change (besides the COLLECT) to gt_sumtable.

Kind regards,

Hermann


LOOP AT gt_output INTO ls_output.

  gt_sumtable-vbeln   = ls_output-vbeln.
  gt_sumtable-matnr   = ls_output-matnr.
  gt_sumtable-zzwldat = ls_output-zzwldat.
  gt_sumtable-abgru   = ls_output-abgru.
  gt_sumtable-netwr   = ls_output-netwr.
  gt_sumtable-kwmeng  = ls_output-kwmeng.

  COLLECT gt_sumtable.

ENDLOOP.
 


Hi Hermann,

perfect, exactly what I wanted!

Last question - how do I add the fields that I don't have in the hashed gt_sumtable?


Hi,

declare the structure manually using only the fields which are required, instead of using the standard one.

Regards,

Siddarth


I've declared the hashed table (gt_sumtable) manually, just with the 6 fields (vbeln, matnr, abgru & zzwldat as key, netwr and kwmeng as type p).

After the COLLECT I've got a "summed" table - but now the extra information (the other fields of gt_output) is missing.

I can't declare the hashed table with all the fields, because not all of them are the right type to do a COLLECT... maybe it's too early this morning


Hi Michael,

I was following this long chain of replies regarding the performance improvement of the loop statement on the internal table. I have one suggestion for you: can you try using field symbols instead of normal internal table work areas?

Hope this will help.

Thanks,

Samantak.


Hi,

you can declare all the fields, even if they are of type i, p, or f. If you don't want them to be added up by the COLLECT statement (and keeping in mind that their values are the same for all the records)...

you can declare them as key fields... they will not get summed up in the hashed summed table...

Regards,

Siddarth


Hi Siddarth,

sure, I can declare them as key fields - but then my COLLECT won't work correctly (rows are only collected when all key fields are identical, which they aren't).


hi,

that is true... COLLECT will not work correctly if the values are not the same, and I had mentioned that in an earlier post also...

So it is mandatory that you have all the fields in the summed table... Can't you do it the other way: once you get the summed records, use both tables - take the entries from the first table, check whether the record exists in the other one using a READ statement, get the summed value from the summed table, and proceed from there?

Otherwise you can do it this way:

loop at gt_output.
  read table gt_summed with key ...
  if sy-subrc = 0.
    "add 1 to a count field in gs_summed
    "add the other required fields
    modify the record in the table.
  else.
    append the record to the table.
  endif.
endloop.

In this case you need not write a loop within a loop... just a READ statement, and your issue will be much better than it was earlier with the loop inside a loop...

Regards,

Siddarth

Former Member

> I can't declare the hashed table with all the fields, because not all of them are the right type to do a

> collect

What extra fields are there? You never mentioned them:

+ you can add character-like extra fields; they are of course not summed up.

+ other numerical fields would also be summed up; maybe you don't want that.

It is possible to write a COLLECT yourself, where you can control what is added and what is not:


sort table.
loop.

  if ( ....keyfields NE keys_old ).
    append ...
    keys_old = keyfields.
  else.
    move.
    add.
  endif.
endloop.

Siegfried


Hi Siegfried,

+ you can add character-like extra fields; they are of course not summed up.

But I would have to declare them as key fields - and then the COLLECT won't work correctly.

+ other numerical fields would also be summed up; maybe you don't want that.

Exactly.

Now I work with my summed table (with COLLECT and just the 6 fields); after that, I loop over this table, read the other one (with READ) and insert the result into a third table.

It works - maybe there is a possibility to do this faster... I will check when I get the time.
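A sketch of that second enrichment step (the names gs_sum, gs_result and gt_result are hypothetical; gt_output is assumed to still be sorted by the four key fields, so BINARY SEARCH is valid):

```abap
* For each summed row, fetch the extra fields from the first
* matching detail row and build the final table.
LOOP AT gt_sumtable INTO gs_sum.
  READ TABLE gt_output INTO gs_output
       WITH KEY vbeln   = gs_sum-vbeln
                matnr   = gs_sum-matnr
                zzwldat = gs_sum-zzwldat
                abgru   = gs_sum-abgru
       BINARY SEARCH.
  IF sy-subrc = 0.
    MOVE-CORRESPONDING gs_output TO gs_result. " extra fields
    gs_result-netwr  = gs_sum-netwr.           " keep the summed values
    gs_result-kwmeng = gs_sum-kwmeng.
    APPEND gs_result TO gt_result.
  ENDIF.
ENDLOOP.
```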

Thank you very much for your inputs

Former Member

I have blogged about this since I faced this problem too. I think you might have solved your problem by now, but I am sure this might help you in future development:

http://nafran.blogspot.com/search/label/performance

Nafran