Application Development Discussions
Join the discussions or start your own on all things application development, including tools and APIs, programming models, and keeping your skills sharp.

Performance of background jobs?

Former Member

I am running a job in PROD. It took 21 hours to process an input file of 300,000 records.

The processing is quite simple. The main logic is attached.

Any thoughts on why this would be running so slow?

I know that the BAPI does a significant amount of processing and database updates. It looks like this is where the bottleneck is.

Can any gains be made by using smaller input files? I am wondering if there are memory management issues with a large file and all of the processing for each record.

Any thoughts...J.J

Here is the main logic:

DATA: l_equipment LIKE bapi_itob_parms-equipment,
      l_data_install LIKE bapi_itob_eq_install_ext,
      l_bapiret2 LIKE bapiret2.

LOOP AT it_intab.

* Get Child Equipment Number
  SELECT SINGLE equnr INTO l_equipment
    FROM equi
    WHERE sernr = it_intab-child_sernr
      AND matnr = it_intab-child_matnr.

  IF sy-subrc <> 0.
    CLEAR it_errtab.
    MOVE-CORRESPONDING it_intab TO it_errtab.
    MOVE 'Error selecting child from EQUI' TO it_errtab-error.
    APPEND it_errtab.
    ADD 1 TO g_num_errors.
    CONTINUE.
  ENDIF.

* Get Parent Equipment Number
  SELECT SINGLE equnr INTO l_data_install-supequi
    FROM equi
    WHERE sernr = it_intab-parent_sernr
      AND matnr = it_intab-parent_matnr.

  IF sy-subrc <> 0.
    CLEAR it_errtab.
    MOVE-CORRESPONDING it_intab TO it_errtab.
    MOVE 'Error selecting parent from EQUI' TO it_errtab-error.
    APPEND it_errtab.
    ADD 1 TO g_num_errors.
    CONTINUE.
  ENDIF.

  CALL FUNCTION 'BAPI_EQUI_INSTALL'
    EXPORTING
      equipment    = l_equipment
      data_install = l_data_install
    IMPORTING
      return       = l_bapiret2.

* Check l_bapiret2 to see if everything worked OK
  IF l_bapiret2-type = 'S' OR l_bapiret2-type = ' '.
    ADD 1 TO g_num_recs_proc.

* Commit Work
    CALL FUNCTION 'BAPI_TRANSACTION_COMMIT'
      EXPORTING
        wait = 'X'.
  ELSE.
    CLEAR it_errtab.
    MOVE-CORRESPONDING it_intab TO it_errtab.
    MOVE l_bapiret2-message(80) TO it_errtab-error.
    APPEND it_errtab.
    ADD 1 TO g_num_errors.
  ENDIF.

ENDLOOP.

1 ACCEPTED SOLUTION

former_member181962
Active Contributor

Remove your SELECTs inside the loop and use FOR ALL ENTRIES instead.

Use individual MOVE statements in place of MOVE-CORRESPONDING.

Regards,

ravi
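
(For the MOVE-CORRESPONDING point, a minimal sketch - every it_errtab field name except -error is an assumption for illustration, since only -error appears in the original program; the FOR ALL ENTRIES rewrite is spelled out in Srikanth's reply below:)

* Individual assignments in place of MOVE-CORRESPONDING,
* as suggested above (field names assumed):
it_errtab-child_sernr  = it_intab-child_sernr.
it_errtab-child_matnr  = it_intab-child_matnr.
it_errtab-parent_sernr = it_intab-parent_sernr.
it_errtab-parent_matnr = it_intab-parent_matnr.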

9 REPLIES



Hi,

Try to avoid SELECT inside the loop: use FOR ALL ENTRIES outside the loop, then use a READ statement to get the data.

Regards

vijay

Former Member

The main performance problem is the SELECT statements inside the loop, so as you increase the number of records in the input file, the processing time increases.

So write all SELECTs before the loop and use READ TABLE to read the particular contents from an internal table.

For that, define an internal table with the required fields.

For example:

data: begin of it_equi occurs 0,
        sernr type equi-sernr,
        matnr type equi-matnr,
        equnr type equi-equnr,
      end of it_equi.

Above, I defined the internal table IT_EQUI, assuming SERNR and MATNR are the key. If you have any other key fields, add them to the definition.

*--This SELECT is for the child equipment numbers
if it_intab[] is not initial.

  SELECT sernr
         matnr
         equnr INTO table it_equi
    for all entries in it_intab
    FROM equi
    WHERE sernr = it_intab-child_sernr
      AND matnr = it_intab-child_matnr.

*--This SELECT is for the parent equipment numbers
  SELECT sernr
         matnr
         equnr
    APPENDING table it_equi
    for all entries in it_intab
    FROM equi
    WHERE sernr = it_intab-parent_sernr
      AND matnr = it_intab-parent_matnr.

* sort for the BINARY SEARCH in the loop below
  sort it_equi by sernr matnr.

endif.

LOOP AT it_intab.

* Get Child Equipment Number
* SELECT SINGLE equnr INTO l_equipment
*   FROM equi
*   WHERE sernr = it_intab-child_sernr
*   AND matnr = it_intab-child_matnr.

*--Instead of the above SELECT SINGLE, use READ TABLE IT_EQUI to find the record:
  READ TABLE it_equi WITH KEY
      sernr = it_intab-child_sernr
      matnr = it_intab-child_matnr
    BINARY SEARCH.

  IF sy-subrc <> 0.
    CLEAR it_errtab.
    MOVE-CORRESPONDING it_intab TO it_errtab.
    MOVE 'Error selecting child from EQUI' TO it_errtab-error.
    APPEND it_errtab.
    ADD 1 TO g_num_errors.
    CONTINUE.
  ENDIF.

* take the child equipment number from the header line for the BAPI call
  l_equipment = it_equi-equnr.

*--Replace the second SELECT in the loop the same way: use the same READ TABLE as above, but with the parent serial and material number against the same it_equi.
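
(A sketch of what that parent lookup could look like, reusing the names from the original program:)

  READ TABLE it_equi WITH KEY
      sernr = it_intab-parent_sernr
      matnr = it_intab-parent_matnr
    BINARY SEARCH.

  IF sy-subrc <> 0.
* handle the error exactly as for the child lookup
    CONTINUE.
  ENDIF.

* pass the parent equipment number to the BAPI, as in the original program
  l_data_install-supequi = it_equi-equnr.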

ENDLOOP.

Now you will see the difference in execution time.

Regards,

Srikanth.

Added internal table definition & BINARY SEARCH in READ TABLE.

Message was edited by: Srikanth Kidambi


Former Member

NEVER, NEVER, NEVER, NEVER code a SELECT statement inside a loop...

Former Member

Hey,

One more thing you could do is commit after every 100 records, instead of calling the commit BAPI for each record in the internal table.

-Kiran
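
(A rough sketch of that batching - the counter g_commit_count is an illustrative name:)

* inside the loop, in place of the per-record commit:
  ADD 1 TO g_commit_count.
  IF g_commit_count >= 100.
    CALL FUNCTION 'BAPI_TRANSACTION_COMMIT'
      EXPORTING
        wait = 'X'.
    CLEAR g_commit_count.
  ENDIF.

* after ENDLOOP, commit whatever remains of the last batch:
  CALL FUNCTION 'BAPI_TRANSACTION_COMMIT'
    EXPORTING
      wait = 'X'.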


Hi JJ,

you might avoid some SELECT SINGLEs, but I guess the main runtime is in the BAPI - which needs a COMMIT WORK every time.

But in general: 300,000 entries in 21 hours (about 75,600 seconds) comes to roughly 0.25 seconds per booking (if you don't have too many errors) - that's not so slow. Maybe you get a factor of 2, but I wouldn't expect too much.

You can still use parallel execution - but that's just spreading the same runtime over several processes, so that you don't have to wait as long.

Regards,

Christian
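
(A rough sketch of one way to do that - Z_INSTALL_CHUNK is a hypothetical RFC-enabled function module that would wrap the install loop for one slice of the input file, and it_chunk is an assumed internal table holding that slice:)

DATA: l_task(8) TYPE c.

* dispatch one slice to its own dialog work process via asynchronous RFC;
* repeat with a new task name for each slice of it_intab
l_task = 'INST0001'.
CALL FUNCTION 'Z_INSTALL_CHUNK'
  STARTING NEW TASK l_task
  DESTINATION IN GROUP DEFAULT
  TABLES
    it_chunk = it_chunk.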


I agree with Christian. I don't think there's much you can do to improve the performance. I would add that if you do submit this multiple times in parallel, you will likely run into locking problems.

Rob


I am sure that all of the time is being consumed by the BAPI, and for a BAPI you must have the commit for every record.

I am moving the SELECT SINGLEs out of the loop as suggested, but I am not expecting great things.

Thanks for all the input.

J.J

MeghaSharma
Newcomer

Can anyone suggest whether we can overcome this scenario using OData, and how to achieve that?