2012 Nov 16 6:25 AM
Hi,
As per the requirement, our program needs to update a database table. The user now wants to run multiple jobs (with different store IDs) in parallel with this program. So I have locked the table based on the key fields and issued a COMMIT WORK inside the loop. Can anyone suggest a better solution than committing inside the loop? Please see the code below. Thanks in advance.
LOOP AT pi_data INTO lwa_data.
  CALL FUNCTION 'ENQUEUE_EZ_ZSTOCK_HYBRIS'
    EXPORTING
      mode_zstock_hybris = 'E'
      mandt              = sy-mandt
      article            = lwa_data-article
      store_id           = lwa_data-store_id
    EXCEPTIONS
      foreign_lock       = 1
      system_failure     = 2
      OTHERS             = 3.
  IF sy-subrc IS INITIAL.
*   Modify the table
    MODIFY zstock_hybris FROM lwa_data.
*   Committing inside the loop, as multiple jobs will be executed in parallel
    COMMIT WORK.
*   Unlock the table
    CALL FUNCTION 'DEQUEUE_EZ_ZSTOCK_HYBRIS'
      EXPORTING
        mode_zstock_hybris = 'E'
        mandt              = sy-mandt
        article            = lwa_data-article
        store_id           = lwa_data-store_id.
  ENDIF.
  CLEAR lwa_data.
ENDLOOP.
2012 Nov 16 6:32 AM
Enqueue the table outside of the loop...
Between LOOP and ENDLOOP, collect the data into an internal table...
After ENDLOOP, commit the data to the DB with MODIFY ... FROM TABLE...
Dequeue the table...
Hope this helps
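A minimal sketch of the flow described above, assuming the same lock object EZ_ZSTOCK_HYBRIS and the same table ZSTOCK_HYBRIS as in the original post (the generic lock without ARTICLE/STORE_ID, and the internal table name lt_buffer, are illustrative assumptions):

```abap
* Sketch only -- assumes pi_data is already filled and has the
* line type of ZSTOCK_HYBRIS; adapt names to your own objects.
DATA: lt_buffer TYPE STANDARD TABLE OF zstock_hybris,
      lwa_data  TYPE zstock_hybris.

* Lock once, before the loop (generic lock, no key fields supplied)
CALL FUNCTION 'ENQUEUE_EZ_ZSTOCK_HYBRIS'
  EXPORTING
    mode_zstock_hybris = 'E'
    mandt              = sy-mandt
  EXCEPTIONS
    foreign_lock       = 1
    system_failure     = 2
    OTHERS             = 3.

IF sy-subrc IS INITIAL.
  LOOP AT pi_data INTO lwa_data.
*   Collect only -- no database access inside the loop
    APPEND lwa_data TO lt_buffer.
  ENDLOOP.

* One array operation and one commit after the loop
  MODIFY zstock_hybris FROM TABLE lt_buffer.
  COMMIT WORK.

  CALL FUNCTION 'DEQUEUE_EZ_ZSTOCK_HYBRIS'
    EXPORTING
      mode_zstock_hybris = 'E'
      mandt              = sy-mandt.
ENDIF.
```

The single MODIFY ... FROM TABLE plus one COMMIT WORK keeps the whole run in one database LUW, which is the point of this suggestion.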
2012 Nov 16 6:57 AM
Hi Venkat,
Thanks for your reply. Actually, it was done that way initially. But now the customer wants to execute multiple jobs in parallel, and in each job the table will contain a huge number of records with different store IDs. So we planned to lock the table only for the particular store being updated in that run, and we were wondering whether the commit of one job would cause any problem for the commit of another parallel job.
2012 Nov 16 8:10 AM
Ok, got it Subhajit Pal...
If you don't commit within the loop, you will hold the locks longer and get locking issues...
If you COMMIT only outside the loop, most probably only one record may get committed to the DB...
So committing there will not create any problems for you...
Also, you can make use of the following FMs to do so:
BAPI_TRANSACTION_COMMIT
DB_COMMIT
Hope this helps...
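For reference, BAPI_TRANSACTION_COMMIT is a drop-in alternative to a bare COMMIT WORK inside the loop; its WAIT parameter corresponds to COMMIT WORK AND WAIT, so the next iteration only starts after the update has actually reached the database (a sketch, not a full solution):

```abap
* Inside the loop, after MODIFY zstock_hybris FROM lwa_data:
* commit synchronously so the record is on the DB before the
* lock is released and the next record is processed.
CALL FUNCTION 'BAPI_TRANSACTION_COMMIT'
  EXPORTING
    wait = 'X'.   " wait for the update task to finish
```

Note that committing per record is slower than one commit per run, so this trades throughput for safe parallel execution.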