2009 Jul 14 7:10 AM
Hi Experts,
I have a requirement wherein I want some records to be inserted into a database table based on some selection criteria.
Since the number of records to be inserted is huge, I want them to be broken into parallel jobs. E.g., if there are 1000 records, I want to have 10 parallel jobs with 100 records each. I read in the forums that there is a way of doing this with parallel processing, but I am not getting how to implement that.
I will just elaborate on what I am trying to do. Please provide your inputs to clear my doubts.
1. Select the records from the database table into an internal table.
2. Loop at the internal table.
3. Break the internal table into small internal tables.
4. Create an RFC-enabled function module "ZINSERT" which inserts the records into the database table.
5. CALL FUNCTION 'ZINSERT' STARTING NEW TASK '001'
DESTINATION IN GROUP 'parallel_generators'.
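The dispatch loop for steps 3–5 above might be sketched as follows. This is only a sketch under assumptions: lt_records is the internal table from step 1, ZINSERT has a TABLES parameter it_records (a made-up name), the RFC server group 'parallel_generators' exists, and the form on_end_of_task increments a global counter v_done when a task finishes. Note that each task needs a unique task id, not a fixed '001'.

```abap
CONSTANTS: c_chunk TYPE i VALUE 100.

DATA: lt_chunk  LIKE lt_records,
      v_taskid  TYPE char8,
      v_from    TYPE i VALUE 1,
      v_lines   TYPE i,
      v_started TYPE i,
      v_done    TYPE i.

v_lines = lines( lt_records ).

WHILE v_from <= v_lines.
  " Step 3: cut the next chunk of up to 100 rows.
  CLEAR lt_chunk.
  APPEND LINES OF lt_records FROM v_from TO v_from + c_chunk - 1
         TO lt_chunk.
  v_from = v_from + c_chunk.

  v_started = v_started + 1.
  v_taskid  = v_started.          " unique task id per chunk
  CONDENSE v_taskid.

  " Step 5: asynchronous RFC into the server group.
  CALL FUNCTION 'ZINSERT'
    STARTING NEW TASK v_taskid
    DESTINATION IN GROUP 'parallel_generators'
    PERFORMING on_end_of_task ON END OF TASK
    TABLES
      it_records = lt_chunk
    EXCEPTIONS
      system_failure        = 1
      communication_failure = 2
      resource_failure      = 3.
  IF sy-subrc <> 0.
    " No free work process or RFC error: this chunk was never started,
    " so handle it locally (e.g. plain INSERT) or retry later.
    v_started = v_started - 1.
  ENDIF.
ENDWHILE.

" Block until every callback has incremented v_done.
WAIT UNTIL v_done >= v_started.
```

The WAIT UNTIL at the end is what lets the caller know all parallel tasks have finished, which also answers the second doubt below.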
Now my doubt is: suppose I have 10 small internal tables, so 10 parallel sessions will be started. But if 5 sessions have some problem with the database insertion, how will I manage to get those records back?
Secondly, how will I come to know that all the parallel processes have ended?
Thirdly, are there some other function modules which need to be called before parallel processing?
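On these three doubts, one common pattern is: have ZINSERT return the rows it could not insert through a TABLES parameter, collect them in the ON END OF TASK callback, and count finished tasks so a WAIT UNTIL condition knows when all tasks are done. The standard function module SPBT_INITIALIZE can be called once beforehand to initialize the server group. A hedged sketch (the parameter names et_failed, gt_failed and gt_failed_tasks are assumptions, not a standard API):

```abap
" Optional setup before dispatching: initialize the RFC server group.
CALL FUNCTION 'SPBT_INITIALIZE'
  EXPORTING
    group_name = 'parallel_generators'
  EXCEPTIONS
    OTHERS     = 1.

" Callback executed in the caller's session when a task ends.
" RECEIVE RESULTS FROM FUNCTION must name the same FM that was started.
FORM on_end_of_task USING p_taskid TYPE clike.
  DATA: lt_failed LIKE gt_failed.   " rows ZINSERT could not insert

  RECEIVE RESULTS FROM FUNCTION 'ZINSERT'
    TABLES
      et_failed = lt_failed
    EXCEPTIONS
      system_failure        = 1
      communication_failure = 2.
  IF sy-subrc <> 0.
    " The whole task failed: remember its id so the chunk can be re-run.
    APPEND p_taskid TO gt_failed_tasks.
  ELSE.
    " Collect single-row failures reported by ZINSERT itself.
    APPEND LINES OF lt_failed TO gt_failed.
  ENDIF.

  v_done = v_done + 1.              " lets the WAIT UNTIL condition proceed
ENDFORM.
```

After the WAIT UNTIL in the dispatcher, gt_failed and gt_failed_tasks hold everything that needs reprocessing.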
Please provide some input.
Help will be appreciated
Best regards
Sourabh verma
2009 Jul 14 7:21 AM
Processing on the same table won't be possible, I guess, because the first set of records in the internal table takes a lock on the database table, and the other processes can't update the same table.
It's better to update the database table in small chunks in a loop.
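The chunked sequential alternative might look like this sketch, where ztable and the 100-row chunk size are placeholders; each COMMIT WORK releases the database locks before the next chunk starts:

```abap
DATA: lt_chunk LIKE lt_records,
      wa       LIKE LINE OF lt_records.

LOOP AT lt_records INTO wa.
  APPEND wa TO lt_chunk.
  IF lines( lt_chunk ) >= 100.
    " Insert one chunk and commit so locks are held only briefly.
    INSERT ztable FROM TABLE lt_chunk ACCEPTING DUPLICATE KEYS.
    COMMIT WORK.
    CLEAR lt_chunk.
  ENDIF.
ENDLOOP.

IF lt_chunk IS NOT INITIAL.         " remainder smaller than one chunk
  INSERT ztable FROM TABLE lt_chunk ACCEPTING DUPLICATE KEYS.
  COMMIT WORK.
ENDIF.
```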
warm regards,
Sumanth
2009 Jul 14 7:34 AM
Hi Sumanth,
Mine is a customised database table; I have not created any lock objects for it. I simply want to insert records into it.
2009 Oct 02 9:44 PM
Hello,
I have a similar problem, but with retrieval from the DB. If one of the RFCs fails, how is the error handling done? Could you please elaborate on your solution?
Thanks,
Minhaj.