2015 Jun 10 5:10 PM
I need to process 10 lakh (1 million) entries in a program, which takes a lot of time, possibly days together. So I am thinking of splitting the records into smaller batches and then processing them, e.g. dividing the 10 lakh records into batches of 2 lakh within the program and processing them in parallel. Is it better to process them in parallel within the program using the "parallel processing" concept, or to create 5 background jobs from the program and run them in the background? Need advice.
With thanks & regards,
Naveen
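The splitting idea in the question can be sketched language-agnostically. The thread's context is ABAP, so this Python sketch is only an illustration: `process_record` is a made-up stand-in for the real per-record work, and the numbers mirror the question (10 lakh records cut into 2-lakh batches handed to 5 parallel workers).

```python
from concurrent.futures import ProcessPoolExecutor

def process_record(record):
    # placeholder for the real per-record work
    return record * 2

def process_batch(batch):
    # each worker handles one batch of records independently
    return [process_record(r) for r in batch]

def chunk(records, size):
    # split the full record set into fixed-size batches
    return [records[i:i + size] for i in range(0, len(records), size)]

if __name__ == "__main__":
    records = list(range(1_000_000))      # 10 lakh entries
    batches = chunk(records, 200_000)     # 5 batches of 2 lakh each
    with ProcessPoolExecutor(max_workers=5) as pool:
        results = []
        for batch_result in pool.map(process_batch, batches):
            results.extend(batch_result)
    print(len(results))
```

This only works cleanly when the batches are independent of each other, which is exactly the condition the replies below the question keep coming back to.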
2015 Jun 10 5:44 PM
Hi,
I think multiple background jobs of the same program will be the easiest and quickest way, but you need to consider database locks.
Raj Patel
2015 Jun 10 6:11 PM
Naveen,
It depends on your requirement. Do you want the results back from the processing, to write an output or calculate something? If yes, you cannot use background processing; you need to run parallel tasks instead, which give you the option of getting the results back from the function module.
Thanks,
Vikram.M
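Vikram's distinction (parallel tasks can hand results back to the caller; fire-and-forget background jobs cannot, at least not directly) can be sketched as follows. In ABAP this would be asynchronous RFC with a receive step; this Python analogue uses a hypothetical `calculate` function and collects each task's result as it completes:

```python
from concurrent.futures import ThreadPoolExecutor, as_completed

def calculate(chunk_id, records):
    # hypothetical work whose result the caller needs back
    return chunk_id, sum(records)

chunks = {0: [1, 2, 3], 1: [4, 5, 6]}
totals = {}
with ThreadPoolExecutor(max_workers=2) as pool:
    futures = [pool.submit(calculate, cid, recs) for cid, recs in chunks.items()]
    for fut in as_completed(futures):   # collect each result as its task finishes
        cid, total = fut.result()
        totals[cid] = total
print(totals)  # e.g. {0: 6, 1: 15}
```

With background jobs there is no equivalent of `fut.result()`: the caller would have to poll job status and read results from somewhere the jobs wrote them, which is the approach the next reply describes.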
2015 Jun 10 6:36 PM
Hi Naveen,
I also faced the same kind of issue and resolved it using the first option: a wrapper program that calls the original program and collects errors in an application log and an error table, so the failed records can be reprocessed.
1. If the records are independent of each other and you don't need to show output, you can schedule multiple background jobs, sizing the number of records per job based on volume. In this case you can track errors and other information by creating an application log.
2. If collecting the data from the database and then running your processing logic takes a long time, you can put the processing or calculation into a function module and execute it asynchronously on another destination.
Get all the results into a temporary DB table and then update from that table using a proper locking mechanism.
Thanks,
SaiKrishna
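The second part of SaiKrishna's suggestion, parallel workers writing their results into a temporary table under a lock, can be sketched like this. It is a minimal Python analogy, not the ABAP implementation: an in-memory SQLite table stands in for the temp DB table, and a thread lock stands in for a proper database locking mechanism; `process_and_store` and the batch data are made up.

```python
import sqlite3
import threading
from concurrent.futures import ThreadPoolExecutor

# stand-in for the temporary DB table described above
conn = sqlite3.connect(":memory:", check_same_thread=False)
conn.execute("CREATE TABLE temp_results (batch_id INTEGER, total INTEGER)")
db_lock = threading.Lock()  # stand-in for a proper locking mechanism

def process_and_store(batch_id, records):
    total = sum(records)                  # the expensive calculation
    with db_lock:                         # serialize writes to the shared table
        conn.execute("INSERT INTO temp_results VALUES (?, ?)", (batch_id, total))
        conn.commit()

batches = {1: [10, 20], 2: [30, 40]}
with ThreadPoolExecutor(max_workers=2) as pool:
    for bid, recs in batches.items():
        pool.submit(process_and_store, bid, recs)
# leaving the "with" block waits for all workers to finish

rows = dict(conn.execute("SELECT batch_id, total FROM temp_results ORDER BY batch_id"))
print(rows)  # {1: 30, 2: 70}
```

Once all workers have finished, a single final step can read the temp table and apply the consolidated update, which keeps the contention window on the real tables short.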