2015 Jan 28 10:42 AM
Dear Gurus,
I have created one simple program which reads a file from the application server, but the file has 800,000 (8 lakh) entries.
Execution time for this program in batch is more than 15 hours.
Could you please tell me if there is a better option so that I can reduce the processing time?
I have checked the code, and there is no possibility for further optimization in ABAP.
BR
Ashish Arora
Moderator message: Please use internationally acceptable number systems.
Message was edited by: Suhas Saha
2015 Jan 28 10:53 AM
Hi Ashish
Could you please explain what "8 lac entries" means.
How many records are you processing when this program runs?
It would not hurt to include your code if you wish for the SDN community to be able to assist you better.
Regards
Arden
2015 Jan 28 10:57 AM
Hi Arden,
There are 800,000 entries.
Due to client restrictions, I cannot upload the code.
My program is simple; there is no inner loop and no SELECT statement inside a loop.
Please help.
BR
Ashish Arora
2015 Jan 28 11:05 AM
In your performance analysis, does the long processing time come from the DATASET operations or from execution of the code that follows? (800,000 records is not so much, but 800,000 calls of a BAPI may be.)
Also, if you wrote
"there is no possibility for further optimization in ABAP"
you are no longer in a good place, are you?
Regards,
Raymond
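To act on Raymond's suggestion, you can time the read phase separately from the processing with GET RUN TIME. This is only a sketch; lv_file and lt_data are placeholder names, not from your program:

```abap
DATA: lv_file TYPE string VALUE '/tmp/input.txt',  " placeholder path
      lv_line TYPE string,
      lt_data TYPE TABLE OF string,
      lv_t0   TYPE i,
      lv_t1   TYPE i.

GET RUN TIME FIELD lv_t0.

OPEN DATASET lv_file FOR INPUT IN TEXT MODE ENCODING DEFAULT.
DO.
  READ DATASET lv_file INTO lv_line.
  IF sy-subrc <> 0.
    EXIT.
  ENDIF.
  APPEND lv_line TO lt_data.
ENDDO.
CLOSE DATASET lv_file.

GET RUN TIME FIELD lv_t1.
WRITE: / 'Read phase in microseconds:', lv_t1 - lv_t0.
```

If the read phase is fast, the bottleneck is in whatever runs per record afterwards.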
2015 Jan 28 11:19 AM
In addition to Raymond's comments:
Are you outputting the 800,000 entries to database table(s)? If so, what is the frequency of the commit cycles you are running?
Are you able to adjust memory consumption as entries are processed?
Regards
Arden
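If the program does update database tables, the commit cycle Arden asks about can be controlled explicitly. A sketch only; ztarget and the 10,000-record interval are illustrative, not from the original program:

```abap
DATA: ls_out   TYPE ztarget,   " hypothetical target table's row type
      lv_count TYPE i.

LOOP AT lt_out INTO ls_out.
  INSERT ztarget FROM ls_out.
  lv_count = lv_count + 1.
  IF lv_count MOD 10000 = 0.
    COMMIT WORK.               " commit in packages instead of per record
  ENDIF.
ENDLOOP.
COMMIT WORK.                   " final commit for the remaining records
```

Committing in packages keeps locks and log buffers from growing for 15 hours, while committing per record would add round-trip overhead 800,000 times.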
2015 Jan 28 1:01 PM
I have to generate another file from the incoming file. I haven't used a COMMIT statement in the code.
2015 Jan 28 11:24 AM
Hi,
You don't need to post your original code. Just describe the logic of how you handled the file, so that it will be easier for us to help you.
2015 Jan 28 11:29 AM
Hello.
I can't see your code, so I will just assume.
Assuming you are looping over an internal table to write to the dataset, try LOOP AT itab ASSIGNING <fs_wa>. Field symbols reference the table row directly and are much more efficient than header lines and work areas, which copy each row.
Just a tip.
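A minimal sketch of that tip, assuming the records are held in an internal table of strings (lt_data and lv_outfile are placeholder names):

```abap
DATA: lv_outfile TYPE string VALUE '/tmp/output.txt',  " placeholder path
      lt_data    TYPE TABLE OF string.

FIELD-SYMBOLS <fs_wa> TYPE string.

OPEN DATASET lv_outfile FOR OUTPUT IN TEXT MODE ENCODING DEFAULT.
LOOP AT lt_data ASSIGNING <fs_wa>.
  " <fs_wa> points at the table row itself - no copy into a work area
  TRANSFER <fs_wa> TO lv_outfile.
ENDLOOP.
CLOSE DATASET lv_outfile.
```

The gain from ASSIGNING grows with the width of the row type; for narrow string lines it is modest, for wide structures it can matter.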
2015 Jan 28 1:01 PM
Hi,
Fetching a few lakh records won't take much time; also check whether the query is selecting on the key fields.
Regards,
Vinodkumar.
2015 Jan 28 6:25 PM
HI Ashish,
You must be reading the input file data into an internal table using READ DATASET.
Check your program for table/transaction updates done inside the loop.
Add validations so that unnecessary records are skipped.
Regards,
Pravin
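Pravin's point about validations can be sketched like this: reject records as early as possible, before any expensive processing runs for them. The field name status and the routine process_record are hypothetical placeholders:

```abap
DO.
  READ DATASET lv_infile INTO ls_rec.
  IF sy-subrc <> 0.
    EXIT.                       " end of file reached
  ENDIF.
  " validate first: skip unwanted records before any heavy work
  IF ls_rec-status IS INITIAL.
    CONTINUE.
  ENDIF.
  PERFORM process_record USING ls_rec.
ENDDO.
```

Even a cheap check saves time when it prevents 800,000 expensive calls from running on records that would be discarded anyway.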
2015 Jan 28 7:03 PM
You say that you have "checked the code", but you don't say how. Did you do a runtime analysis? That will show you where the bottleneck is.
Reading 800,000 records will not take 15 hours.
Rob
2015 Jan 29 6:18 AM
Hi,
You may need a performance trace.
Please try the tools listed here: ABAP Test and Analysis Tools - ABAP Development - SCN Wiki
(note: transaction SAT replaces SE30 from release 7.02)
Then locate the code that causes the long running time.
Be open-minded; there is always an alternative to make it better.
Good luck.
Message was edited by: Kok Wei Wong