Application Development and Automation Discussions
Join the discussions or start your own on all things application development, including tools and APIs, programming models, and keeping your skills sharp.

Program optimization

ashish_arora6
Participant

Dear Gurus,

I have created a simple program which reads a file from the application server, but the file has 800,000 entries.

Execution time for this program in batch is more than 15 hours.

Could you please tell me if there is a better option so that I can reduce the processing time?

I have checked the code, and there is no possibility for further optimization in ABAP.

BR

Ashish Arora

Moderator message: Please use internationally acceptable number systems.

Message was edited by: Suhas Saha

11 REPLIES

Former Member

Hi Ashish

Could you please explain what "8 lac entries" means?

How many records are you processing when this program runs?

It would not hurt to include your code if you wish the SDN community to be able to assist you better.

Regards

Arden


Hi Arden,

There are 800,000 entries.

Due to client restrictions, I cannot upload the code.

My program is simple: there is no inner loop and no SELECT statement inside a loop.
Please help.

BR

Ashish Arora


RaymondGiuseppi
Active Contributor

In your performance analysis, does the long processing time come from the DATASET operations or from the execution of the code that follows? (800,000 records is not that much, but 800,000 calls of a BAPI may be.)
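To separate the two, a minimal timing sketch using GET RUN TIME could look like this (the file path and the processing step are placeholders, not from the original program):

```abap
DATA: lv_file      TYPE string VALUE '/tmp/input.txt', " placeholder path
      lv_line      TYPE string,
      lv_t1        TYPE i,
      lv_t2        TYPE i,
      lv_read_time TYPE i,
      lv_proc_time TYPE i.

OPEN DATASET lv_file FOR INPUT IN TEXT MODE ENCODING DEFAULT.
DO.
  " Accumulate time spent in READ DATASET (microseconds).
  GET RUN TIME FIELD lv_t1.
  READ DATASET lv_file INTO lv_line.
  IF sy-subrc <> 0.
    EXIT.
  ENDIF.
  GET RUN TIME FIELD lv_t2.
  lv_read_time = lv_read_time + lv_t2 - lv_t1.

  " Accumulate time spent processing each line.
  GET RUN TIME FIELD lv_t1.
  " ... processing of lv_line goes here ...
  GET RUN TIME FIELD lv_t2.
  lv_proc_time = lv_proc_time + lv_t2 - lv_t1.
ENDDO.
CLOSE DATASET lv_file.

WRITE: / 'Read time (microseconds):',    lv_read_time,
       / 'Process time (microseconds):', lv_proc_time.
```

If the processing total dominates, the DATASET handling is not the problem.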

Also if you wrote

there is no possiblity for further optimization in ABAP

then you are no longer looking in the right place, are you?

Regards,

Raymond


In addition to Raymond's comments:

Are you outputting the 800,000 entries to database table(s)? If so, what is the frequency of the commit cycles you are running?

Are you able to adjust memory consumption as entries are processed?

Regards

Arden


I have to generate another file from the incoming file. I haven't used a COMMIT statement in the code.


Former Member

Hi,

You don't need to post your original code. Just describe the logic of how you handled the file; that will make it easier for us to help you.
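For reference, the typical read-transform-write pattern for generating one file from another looks roughly like this (the paths and the transform step are placeholders, since the actual logic was not posted):

```abap
DATA: lv_in   TYPE string VALUE '/path/in.txt',  " placeholder path
      lv_out  TYPE string VALUE '/path/out.txt', " placeholder path
      lv_line TYPE string.

OPEN DATASET lv_in  FOR INPUT  IN TEXT MODE ENCODING DEFAULT.
OPEN DATASET lv_out FOR OUTPUT IN TEXT MODE ENCODING DEFAULT.
DO.
  READ DATASET lv_in INTO lv_line.
  IF sy-subrc <> 0.
    EXIT. " end of file
  ENDIF.
  " ... transform lv_line here ...
  TRANSFER lv_line TO lv_out.
ENDDO.
CLOSE DATASET lv_in.
CLOSE DATASET lv_out.
```

If your program matches this shape, whatever sits in the transform step is usually where the time goes.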


giancarla_aguilar
Participant

Hello.

I can't see your code, so I will just assume.

Assuming that you are looping over an internal table to write to the dataset, try LOOP AT itab ASSIGNING <fs_wa>. Field symbols that point into the internal table are more efficient than header lines and work areas, because no line is copied on each pass.

Just a tip.
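To illustrate the tip above, here is a sketch of both variants (the table and its line type are invented for the example):

```abap
TYPES: BEGIN OF ty_line,
         field TYPE string,
       END OF ty_line.
DATA: itab TYPE STANDARD TABLE OF ty_line,
      wa   TYPE ty_line.
FIELD-SYMBOLS <fs_wa> TYPE ty_line.

" Work area: every pass copies the current line into wa.
LOOP AT itab INTO wa.
  " ... use wa ...
ENDLOOP.

" Field symbol: <fs_wa> points directly at the table line, no copy.
LOOP AT itab ASSIGNING <fs_wa>.
  " ... use <fs_wa> ...
ENDLOOP.
```

With 800,000 wide lines, avoiding the copy on every pass can add up.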


vinodkumar_thangavel
Participant

Hi,

Fetching a few lakh records should not take much time. Check whether the query selects based on the key fields.

Regards,

Vinodkumar.


Former Member

Hi Ashish,

You must be reading the input file data into an internal table using READ DATASET.

Check your program for any table or transaction updates done inside the loop.

Add validations so that unnecessary records are skipped.
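As an illustration only (the record structure and the check itself are invented), skipping records early in the loop might look like:

```abap
TYPES: BEGIN OF ty_rec,
         status TYPE char1,
         data   TYPE string,
       END OF ty_rec.
DATA lt_input TYPE STANDARD TABLE OF ty_rec.
FIELD-SYMBOLS <fs_line> TYPE ty_rec.

LOOP AT lt_input ASSIGNING <fs_line>.
  " Placeholder validation: skip records with an empty status
  " before any expensive work is done for them.
  IF <fs_line>-status IS INITIAL.
    CONTINUE.
  ENDIF.
  " ... expensive processing only for records that pass ...
ENDLOOP.
```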

Regards,

Pravin


Former Member

You say that you have "checked the code", but you don't say how. Did you do a runtime analysis? That will show you where the bottleneck is.

Reading 800,000 records will not take 15 hours.

Rob


Former Member

Hi,

You may need a performance trace.

Please try these tools listed here:

Performance (ABAP, database, network)

  • Single Transaction Analysis (ST12). It groups SE30 and ST05 transactions.
  • SQL Performance Trace (ST05)
    • Explain Plan
  • ABAP Runtime Trace (SE30) (note: transaction SAT should replace SE30 from release 7.02)
  • Global Performance Analysis (ST30)

ABAP Test and Analysis Tools - ABAP Development - SCN Wiki

Then locate the code that causes the long run time.


Be open minded, there is always an alternative to make it better.


Good luck.

Message was edited by: Kok Wei Wong