Application Development and Automation Discussions
Join the discussions or start your own on all things application development, including tools and APIs, programming models, and keeping your skills sharp.

Logical Dbase Performance

former_member426550
Participant

hi fellow programmers,

i developed a payroll program that works fine on the development server.

upon transport to the QAS server, i noticed that the program does not perform well when dealing with a large volume of data.

i also tried running it with minimal data (2K records) on the QAS server and it works fine, since i applied most of the performance tuning guide.

is there any way to prevent this? when i debugged it i found that for 1-3K records the program does well, but at 4K records and above it runs slowly.

my plan is to divide the data into batches of 2K so that from start of execution to finish, the program will run fast.

any other approach? or any suggestions?

thanks for reading.

1 ACCEPTED SOLUTION

MarcinPciak
Active Contributor

Hi,

PY data extraction is a long-running process; if you run it for such a huge population it will take even longer. I usually don't worry about how long it takes (someone else will run it anyway ;) ). Normally the user can provide not the whole population itself but a payroll (PY) area or personnel area instead (so the report would run a couple of times, but faster).

What should be noted here is that you may hit a timeout, as the runtime will be too long. I suggest using FM SAPGUI_PROGRESS_INDICATOR to avoid that situation.
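A minimal sketch of what the progress-indicator call might look like inside the processing loop (the internal table and work area names here are hypothetical, not from the original program):

```abap
DATA: lv_pct  TYPE i,
      lv_text TYPE string.

LOOP AT gt_employees INTO ls_employee.   " hypothetical table of PERNRs
  lv_pct  = sy-tabix * 100 / lines( gt_employees ).
  lv_text = |Processing employee { ls_employee-pernr } ({ lv_pct }%)|.

  " Each call updates the status bar; the GUI roundtrip also
  " resets the dialog timeout counter for long foreground runs.
  CALL FUNCTION 'SAPGUI_PROGRESS_INDICATOR'
    EXPORTING
      percentage = lv_pct
      text       = lv_text.

  " ... actual extraction work for this employee ...
ENDLOOP.
```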

Regards

Marcin

3 REPLIES


former_member426550
Participant

hi...

what i did was create another program and then use SUBMIT in batches:

get pernr.
  add 1 to ctr.
  gt_pernr-sign   = 'I'.   " range rows need sign/option to be usable in SUBMIT
  gt_pernr-option = 'EQ'.
  gt_pernr-low    = p0001-pernr.
  append gt_pernr.
  if ctr eq 30.
    submit payroll with pnppernr in gt_pernr and return.
    clear ctr.
    refresh gt_pernr.      " start the next batch with an empty range
  endif.

by doing this i was able to control the payroll program so it processes only 30 employees per run...

this saved 20-30 percent of the running time..
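one caveat with the batching above: employees left over after the last full batch of 30 are never submitted, so a final flush is needed. A sketch, assuming the same ctr counter, gt_pernr range table, and report name as in the snippet (PNPPERNR being the standard select-option for personnel numbers in the PNP logical database):

```abap
END-OF-SELECTION.
  " Submit the final, incomplete batch (fewer than 30 records).
  IF ctr > 0.
    SUBMIT payroll WITH pnppernr IN gt_pernr AND RETURN.
  ENDIF.
```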


former_member426550
Participant

Solved by using batch processing.