Application Development and Automation Discussions
Join the discussions or start your own on all things application development, including tools and APIs, programming models, and keeping your skills sharp.
Do Runtime Errors Decrease System Performance?

Former Member

Dear all,

I have a situation wherein there are approximately 1000 runtime errors (short dumps) in the production environment on a daily basis.

I would like to know if these runtime errors decrease system performance. If yes, how?

Best Regards,

Vikas

1 ACCEPTED SOLUTION

Former Member

They will definitely cause problems for the process that triggers the error.

For all other processes they will have no effect, except maybe in some very special cases.

You will of course waste performance if you run everything in dialog first, finish part of the tasks while others dump with a time-out, and then reschedule the dumped ones as batch jobs. These dumps should be avoided.

4 REPLIES

HermannGahm
Product and Topic Expert

Hi,

1000 dumps per day in a productive system is clearly too much.

Whether it affects performance, and to what degree, depends on the type of the dump.

I look at it from this perspective:

Every program consumes (more or less) resources. If the program does not produce a result from a business perspective because it dumps, the consumed resources are wasted. Wasted resources can be a problem at system level (affecting other programs).

One example I have seen recently:

The TIME_OUT short dump occurred very frequently (1009 times per week). The time-out limit was 20 minutes (1200 seconds).

Almost all of the dumps were caused by a program consuming CPU on the database server. This program was run by many users in parallel and aborted before producing a meaningful result. 1200 seconds of wasted database CPU time multiplied by 1009 dumps equals ~336 hours of database CPU time per week. That is 48 hours of database CPU time per day, i.e. 2 database CPUs occupied around the clock running this program until it hits the time-out, with no relevant results. Now you could put a price tag on the CPUs...
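As a sanity check, the arithmetic above can be sketched in a few lines. This is only a back-of-the-envelope estimate using the numbers from the example; the variable names are illustrative and not taken from any SAP tool.

```python
# Estimate of database CPU time wasted by TIME_OUT short dumps,
# using the figures from the example above.
TIMEOUT_SECONDS = 1200   # dialog time-out limit (20 minutes)
DUMPS_PER_WEEK = 1009    # observed TIME_OUT dumps per week

wasted_seconds_per_week = TIMEOUT_SECONDS * DUMPS_PER_WEEK
wasted_hours_per_week = wasted_seconds_per_week / 3600  # ~336 h/week
wasted_hours_per_day = wasted_hours_per_week / 7        # ~48 h/day
cpus_busy_all_day = wasted_hours_per_day / 24           # ~2 CPUs

print(f"~{wasted_hours_per_week:.0f} h CPU/week wasted, "
      f"~{wasted_hours_per_day:.0f} h/day, "
      f"~{cpus_busy_all_day:.0f} CPUs occupied around the clock")
```

Running this reproduces the figures quoted above: roughly 336 hours per week, 48 hours per day, about 2 CPUs fully occupied.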

Other examples:

Memory dumps (e.g. TSV_TNEW_PAGE_ALLOC_FAILED) are bad for three possible reasons: the memory consumption on the application server, possibly pinning a work process to a user (PRIV mode), and possibly a big database selection displacing other users' data in the database cache.

But even dumps that do not look so bad at first sight can mean a waste of resources.

In the end: so many dumps are a serious quality issue that can clearly affect system performance as well.

Kind regards,

Hermann


Thanks for answering this question with a comprehensive example.


ravi_lanjewar
Contributor

Hi,


I would like to know if these runtime errors decrease system performance. If yes, how?

What does a user generally do when an error occurs?

When a program raises a runtime error, the user typically executes the same program again and again to get a result. Each run reads data and then terminates in the middle of execution, so repeatedly running the same program puts load on the database by reading the same data again and again.

Also, when an error occurs, a system routine runs to record the error and store its details in the system. That is additional overhead: it takes some processing time and storage.