Application Development and Automation Discussions
Join the discussions or start your own on all things application development, including tools and APIs, programming models, and keeping your skills sharp.

need help with performance

Former Member

Please use a meaningful subject in future

Hi,

I have to create a table with about 10,000 users plus additional data. A process reads the table data and, if something has changed, updates the table.

Most of the data does not change frequently; it stays the same roughly 90% of the time.

What would you suggest? I don't know whether a fully buffered table can help here, because of the table size.

I'd like your suggestions: should I use the table buffer or a shared memory object?

Best regards,

Nina

Edited by: Nina C on May 19, 2009 10:39 PM

Edited by: Matt on May 20, 2009 2:55 PM

1 ACCEPTED SOLUTION

former_member207438
Participant

Why are you anticipating performance problems?

10000 records does not sound like a lot.

I'd suggest you create your table without buffering and test the performance on your development system.

Then, if you have a particular SELECT statement, for example, that runs slowly, you can address that specific performance problem.

16 REPLIES


Hi,

Thanks!

What do you think about single-record buffering or a shared memory object?

By the way, the table can grow to 30,000 rows.

Best regards

Nina


Former Member

Hi,

If the table is accessed frequently, it is better to use buffering; if it is read or updated only rarely, there is no need for buffering.

The size category of the table depends on how the data is updated and how much data is updated in one run.

If a run updates more than 10,000 records, choose size category 2 or 3; if only a few records are updated at a time, size category 0 is enough.

Prabhu


Hi Prabhu,

Thanks!

What do you think about single-record buffering or a shared memory object?

By the way, the table can grow to 30,000 rows.

Best regards

Nina



Hi Nina,

I believe a generically buffered table will be the best option here.

You can specify the number of key fields on which the generic buffering is based, which will also improve performance.

And since it is only around 10,000 to 20,000 records, even full buffering should be fine.

Regards,

Siddarth


Hi Siddarth,

Thanks. I read that generic buffering is recommended up to a buffer size of about 10 MB, and the table will hold up to 30,000 records.

1. How can I calculate the table size?

2. About generic buffering: if I understand the SAP help correctly, I need to declare a specific generic area to be buffered, but I don't know which area, because at any given time any record in the table can be read or changed. How do you suggest I do this?

Best regards

Nina


Alright,

In that case single-record buffering will not help you much, since the first access to each new record will still hit the database.

I think it's better to go for full or generic buffering.

To check against the 10 MB limit, work out the size of one row from the fields you have declared and multiply it by the number of records. For typical row widths, 10 MB can accommodate well over 10,000 records, but it still depends on the size of each row.

Either full or generic buffering should be fine for performance.

Regards,

Siddarth
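As a rough sketch of the size check described above (the field widths below are invented for illustration; substitute the actual byte lengths of your own table's fields):

```python
# Rough buffer-size check: bytes per row times record count, compared with
# the ~10 MB guideline mentioned for buffered tables. All field widths here
# are hypothetical placeholders for the real table definition.
field_widths = {
    "mandt": 3,        # client
    "userid": 12,      # user name
    "user_data": 285,  # remaining data fields, combined
}

row_bytes = sum(field_widths.values())  # 300 bytes per record here
records = 30_000                        # upper limit mentioned in the thread
table_bytes = row_bytes * records

limit_bytes = 10 * 1024 * 1024          # 10 MB guideline
print(f"table: {table_bytes / 1024 / 1024:.1f} MB, guideline: 10 MB")
print("within guideline:", table_bytes <= limit_bytes)
```

With these assumed widths the table comes to roughly 8.6 MB, just under the guideline, so the answer depends entirely on the real row width.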


Hi Siddarth,

About generic buffering: if I understand the SAP help correctly, I need to declare a specific generic area to be buffered, but I don't know which area, because at any given time any record in the table can be read or changed. How do you suggest I do this?

Best regards

Nina


It's true that you have to specify a generic area; that area is simply the number of leading key fields you choose.

For example, the table SFLIGHT has four key fields (MANDT, CARRID, CONNID, FLDATE). If I want generic buffering with respect to the carrier ID, then one carrier ID may have many records across CONNID and FLDATE, so when I give CARRID in my SELECT, all records for that carrier ID are loaded into the buffer together.

In that case I would set the key specification to 2, i.e. the buffer is generic on the two leftmost key columns.

Choose the generic key so that it buffers most of the records you need at a given time; the next access to any of those records is then served from the buffer and is faster.

So if you have 5 key fields, set the key specification to 1 or 2, so that most of the records are buffered in one go.

I think going with generic buffering will be better.

Regards,

Siddarth


Former Member

> Most of the data is not changing frequently, like 90% of the time.

90% not changing means 10% changing. Actually, I do not understand what you are counting: rows or executions?

Anyway, a 10% change rate is too high for table buffering, especially for fully buffered tables. For example:

10,000 records, fully buffered, i.e. the first SELECT reads the whole table.

90 records are read from the buffer, then one change invalidates the buffer; all 10,000 records have to be reloaded before the buffer is filled again.

=> With buffering, this is worse than without.

How often is the table changed? How large is your system; do you have several application servers?

What are the statements?

Siegfried
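The arithmetic above can be sketched as a back-of-envelope model (the numbers are the ones from the example; the model assumes each unbuffered read fetches a single row and ignores buffer synchronization across application servers):

```python
# Per invalidation cycle: 90 buffered reads, then one change invalidates the
# whole buffer, so the next read reloads every row from the database.
table_rows = 10_000
reads_per_cycle = 90   # reads served from the buffer before a change

# Fully buffered: one full table reload per cycle.
rows_loaded_buffered = table_rows

# Unbuffered: each read fetches only the row it needs (assume one row each).
rows_loaded_unbuffered = reads_per_cycle * 1

print("rows loaded per cycle, buffered:  ", rows_loaded_buffered)
print("rows loaded per cycle, unbuffered:", rows_loaded_unbuffered)
print("buffering is worse here:", rows_loaded_buffered > rows_loaded_unbuffered)
```

Under these assumptions the buffered setup moves over a hundred times more rows from the database per cycle than the unbuffered one, which is the point of the example.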


matt
Active Contributor

Please use a meaningful subject in future