Application Development Discussions
Join the discussions or start your own on all things application development, including tools and APIs, programming models, and keeping your skills sharp.

Large tables where large numbers of records are frequently accessed: what optimization techniques can we use?

Former Member

Whenever we create a database table, we define what kind of buffering it should have. If buffering is activated for a table, we must choose one of three buffering types.

Single-record buffering is recommended when we access only a few records of a large table. Generic buffering is recommended for language-specific and client-specific tables. Full buffering is recommended for small tables that are read frequently and written rarely.
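
For illustration, a minimal ABAP sketch of how these buffer settings show up in code (the table ZDEMO_TAB and its key field are hypothetical; the buffering type itself is configured in the table's technical settings in SE11, not in ABAP):

```abap
" Hypothetical table ZDEMO_TAB with single-record buffering switched on
" in its technical settings (SE11 -> Technical Settings -> Buffering).
DATA ls_rec TYPE zdemo_tab.

" A SELECT SINGLE with the full key specified is served from the
" single-record buffer once a first access has loaded the record.
SELECT SINGLE * FROM zdemo_tab
  INTO ls_rec
  WHERE keyfield = '4711'.

" A read that must see the latest committed database state can skip
" the buffer explicitly.
SELECT SINGLE * FROM zdemo_tab BYPASSING BUFFER
  INTO ls_rec
  WHERE keyfield = '4711'.
```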

Now, what optimization techniques can we use when we have large tables whose records are accessed very frequently? What type of buffering is recommended, and how can we optimize our database tables in such situations?

1 ACCEPTED SOLUTION

TuncayKaraca
Active Contributor

You can create a secondary index based on your selection-criteria (WHERE clause) fields. It helps the database reach the data faster instead of performing a full table scan.

Check this: Indexes
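
As a sketch of the idea (the table ZORDERS and the index fields are invented for illustration; the secondary index itself is created and activated in SE11 under Indexes):

```abap
" Assumption: secondary index Z01 on table ZORDERS over the fields
" CUSTOMER and ORDER_DATE, created and activated in SE11 -> Indexes.
DATA lt_orders TYPE STANDARD TABLE OF zorders.

" Because the WHERE clause covers the index fields, the optimizer can
" use index Z01 instead of performing a full table scan.
SELECT * FROM zorders
  INTO TABLE lt_orders
  WHERE customer   = '0000100001'
    AND order_date >= '20240101'.
```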

4 REPLIES


Former Member

Hi Bobby,

Don't use any buffering for this type of table. Since it is your own custom table, specify as many of the primary key fields as possible when fetching and working with table entries.

You can also use the table pool concept for this.

Take standard tables such as EKKO or CDPOS as examples.
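
A minimal sketch of that advice, assuming a hypothetical custom table ZMATSTOCK with the key fields MANDT, MATNR, and WERKS:

```abap
" Assumption: custom table ZMATSTOCK with primary key MANDT, MATNR,
" WERKS. Supplying the full key lets the database use the unique
" primary index, so no secondary index or buffering is needed here.
DATA ls_stock TYPE zmatstock.

SELECT SINGLE * FROM zmatstock
  INTO ls_stock
  WHERE matnr = 'MAT-000123'
    AND werks = '1000'.   " client MANDT is added implicitly
```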

Former Member

Hi,

Based on your field selection, first check the secondary indexes already delivered by SAP. Only if none of them satisfies your criteria should you create your own secondary index.

Just a word of caution: creating a secondary index creates an additional database structure in which the records are sorted by the fields you specify in the index. The space this additional structure consumes is roughly the size of the index fields multiplied by the number of records, plus some overhead.
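
As a rough worked example of that space estimate (all numbers below are invented for illustration):

```abap
" Back-of-the-envelope index size: bytes of index fields * number of
" records, plus some overhead. All numbers are assumptions.
CONSTANTS lc_key_bytes TYPE i VALUE 18.        " e.g. MATNR(10) + WERKS(4) + row pointer
CONSTANTS lc_rows      TYPE i VALUE 5000000.   " 5 million records

DATA(lv_raw)   = lc_key_bytes * lc_rows.       " 90,000,000 bytes of key data
DATA(lv_total) = lv_raw + lv_raw / 5.          " plus roughly 20 % overhead

cl_demo_output=>display( |Estimated index size: ~{ lv_total / 1024 / 1024 } MB| ).
" -> roughly 103 MB in this example
```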

Also, ensure your Basis administrator updates the database statistics.

Danish.

Former Member

As memory is cheap today, you want the data to be buffered in the database itself. For example, with Oracle you want a decent-sized buffer cache. Buffering on the database side optimizes both read and write accesses.

Of course, you also want a well-suited index to prevent expensive access paths. Indexes can be compressed to further optimize space usage, and you should deactivate unused indexes on large tables.

Cheers, Michael