
How to improve performance to insert large amount of data to a table?

Former Member

There is a table mytab with a primary key (3 columns) and one index on a single column.

A stored procedure (SP) is created with a cursor that inserts data into this table in the following steps:

truncate the data in mytab

insert data into mytab one row at a time with a cursor.

When the SP completes, the data and index in mytab total about 3 GB.

It is very slow: it takes almost a minute to insert about 1000 rows into mytab.

Without changing the code of the SP, how can performance be improved in this case?

Would dropping the index before inserting the data improve performance?

Accepted Solutions (0)

Answers (3)


Active Contributor

Dropping the index would reduce the amount of I/O that has to be done and so improve performance, but I don't think it would help much.  I don't think the insert itself is what is slowing things down.

In general, are you really sure you can't rewrite the proc?

If you can't change the proc, you are left with trying to make it run more efficiently.  So look at the performance of the queries in the proc and turn on query diagnostics.  Are they using indexes?  Are statistics up to date?  Would an index on a different set of columns be better?
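A minimal sketch of the diagnostics suggested above, assuming a Sybase/SAP ASE server (the `set delayed_commit on` mention elsewhere in this thread suggests ASE) and a hypothetical proc name `my_insert_proc`:

```sql
-- Show the query plan ASE picks for each statement in the proc
set showplan on
-- Report logical/physical I/O per statement
set statistics io on
go

-- Refresh optimizer statistics on mytab and its indexes
update statistics mytab
go

-- Run the proc and inspect the plans/I/O it produces
exec my_insert_proc   -- hypothetical name for the poster's SP
go
```

The output shows whether the queries inside the proc use the existing indexes or fall back to table scans, which is what the advice above asks you to verify.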


Former Member

I guess the issue is inserting one row at a time. Can't you rewrite the cursor as a set-based insert/select statement? If not, then try committing every, say, 1,000 or 10,000 or 100K rows instead of every row. This should improve performance.
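A sketch of the batched-commit idea inside an ASE cursor loop; the cursor, variable, and column names are hypothetical, and `@@sqlstatus` is ASE's fetch-status global:

```sql
declare @cnt int
select @cnt = 0

begin tran
fetch mycur into @v1, @v2, @v3
while (@@sqlstatus = 0)
begin
    insert into mytab (c1, c2, c3) values (@v1, @v2, @v3)
    select @cnt = @cnt + 1

    -- Commit every 1000 rows instead of every row,
    -- then immediately open a new transaction
    if @cnt % 1000 = 0
    begin
        commit tran
        begin tran
    end

    fetch mycur into @v1, @v2, @v3
end
commit tran
```

Larger batches mean fewer log flushes, at the cost of losing more work if the proc fails mid-batch.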

You might want to drop the index before the insert and rebuild it afterwards as well.
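Since the SP itself can't be changed, the drop/rebuild can wrap the call from outside. A sketch, assuming a hypothetical nonclustered index `mytab_col_idx` on a single column `some_col` (the primary key index should stay in place):

```sql
-- Drop the single-column index so the bulk insert doesn't maintain it
drop index mytab.mytab_col_idx
go

exec my_insert_proc   -- hypothetical name for the poster's SP
go

-- Rebuild the index once the data is loaded
create index mytab_col_idx on mytab (some_col)
go
```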


Maybe use "set delayed_commit on"?

I've used it a couple times, and it does seem to help.
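Usage is just a session option set before running the proc (the proc name is hypothetical). With `delayed_commit` on, ASE returns from a commit without waiting for the log write to reach disk, so the most recent transactions can be lost on a crash:

```sql
set delayed_commit on
go

exec my_insert_proc   -- hypothetical name for the poster's SP
go

set delayed_commit off
go
```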

There are a few things to note about this, described here: