‎2010 Jan 20 11:50 PM
Hi,
I am extracting data from a table that has more than 25 million records, without using any WHERE condition.
I need 45,000 records per file, across multiple files.
What is the best way to write the SELECT statement?
Thanks,
fract
‎2010 Jan 21 6:04 AM
Hi ,
You should start by analysing your system and requirements to find the best approach.
Also, run a runtime analysis and an SQL trace of your program, and check which index the SELECT statement is using.
You can try the PACKAGE SIZE addition with the SELECT query.
Also, try to analyse the impact of any changes you make by comparing against previous results.
Search SDN for "performance tuning"; there are many good posts available on this.
Hope this helps you.
Edited by: Harsh Bhalla on Jan 21, 2010 11:35 AM
‎2010 Jan 21 5:54 AM
Hi,
You can also use the PACKAGE SIZE addition to the SELECT query (please see the F1 help on this syntax). You specify the package size, and the system fetches the records in bunches of that size.
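A minimal sketch of that form (the table name <dbtab>, the internal table itab, and the parameter p_pkg are placeholders, not names from the original program):

```abap
select * from <dbtab> into table itab
         package size p_pkg.
* on each pass of this loop, itab holds the next bunch
* of up to p_pkg rows - process/write them here
endselect.
```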
regards,
amit m.
‎2010 Jan 21 5:18 PM
Writing into files is OK up to about 1 million records. Beyond 1 million, it takes much more time...
It writes the first file with records 1 to 45,000. For the second file, it goes back to the SELECT statement, reads 90,000 records, and skips the first 45,000. It does this for every file:
second file: 45,001 to 90,000
third file: 90,001 to 135,000
...
select * from ....
  if sy-dbcnt > last_count_records.
*   write the row into the internal table ....
  endif.
endselect.
Is there any other way to improve performance?
‎2010 Jan 21 6:08 PM
Don't know why you're extracting 25,000,000 records, but PACKAGE SIZE would be my first choice... something like:
types: gtyp_int type i.
data: itab type standard table of <dbtab>.
parameters: p_pkg type i default 45000.

start-of-selection.
  perform myform.

form myform.
  data: file_cnt type gtyp_int.

  select * from <dbtab> into table itab
           package size p_pkg.
    file_cnt = file_cnt + 1.
*   do something with the itab contents
    case file_cnt.
      when 1.
        append lines of itab to .... "or do a table copy
      when 2.
        ....
    endcase.
  endselect.
endform.
‎2010 Jan 22 2:24 PM
Which of the following two is the better method? I have never used OPEN CURSOR and don't know how it works. I heard that with OPEN CURSOR the database is read only once and the result is cached on the application server - is that right?
open cursor ....
(OR)
select * from <dbtab> into table itab
         package size p_pkg.
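For comparison, here is a minimal OPEN CURSOR / FETCH sketch, reusing the <dbtab> and p_pkg placeholders from the earlier post. This is just to show the shape of the syntax; it is not a tuned implementation:

```abap
data: cur  type cursor,
      itab type standard table of <dbtab>.

* open a database cursor for the full-table read;
* WITH HOLD keeps the cursor open across commits
open cursor with hold cur for
  select * from <dbtab>.

do.
* fetch the next bunch of up to p_pkg rows into itab
  fetch next cursor cur into table itab package size p_pkg.
  if sy-subrc <> 0.
    exit. "no more data
  endif.
* process/write the current package to a file here
enddo.

close cursor cur.
```

The cursor keeps one open result set on the database, so the table is scanned once rather than restarted for every file, which is the main difference from re-executing the SELECT each time.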
‎2010 Jan 22 2:39 PM
Rather than saying what you "heard", why don't you just check the documentation?
Rob