2008 Nov 10 7:12 PM
I have a table with 20,000 to 60,000 records. Using a SELECT UP TO n ROWS statement, how can I select 1000 records at a time, process the data, then grab the next 1000?
2008 Nov 10 7:14 PM
I guess that's not a performance-oriented approach.
Grab all the records into one internal table first and then process them in batches.
2008 Nov 10 7:16 PM
Even if you use SELECT UP TO 1000 ROWS, all the records matching the WHERE condition are retrieved into the buffer, but only 1000 of them are fetched from it.
Better not to use SELECT UP TO in this case.
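If you really want the database to hand rows back in chunks, the PACKAGE SIZE addition of SELECT is worth a look. A minimal sketch, assuming a table named ztable and a batch size of 1000 (both placeholders):

data lt_batch type standard table of ztable.

select * from ztable
       into table lt_batch
       package size 1000.
  " lt_batch holds up to 1000 rows per pass and is
  " overwritten on each pass through the loop
  perform process_batch tables lt_batch.   " hypothetical processing routine
endselect.

One caveat: a database commit inside the SELECT ... ENDSELECT loop (which many BAPIs trigger) closes the open cursor, so this pattern does not mix well with BAPI calls inside the loop.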
2008 Nov 10 7:22 PM
Let me re-try this. Currently I select all the records into an internal table. The problem is that I have to read that table line by line to pass it to a BAPI, which causes a timeout error. How can I break up my itab so it processes the information in batches?
2008 Nov 10 7:30 PM
Try something this way. Note that the rows must actually be appended to itab1 (MOVE-CORRESPONDING alone only fills the header line), and the last partial batch of fewer than 1000 rows has to be submitted after the loop:

loop at itab.
  append itab to itab1.        " collect the row into the batch table
  v_no = v_no + 1.
  if v_no = 1000.
    " job_open ...
    " submit <another report that calls the bapi> and pass itab1
    " job_close ...
    refresh itab1.
    clear v_no.
  endif.
endloop.
if not itab1[] is initial.
  " submit the remaining rows ( < 1000 ) the same way:
  " job_open ... submit ... job_close ...
  refresh itab1.
endif.
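For reference, the job_open / submit / job_close skeleton usually looks like the sketch below. The job name ZBAPI_BATCH, the report name zcall_bapi, and the memory ID are placeholders; passing itab1 to the submitted report via EXPORT TO MEMORY is one common option, assumed here:

data: lv_jobname  type tbtcjob-jobname value 'ZBAPI_BATCH',
      lv_jobcount type tbtcjob-jobcount.

" hand the current batch over to the submitted report
export itab1 to memory id 'ZBAPI_BATCH'.

call function 'JOB_OPEN'
  exporting
    jobname  = lv_jobname
  importing
    jobcount = lv_jobcount.

submit zcall_bapi via job lv_jobname number lv_jobcount and return.

call function 'JOB_CLOSE'
  exporting
    jobcount  = lv_jobcount
    jobname   = lv_jobname
    strtimmed = 'X'.       " start the job immediately

The submitted report then does IMPORT itab1 FROM MEMORY ID 'ZBAPI_BATCH' and calls the BAPI for each row.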
2008 Nov 10 7:44 PM
"The problem is that I have to read that table line by line to pass it to a BAPI, causing a timeout error."
In the READ TABLE, use the BINARY SEARCH addition for sure (after sorting the table by the key you read with). It turns the lookup from a linear scan into a logarithmic search, which makes a big difference on large tables.
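A minimal sketch of the idea; the key field matnr and the variable lv_matnr are placeholders for whatever key you actually read with:

sort itab by matnr.                 " BINARY SEARCH requires a sorted table

read table itab with key matnr = lv_matnr binary search.
if sy-subrc = 0.
  " found: the itab header line now holds the matching row
endif.

Alternatively, declaring itab as a SORTED TABLE with an appropriate key gives you the same logarithmic lookup without an explicit SORT.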
2008 Nov 10 8:12 PM