‎2007 Sep 05 2:31 PM
Hi,
I want to enter single values into a Select-Options field on the selection screen.
I sometimes need to enter more than 20,000 entries as single values.
But when I run the program with more than 6K single values, I get a dump.
Can you please help me remove this limitation?
Points for everyone...
‎2007 Sep 05 2:40 PM
Does the dump appear at the time of an SQL read? If so, you need to use the FOR ALL ENTRIES addition.
I do not know if you can do that with a READ, so you will have to use an internal table.
‎2007 Sep 05 2:45 PM
Hi Grame,
1. 20K values... that's too high.
2. R/3 constructs the SQL for select-options using
field1 = value1 OR field1 = value2
field1 IN (value1, value2)
etc.
3. The SQL statement will be very long if the number of values is high, and hence the error.
4. Use FOR ALL ENTRIES instead.
regards,
amit m.
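The point about SQL length can be made concrete with a small sketch. This is Python rather than ABAP, purely as an illustration; the table and field names (`vbak`, `vbeln`) are made up, and the real statement is generated by the database interface, not by your code:

```python
def in_list_sql(field, values):
    """Build the kind of IN-list predicate the database interface
    generates for a select-option made up only of single values."""
    quoted = ", ".join("'%s'" % v for v in values)
    return "SELECT * FROM vbak WHERE %s IN ( %s )" % (field, quoted)

# Each 10-character value contributes roughly 14 bytes (quotes, comma,
# space), so 20,000 single values push the statement past ~280 KB --
# far beyond what can be shipped to the database in one SELECT.
stmt = in_list_sql("vbeln", ["%010d" % i for i in range(20000)])
print(len(stmt))
```

With FOR ALL ENTRIES, the database interface instead splits the driver table into small packages itself, which is why it avoids the dump.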
‎2007 Sep 05 2:55 PM
Actually I am building a wrapper program which passes single values to the selection screen of the standard transaction CFM1.
I don't think I can pass them using any other method, and I don't think this works in the background.
Are there any Basis settings for this??
‎2007 Sep 05 2:47 PM
Hi,
The best way to supply more entries is to upload them from a file, either on the presentation server or the application server.
It is not possible to use that many entries on the selection screen.
The DB has its own limitations.
Regards,
Viswanath Babu
‎2007 Sep 05 2:48 PM
The internal tables in SAP have some limitations,
so you cannot give more than a few thousand records at a time as input.
If you can give the input as ranges and execute the report in the background, then your report should work fine.
‎2007 Sep 05 3:07 PM
The 6K limit you are talking about is not a limitation of Select-Options or of Ranges tables. It is a limit on the amount of SQL that can be sent to the database from the application server in a single SELECT statement.
The simplest fix is to define a Ranges table with the same structure as your select-options, and then:
loop at sel_opt into wa_sel_opt.
  w_count = w_count + 1.
  append wa_sel_opt to range_tab.
  if w_count > 500. "<< Choose an appropriate number to stay below the 6K limit
    select field1 field2 "<< fill in field names
      from table
      appending table it_result
      where field in range_tab.
    clear: range_tab[], w_count.
  endif.
endloop.
if w_count > 0. "<< Get the last few records
  select field1 field2 "<< fill in field names
    from table
    appending table it_result
    where field in range_tab.
  clear: range_tab[], w_count.
endif.
This should read the data quite efficiently if the field(s) in the WHERE clause match index / key fields. Effectively you are breaking the 20,000 entries up into sets of approximately 500 (assuming a 10-character key, this gives 5KB+ of SQL each time; longer fields = fewer keys per batch).
The other option is to convert the list of single entries (I EQ rows in the select-options table) into between groups (I BT). If the 20,000 values contain long sequential runs then you can code this easily, but with gaps it gets messy.
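The singles-to-betweens conversion is a simple run-merging pass. A minimal sketch in Python (illustrative only; in ABAP you would sort the range table and emit `I BT` rows instead of tuples, and the keys would typically be character fields rather than integers):

```python
def singles_to_ranges(values):
    """Collapse a list of integer keys into (low, high) intervals,
    mimicking turning 'I EQ' single-value rows into 'I BT' rows."""
    ranges = []
    for v in sorted(set(values)):
        if ranges and v == ranges[-1][1] + 1:
            ranges[-1] = (ranges[-1][0], v)   # extend the current run
        else:
            ranges.append((v, v))             # start a new run
    return ranges

print(singles_to_ranges([1, 2, 3, 7, 8, 10]))  # [(1, 3), (7, 8), (10, 10)]
```

If the data has long sequential runs, the output is a handful of BT rows; with many gaps you end up with nearly as many rows as you started with, which is the messy case mentioned above.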
Andrew
‎2007 Sep 05 3:44 PM
Hi Andrew,
Thanks for the code, but it doesn't help me much because I finally have to submit these values to a standard transaction.
I had another idea: split the values into multiple batches of 6K and process them in background mode.
"convert list of single entries (I EQ in select-options table) into between groups (I BT)"
I had the same thought, but I would need to write a lot of code to determine whether the values are in sequence or not. Anyway, points for you.
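The batch-splitting idea above can be sketched quickly. This is Python purely as an illustration; in the wrapper program each chunk would fill its own range table and be submitted to CFM1 as a separate background job, and 6000 is the assumed per-job limit, not a fixed SAP constant:

```python
def batches(values, size):
    """Split the full value list into chunks small enough for one
    selection-screen fill / background job each."""
    return [values[i:i + size] for i in range(0, len(values), size)]

jobs = batches(list(range(20000)), 6000)
print([len(j) for j in jobs])  # [6000, 6000, 6000, 2000]
```

Each batch stays under the limit that causes the dump, at the cost of running the transaction several times.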