on 2022 Dec 07 11:39 AM
Can anyone suggest a clever way to do something like this without resorting to dynamic SQL? Please excuse me if I'm just missing something simple.
begin
declare s_id_lst char (30);
--
set s_id_lst = '3,4';
--
select list(foo_nm)
from my_table
where foo_id in (s_id_lst);
end;
Here's an approach using an ARRAY to hold a variable number of IN-list elements. Instead of building an IN clause and using it within a WHERE clause, the UNNEST operator is used to generate a derived table from the array, which is then joined with the real table. (And here sa_rowgenerator() is used to mimic a real table.)
begin
   -- use an array for the desired IN-list values
   declare s_id_arr array of int;
   set s_id_arr = array(3, 4, 5, 8, 10, 98);
   select list(foo_num order by foo_num)
   from
      -- dummy table via sa_rowgenerator()
      (select row_num as foo_num from sa_rowgenerator(1, 100)) SP
      inner join unnest(s_id_arr) as UN(ID) on SP.foo_num = UN.ID;
end;
returns 3,4,5,8,10,98
Now the next question is: How can you set the array contents outside this SQL block?
At least within DBISQL, it's not that difficult, say, via a stored procedure:
create or replace procedure SP_FilterViaArray(s_id_arr array of int)
result (list_of_ids long varchar)
begin
   select list(foo_num order by foo_num)
   from
      -- dummy table via sa_rowgenerator()
      (select row_num as foo_num from sa_rowgenerator(1, 100)) SP
      inner join unnest(s_id_arr) as UN(ID) on SP.foo_num = UN.ID;
end;

-- Use an ARRAY constructor as argument
call SP_FilterViaArray(array(9, 8, 7));
select * from SP_FilterViaArray(array(1, 2, 3));
select * from SP_FilterViaArray(array('3', '4', '5', '8', '100', 101));

-- or even build the argument dynamically via another sa_rowgenerator() call
call SP_FilterViaArray(array(select row_num from sa_rowgenerator(10, 21, 3)));
Wow - Thanks, Volker. At least I don't have to feel sheepish: Simple it wasn't! I've not used arrays before in SQL Anywhere, and had not come across unnest(...) and sa_rowgenerator(...) either. New capabilities for me to consider.
One potential limitation of this solution is that you have to put an upper limit on the 'id' values (100 in your example code). Indeed, your last usage example of SP_FilterViaArray(...) does not return '101' because it is beyond the assumed limit.
Do you have an opinion on the performance of this solution versus just going ahead with a simpler dynamic sql select?
The upper limit of 100 here is just an effect of the "base" table, which I constructed to contain the numbers from 1 to 100, so that's a limit of the sample only. If you use your original table instead, it would only be limited by the datatype of its column and/or the array base type. The array itself is limited to 6.4 million elements according to the docs...
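To make that concrete, here is a minimal sketch of the same ARRAY/UNNEST join applied directly to the my_table, foo_id and foo_nm names from the original question, without any sa_rowgenerator() dummy table (an illustrative, untested adaptation that assumes foo_id is an integer column):

begin
   -- hypothetical ID values to filter on
   declare s_id_arr array of int;
   set s_id_arr = array(3, 4);
   -- join the real table against the unnested array instead of a dummy table
   select list(mt.foo_nm)
   from my_table mt
        inner join unnest(s_id_arr) as UN(ID) on mt.foo_id = UN.ID;
end;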
I can't speak to the actual performance, but turning lists into derived tables and joining them is something we use a lot, and we have never noticed negative performance implications. If anything, I think joins are usually faster than tests against long IN lists, and the SQL Anywhere query engine might perform comparable steps during query optimization.
Certainly very long IN lists seem to be bad news. It's a very cunning approach, Volker!
How about creating a procedure that takes the arguments as a comma-separated long nvarchar and using that in your select statement:
create or replace procedure List2Table( in ValueList long nvarchar )
begin
   select item
   from openstring(value ValueList)
        with ("item" nchar(40))
        option (delimited by ';' row delimited by ',') as "items";
end;
Then use this procedure in your query:
select list(foo_nm) from my_table where foo_id in (select item from List2Table(s_id_lst))
Thank you. I was also not aware of openstring(...). It looks like you are using openstring(...) to the same end as sa_split_list(...) in the prior answer. I would say sa_split_list(...) is a bit simpler, but this definitely works.
In addition to my ARRAY approach, here's a "less cunning" version with a simple comma-separated list of values as a string, using the built-in sa_split_list() procedure to turn that into a derived table. This should work with SQL Anywhere 10 and above:
create or replace procedure SP_FilterViaList(s_id_list long varchar)
result (list_of_ids long varchar)
begin
   select list(foo_num order by foo_num)
   from
      -- dummy table via sa_rowgenerator()
      (select row_num as foo_num from sa_rowgenerator(1, 100)) MyDummyTable
      inner join sa_split_list(s_id_list, ',') SP on MyDummyTable.foo_num = SP.row_value;
end;

-- Use a comma-separated list of values
call SP_FilterViaList('9, 8, 7');
select * from SP_FilterViaList('1, 2, 3');
Thanks, Volker. I really like this one - it is much simpler. I was also not aware of sa_split_list(...). For the record, the following is a working solution to my original question example - and it does not require sa_rowgenerator() with an estimate of a maximum number of rows to generate:
begin
declare s_id_lst char (30);
--
set s_id_lst = '3,4';
--
select
list(mt.foo_nm)
from
sa_split_list(s_id_lst, ',') sp join my_table mt
on (sp.row_value = mt.foo_id)
;
end;
"and it does not require sa_rowgenerator() with an estimate of a maximum number of rows to generate"
It's a misunderstanding: I just used sa_rowgenerator() to generate a dummy table so the variable list of values could be tested against something. I could just as well have used a regular table with some values in my sample, and of course you would use your own table.
That being said, I also think this is easier than the ARRAY approach. I very frequently use sa_split_list() but rarely use arrays, so the ARRAY approach was more fun on my part... 🙂
My method is a bit different, but I use it a lot, often building lists from user queries/input. It works well and allows joins on multiple fields rather than just one. The index speeds things up if the input list is big.
declare local temporary table "zt_tmp" (
   "myid" integer not null default -1
) not transactional;
create index if not exists zt_tmp_id on zt_tmp(myid asc);
-- insert your values into the tmp table
select * from my_table as MT join "zt_tmp" as Z on MT.foo_id = Z.myid;
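As one way to fill in the "insert your values" step above, here is a hedged sketch that populates the temporary table from a comma-separated string via sa_split_list(); the literal '3,4' and the cast to integer are just illustrative assumptions:

-- populate the temp table, e.g. from a comma-separated string of IDs
insert into "zt_tmp"( "myid" )
select cast(row_value as integer)
from sa_split_list('3,4', ',');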
Another way to achieve this is by using execute immediate:
begin
declare s_id_lst char (30);
--
set s_id_lst = '3,4';
--
execute immediate
   'select ' ||
   'list(foo_nm) ' ||
   'from ' ||
   'my_table ' ||
   'where ' ||
   'foo_id in (' || s_id_lst || ')';
end;
You could also create a long varchar variable for constructing the SQL statement and then use that variable to execute the statement:
execute immediate variable_Statement;
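For example, a minimal sketch of that variation, reusing the names from the earlier snippets (s_id_lst, variable_Statement and my_table are carried over as assumptions):

begin
declare s_id_lst char (30);
declare variable_Statement long varchar;
set s_id_lst = '3,4';
-- build the statement into a variable, then execute it
set variable_Statement =
   'select list(foo_nm) from my_table where foo_id in (' || s_id_lst || ')';
execute immediate variable_Statement;
end;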
Thank you, but my question did say, "without resorting to dynamic SQL".