Reputation: 5921
Here is a lateral query which is part of a bigger query:
lateral (
    select
        array_agg(sh.dogsfilters) filter (where sh.dogsfilters is not null) as dependencyOfFoods
    from
        shelter sh
    where
        sh.shelterid = ${shelterid}
) filtersOfAnimals,
The problem is with array_agg, as it fails when the aggregated arrays have different lengths, for example [[7, 9], [7, 9, 8], [8]].
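A minimal reproduction of the failure on made-up literal arrays (no tables involved):
-- array_agg of array values builds a rectangular multidimensional array,
-- so inputs of different lengths make it fail:
select array_agg(x)
from (values (array[7, 9]), (array[7, 9, 8]), (array[8])) t(x);
-- fails: the arrays cannot form a rectangular multidimensional array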
The problem is easy to solve using json_agg, but later in the query there is an any check like this:
...
where
    cd.dogsid = any(filtersOfAnimals.dependencyOfFoods)
    and
...
But any will not work on the json data produced by json_agg, so I can't simply use it in place of array_agg.
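To illustrate the mismatch (the literal values here are just examples):
-- any() requires a real array on its right-hand side:
select 8 = any(array[7, 9, 8]);        -- works, returns true
-- a value built by json_agg is of type json, not an array type, so the
-- equivalent comparison is rejected:
-- select 8 = any('[7, 9, 8]'::json);  -- error: json is not an array type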
What might be a better solution to this?
Upvotes: 0
Views: 1679
Reputation: 44383
In PostgreSQL, you can define your own aggregates. I think that this one does what you want:
create function array_concat_agg_tran(anyarray,anyarray) returns anyarray language sql
as $$ select $1||$2 $$;
create aggregate array_concat_agg(anyarray) (sfunc=array_concat_agg_tran, stype=anyarray);
Then:
select array_concat_agg(x) from (values (ARRAY[1,2]),(ARRAY[3,4,5])) f(x);
array_concat_agg
------------------
{1,2,3,4,5}
With a bit more work, you could make it parallelizable as well.
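A minimal sketch of a parallel-safe variant, assuming PostgreSQL 9.6+ and reusing the transition function as the combine function (drop the earlier definition first):
-- mark the function immutable and parallel safe, and declare a combine
-- function so that partial states from parallel workers can be merged:
create or replace function array_concat_agg_tran(anyarray, anyarray)
    returns anyarray
    language sql immutable parallel safe
as $$ select $1 || $2 $$;

drop aggregate if exists array_concat_agg(anyarray);
create aggregate array_concat_agg(anyarray) (
    sfunc       = array_concat_agg_tran,
    stype       = anyarray,
    combinefunc = array_concat_agg_tran,
    parallel    = safe
);
Note that with parallel aggregation the element order of the result is no longer deterministic.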
Upvotes: 1
Reputation: 1271111
Unnest the arrays and re-aggregate:
lateral
(select array_agg(dogfilter) filter (where dogfilter is not null) as dependencyOfFoods
from shelter sh cross join
unnest(sh.dogsfilters) dogfilter
where sh.shelterid = ${shelterid}
) filtersOfAnimals,
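For illustration, the same unnest-then-aggregate idea on literal arrays (table-free, so it runs as-is):
-- each input array is unnested into scalar elements, which array_agg then
-- collects into one flat integer[]:
select array_agg(v) as flattened
from (values (array[7, 9]), (array[7, 9, 8]), (array[8])) t(x)
     cross join lateral unnest(t.x) as v;
-- result: {7,9,7,9,8,8}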
It is interesting that Postgres doesn't have a function that does this. BigQuery offers array_concat_agg(), which does exactly what you want.
Upvotes: 1
Reputation: 2177
It is ugly, but it works:
regexp_split_to_array(
    array_to_string(
        array_agg(
            array_to_string(value, ',')
        ),
        ','),
    ',')::integer[]
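For example, on literal data (the column name value is assumed to be the integer[] being aggregated):
-- each array becomes a comma-separated string, the strings are joined,
-- and the combined string is split back apart and cast to integer[]:
select regexp_split_to_array(
           array_to_string(
               array_agg(array_to_string(value, ',')),
               ','),
           ',')::integer[] as flattened
from (values (array[7, 9]), (array[7, 9, 8]), (array[8])) t(value);
-- result: {7,9,7,9,8,8}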
I don't know whether this is a valid solution from the performance point of view, though.
Upvotes: 1