On Fri, Jul 14, 2023 at 1:13 PM Tom Lane <tgl@sss.pgh.pa.us> wrote:
> Pavel Stehule <pavel.stehule@gmail.com> writes:
>> On Fri, Jul 14, 2023 at 9:36 AM Shaozhong SHI <shishaozhong@gmail.com> wrote:
>>> On Fri, 14 Jul 2023 at 08:14, Pavel Stehule <pavel.stehule@gmail.com> wrote:
>>>> It depends on what you do. Postgres doesn't allow allocating blocks
>>>> bigger than 1GB. Maybe you are creating too big a string or too big a
>>>> value of some other type. But it can also be a signal of some cache
>>>> bloating.
>>
>> The best way is to do it all in one recursive query, without any
>> recursive function. You can fix this issue only by rewriting your code.
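For illustration, this is the kind of rewrite meant above: instead of calling a recursive function per row, walk the whole hierarchy in a single recursive CTE. The table and column names here are hypothetical, just a sketch:

```sql
-- Hypothetical: traverse a parent/child hierarchy in one query,
-- rather than invoking a recursive plpgsql function for each node.
WITH RECURSIVE subtree AS (
    SELECT id, parent_id, name
    FROM nodes
    WHERE parent_id IS NULL        -- start from the roots
  UNION ALL
    SELECT n.id, n.parent_id, n.name
    FROM nodes n
    JOIN subtree s ON n.parent_id = s.id   -- descend one level per iteration
)
SELECT * FROM subtree;
```

A single query like this also lets the planner and executor manage memory for the whole traversal, instead of accumulating state across many function calls.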
> Yeah, but nonetheless this error message is pretty user-unfriendly.
>
> The given example is too incomplete to run as-is, but I guessed that
> maybe the array_agg() was accumulating too many values, and sure
> enough it's possible to reproduce:
>
> regression=# select array_agg(x::text) from generate_series(1,100000000) x;
> ERROR:  invalid memory alloc request size 1073741824
>
> We can do better than that. The attached patch causes the error to be
>
> regression=# select array_agg(x::text) from generate_series(1,100000000) x;
> ERROR:  array size exceeds the maximum allowed (1073741823)
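As an aside, a common way to avoid hitting the limit at all is to aggregate in bounded chunks, so that no single array approaches the 1GB allocation ceiling. A sketch against the same generate_series() source (the chunk size is arbitrary):

```sql
-- Sketch: group the input so each array stays well below the 1GB
-- allocation limit, instead of building one giant array.
SELECT x / 1000000 AS chunk,
       array_agg(x::text) AS vals
FROM generate_series(1, 100000000) x
GROUP BY x / 1000000;
```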
> I'm not wedded to that wording, but it's an existing translatable
> string that at least points you in the direction of "my array is too
> big". (This is also what you get if the eventual array construction
> overruns the 1G limit, cf construct_md_array().)

+1

It is significantly better.

Regards

Pavel