Re: Support allocating memory for large strings

From: Jose Luis Tallon
Subject: Re: Support allocating memory for large strings
Date:
Msg-id: cccaf443-ce80-c66e-c52b-e89b0677a7ed@adv-solutions.net
In response to: Support allocating memory for large strings  (Maxim Zibitsker <max.zibitsker@gmail.com>)
List: pgsql-hackers
On 8/11/25 3:15, Maxim Zibitsker wrote:
> PostgreSQL's MaxAllocSize limit prevents storing individual variable-length
> character strings exceeding ~1GB, causing "invalid memory alloc request size"
> errors during INSERT operations on tables with large text columns. Example
> reproduction included in artifacts.md.
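For context, the ~1GB MaxAllocSize ceiling is easy to hit without any special setup; a hypothetical sketch (table name made up):

```sql
-- Any single varlena value (text, bytea, ...) is capped at ~1GB (MaxAllocSize).
CREATE TABLE t (payload text);

-- Building a value past that cap fails before the row is ever stored:
INSERT INTO t SELECT repeat('x', 1200 * 1000 * 1000);
-- fails with: invalid memory alloc request size ...
```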
 

Tom Lane's very appropriate response notwithstanding....

a) Why is this a problem? (Please share a bit more about your intended 
use case)

b) Why would someone need to store >1GB worth of TEXT (in a single 
string, no less!) in a column in an (albeit very flexible) Relational 
Database?

     (I'm assuming there is no internal structure that would allow such 
an amount of text to be split/spread over multiple records.)

c) There exist Large Objects (LOs), intended precisely for this use... 
why is this mechanism not a good solution for your need?
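For reference, a minimal sketch of the server-side Large Object approach (the `lo_*` functions from the PostgreSQL manual); table and column names here are made up:

```sql
-- Keep only an OID reference in the row; the data lives as a Large Object.
CREATE TABLE documents (
    id      serial PRIMARY KEY,
    name    text,
    content oid          -- reference to the large object, not the data itself
);

-- Create a large object from in-band data (lo_from_bytea), or import a
-- server-side file with lo_import():
INSERT INTO documents (name, content)
VALUES ('big-doc', lo_from_bytea(0, 'payload bytes here'::bytea));

-- Large objects are read and written in chunks, so the ~1GB single-value
-- limit does not apply to the object as a whole:
SELECT lo_get(content, 0, 1024) FROM documents WHERE name = 'big-doc';

-- Unlink the object when the row goes away (this is not automatic):
SELECT lo_unlink(content) FROM documents WHERE name = 'big-doc';
```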

d) Wouldn't a (journalling) File System (with a slim abstraction layer 
on top for directory hashing/indexing) be a better solution for this 
particular application?

     Full Text Search on the stored data doesn't look like it would ever 
be performant... there exist specialized tools for that.


And... how did you get "invalid" data into the database, that pg_dump 
wouldn't process, in the first place? (Maybe I'm just speculating/projecting 
and didn't pick up the nuance properly.)


Mostly curious about the problem / intended use case.... when we 
explored limits and limitations in Postgres almost 15 years ago, we 
never even considered this :o



Thanks,

-- 
Parkinson's Law: Work expands to fill the time allotted to it.



