PostgreSQL unsuitable for large datasets?

From Michael Labhard
Subject PostgreSQL unsuitable for large datasets?
Date
Msg-id akr31g$7mo$1@main.gmane.org
Whole thread Raw
Responses Re: PostgreSQL unsuitable for large datasets?
List pgsql-cygwin
    We have a PostgreSQL (7.2.1) database on Cygwin with one table that
contains some 2 x 10**6 records.  The database is about 100 MB in total.
This does not seem very large, but when attempting an INSERT into an empty
table using a SELECT that joins against all the rows of the large table,
the error

        Memory exhausted in AllocSetAlloc(36)

occurs.  Reviewing posts on this subject as well as the PostgreSQL FAQ did
not reveal any specific solution.  ulimit -a reports:

$ ulimit -a
core file size        (blocks, -c) unlimited
data seg size         (kbytes, -d) unlimited
file size             (blocks, -f) unlimited
open files                    (-n) 256
pipe size          (512 bytes, -p) 8
stack size            (kbytes, -s) 2046
cpu time             (seconds, -t) unlimited
max user processes            (-u) 63
virtual memory        (kbytes, -v) 2097152

Virtual memory would appear to be adequate.  Increasing the SORT_MEM
parameter in postgresql.conf did not help.  In the end we had to partition
the large table into 10 parts and process each part individually.
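For reference, the workaround looked roughly like the following sketch.  The
table and column names here are hypothetical (the original schema was not
posted); the point is simply to copy the join result in bounded key ranges
rather than in one statement:

```sql
-- Hypothetical schema: big_table holds ~2,000,000 rows keyed by id.
-- Rather than one INSERT ... SELECT over the whole join, copy it in
-- slices so the backend never has to hold the full result at once:

INSERT INTO target_table
SELECT b.id, b.payload
FROM   big_table b
JOIN   other_table o ON o.id = b.id
WHERE  b.id >= 0 AND b.id < 200000;

-- ...then repeat with the next id range (200000 to 400000, and so on)
-- until all ten slices have been copied.
```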

    It seems this problem would need a ready and definitive solution before
PostgreSQL could be used for large, serious databases.  How large are the
databases with which others are using PostgreSQL?  How do others avoid this
error with large datasets?

-- Michael

In the pgsql-cygwin list, by date sent:

Previous
From: "Michael Labhard"
Date:
Message: Re: Why sequential scan with index present?
Next
From: Jason Tishler
Date:
Message: Re: PostgreSQL unsuitable for large datasets?