Tuning Postgres for single user manipulating large amounts of data

From: Paul Taylor
Subject: Tuning Postgres for single user manipulating large amounts of data
Date:
Msg-id: 4D00CAD4.6090307@fastmail.fm
Replies: Re: Tuning Postgres for single user manipulating large amounts of data  (Andy Colson <andy@squeakycode.net>)
         Re: Tuning Postgres for single user manipulating large amounts of data  (Scott Marlowe <scott.marlowe@gmail.com>)
List: pgsql-general
Hi, I'm using Postgres 8.3 on a MacBook Pro laptop.
I'm using the database with just one connection to build a Lucene
search index from some of the data, and I'm trying to improve
performance. The key thing is that I'm only a single user, but I'm
manipulating large amounts of data, i.e. processing tables with up to 10
million rows in them, so I think I want to configure Postgres so that it
can create large temporary tables in memory.
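For context, the settings most relevant to keeping temporary data in memory are work_mem, temp_buffers and maintenance_work_mem, and for a single-connection workload they can be raised per session rather than in postgresql.conf. A minimal sketch (the values here are illustrative guesses for a 2GB machine, not recommendations):

```sql
-- Per-session overrides for one bulk-processing connection.
-- Values are illustrative; tune to your actual workload and RAM.
SET work_mem = '256MB';              -- memory per sort/hash operation
SET temp_buffers = '256MB';          -- buffers for session temporary tables
SET maintenance_work_mem = '512MB';  -- used by CREATE INDEX and VACUUM
```

Because these are session-local, they avoid the shared-memory (SHMMAX) limits that apply to shared_buffers.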

I've tried changing various parameters such as shared_buffers, work_mem
and checkpoint_segments, but I don't really understand what the values
mean, and the documentation seems to be aimed at configuring for
multiple users, so my changes make things worse. For example, my machine
has 2GB of memory, and I read that on a dedicated server you should
set shared memory to 40% of total memory, but when I increase it to more
than 30MB Postgres will not start, complaining about my SHMMAX limit.
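For reference, on Mac OS X of that era the SysV shared-memory ceiling can be raised via sysctl settings in /etc/sysctl.conf; the values below are an illustrative sketch (assuming you want roughly 750MB available), not verified recommendations:

```
# /etc/sysctl.conf -- raise SysV shared memory limits (values illustrative)
kern.sysv.shmmax=786432000    # max bytes per segment; must be a multiple of 4096
kern.sysv.shmall=192000       # total shared memory in 4kB pages (786432000 / 4096)
```

After rebooting (or applying the settings with sysctl -w as root), Postgres should be able to start with a larger shared_buffers.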

Paul
