Tuning Postgres for single user manipulating large amounts of data

From: Paul Taylor
Subject: Tuning Postgres for single user manipulating large amounts of data
Date:
Msg-id: 4D00B9F6.2060409@fastmail.fm
Replies: Re: Tuning Postgres for single user manipulating large amounts of data (tv@fuzzy.cz)
List: pgsql-general
Hi, I'm using Postgres 8.3 on a MacBook Pro laptop.
I'm using the database with just one connection to build a Lucene
search index from some of the data, and I'm trying to improve
performance. The key point is that I'm only a single user, but I'm
manipulating large amounts of data, i.e. processing tables with up to
10 million rows, so I think I want to configure Postgres so that it
can create large temporary tables in memory.
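To make it concrete, this is the sort of single-user configuration I
have in mind; the values below are guesses for a 2GB machine, not
settings I know to be right:

    # postgresql.conf -- guessed single-user bulk-work settings (2GB RAM)
    shared_buffers = 512MB         # needs the SHMMAX limit raised first, see below
    work_mem = 128MB               # per sort/hash operation; safe with one connection
    maintenance_work_mem = 256MB   # speeds up CREATE INDEX and VACUUM
    temp_buffers = 256MB           # per-session buffers for temporary tables
    checkpoint_segments = 32       # fewer, larger checkpoints during bulk writes
    fsync = off                    # only acceptable if the data can be rebuilt after a crash

My reasoning is that with only one connection open, work_mem and
temp_buffers can presumably be set far higher than the multi-user
guides suggest, since only one sort or temporary table will be using
them at a time.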

I've tried changing various parameters such as shared_buffers, work_mem
and checkpoint_segments, but I don't really understand what their values
should be, and the documentation seems to be aimed at configuring for
multiple users; so far my changes have only made things worse. For
example, my machine has 2GB of memory, and I read that on a dedicated
server you should set shared memory to 40% of total memory, but when I
increase shared_buffers beyond 30MB, Postgres will not start,
complaining about my SHMMAX limit.
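From what I can tell, the SHMMAX failure is an OS limit rather than a
Postgres one, and on Mac OS X it can be raised through sysctl. I'm
assuming something like this would allow roughly 512MB of shared
memory (the figure is just a guess to match the sketch above):

    # /etc/sysctl.conf -- raise System V shared memory limits on Mac OS X
    kern.sysv.shmmax=536870912    # max segment size in bytes (512MB, a multiple of 4096)
    kern.sysv.shmall=131072       # total shared memory in 4kB pages (536870912 / 4096)

After a reboot (or applying the same values with sudo sysctl -w),
shared_buffers should be able to go past the 30MB that currently fails.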

Paul
