Re: extreme memory use when loading in a lot of data

From: Stephan Szabo
Subject: Re: extreme memory use when loading in a lot of data
Date:
Msg-id: 20040521144734.V56577@megazone.bigpanda.com
In response to: extreme memory use when loading in a lot of data  (Vivek Khera <khera@kcilink.com>)
Responses: Re: extreme memory use when loading in a lot of data  (Vivek Khera <khera@kcilink.com>)
List: pgsql-general
On Fri, 21 May 2004, Vivek Khera wrote:

> I have some historic data that I want to analyze.  To do this I set up
> postgres on a spare box I picked up for cheap, which just lucked into
> having tons of RAM (1.5G).   I set up postgres to use 10000 buffers,
> and recompiled the kernel to allow 2Gb data size limit per process.
>
> Since this is historical data, I'm actually merging a couple of dumps
> that span the time range.  I've dealt with eliminating any conflicting
> data (ie, clashing unique keys) but I'm not 100% sure that the foreign
> key constraints are all met.  Thus, when loading the data from the
> second dump, I am leaving the FK triggers on.

I'd suggest dropping the constraints, loading the data, and then adding the
constraints back. If you're using 7.4, checking the constraint that way will
be faster, and if the constraint isn't satisfied you'll need to remove the
offending rows and recreate the constraint, but that's still better than
having to reimport.
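A minimal sketch of that sequence, with hypothetical table, column, and constraint names (substitute your own schema):

```sql
-- Drop the FK before the bulk load so no per-row trigger fires.
ALTER TABLE orders DROP CONSTRAINT orders_customer_id_fkey;

-- ... bulk-load the data here, e.g. with COPY or the dump's INSERTs ...

-- Re-adding the constraint validates all existing rows in one pass,
-- which on 7.4 is done with a single check query rather than
-- one trigger invocation per inserted row.
ALTER TABLE orders
  ADD CONSTRAINT orders_customer_id_fkey
  FOREIGN KEY (customer_id) REFERENCES customers (id);
```

If the ADD CONSTRAINT fails, the load itself is still committed; only the constraint has to be retried after cleaning up the bad rows.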

> Now, this is where my trouble has begun... On importing row 29,796,801
> for the first big table, I get this (after 27 hours!):

I'd wonder if some large portion of the memory is the deferred trigger
queue which doesn't yet spill over to disk when it gets too large.
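Loading with the FK triggers dropped (as above) sidesteps that queue entirely; referential integrity can then be checked by hand before re-adding the constraint. A sketch of such a check, again with hypothetical names, using an anti-join to list child rows with no matching parent:

```sql
-- Child rows whose key has no match in the parent table.
-- NULL keys are excluded because FK constraints don't check them.
SELECT o.*
FROM orders o
LEFT JOIN customers c ON c.id = o.customer_id
WHERE c.id IS NULL
  AND o.customer_id IS NOT NULL;
```

Deleting (or fixing) whatever this returns should let the subsequent ADD CONSTRAINT succeed.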

In pgsql-general by date sent:

Previous
From: "Carl E. McMillin"
Date:
Message: Re: Am I locking more than I need to?
Next
From: Tom Lane
Date:
Message: Re: extreme memory use when loading in a lot of data