Re: INSERTing lots of data

From: Craig Ringer
Subject: Re: INSERTing lots of data
Date:
Msg-id: 4BFFBD4B.3070708@postnewspapers.com.au
In response to: INSERTing lots of data  (Joachim Worringen <joachim.worringen@iathh.de>)
Responses: Re: INSERTing lots of data  (Joachim Worringen <joachim.worringen@iathh.de>)
List: pgsql-general
On 28/05/10 17:41, Joachim Worringen wrote:
> Greetings,
>
> my Python application (http://perfbase.tigris.org) repeatedly needs to
> insert lots of data into an existing, non-empty, potentially large table.
> Currently, the bottleneck is with the Python application, so I intend to
> multi-thread it.

That may not be a great idea. For why, search for "Global Interpreter
Lock" (GIL).

It might help if Python's mostly blocked on network I/O, as the GIL is
released when Python blocks on the network, but still, your results may
not be great.
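That behaviour is easy to observe directly: CPython releases the GIL while
a thread is blocked in a system call, so I/O waits overlap even though
CPU-bound bytecode does not. A minimal sketch (my own illustration, with
time.sleep() standing in for waiting on a database round trip):

```python
import threading
import time

def blocked_worker():
    # Stand-in for a network round trip to the database; the GIL is
    # released for the duration of the blocking call.
    time.sleep(0.2)

start = time.monotonic()
threads = [threading.Thread(target=blocked_worker) for _ in range(5)]
for t in threads:
    t.start()
for t in threads:
    t.join()
elapsed = time.monotonic() - start

# Five 0.2 s waits finish in roughly 0.2 s of wall time, not 1.0 s,
# because the threads overlap while blocked.
print(f"elapsed: {elapsed:.2f}s")
```

If the per-row Python work (parsing, formatting) dominates instead of the
network wait, threads won't help and separate processes are the usual
workaround.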

> will I get a speedup? Or will table-locking serialize things on the
> server side?

Concurrent inserts work *great* with PostgreSQL, it's Python I'd be
worried about.


--
Craig Ringer
