Re: Better way to bulk-load millions of CSV records into postgres?

From: Joel Burton
Subject: Re: Better way to bulk-load millions of CSV records into postgres?
Date:
Msg-id: JGEPJNMCKODMDHGOBKDNGEPMCOAA.joel@joelburton.com
In reply to: Better way to bulk-load millions of CSV records into postgres?  (Ron Johnson <ron.l.johnson@cox.net>)
List: pgsql-novice
> -----Original Message-----
> From: pgsql-novice-owner@postgresql.org
> [mailto:pgsql-novice-owner@postgresql.org]On Behalf Of Ron Johnson
> Sent: Tuesday, May 21, 2002 4:40 PM
> To: PgSQL Novice ML
> Subject: [NOVICE] Better way to bulk-load millions of CSV records into
> postgres?
>
>
> Hi,
>
> Currently, I've got a Python script using pyPgSQL that
> parses the CSV records, builds a big
> "INSERT INTO ... VALUES (...)" command, and then execute()s it.
>
> top shows that this method uses postmaster with ~70% CPU
> utilization, and python with ~15% utilization.
>
> Still, it's only inserting ~190 recs/second.  Is there a
> better way to do this, or am I constrained by the hardware?
>
> Instead of Python and postmaster having to do a ton of data
> xfer over sockets, I'm wondering if there's a way to send a
> large number of csv records (4000, for example) in one big
> chunk to a stored procedure and have the engine process it
> all.

You could change your Python script to output a COPY command, which is
*much* faster than individual INSERT commands.
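For example, here's a quick sketch of one way to do that: have the script
emit a complete COPY ... FROM stdin statement that you can pipe straight
into psql. The table name "mytable" and its columns are just placeholders,
not anything from your schema; adapt as needed.

#!/usr/bin/env python
# Sketch: turn a CSV file into a COPY statement on stdout, e.g.:
#     python csv2copy.py data.csv | psql mydb
# "mytable" and its columns are placeholders -- adjust for your schema.
import csv
import sys

def emit_copy(path):
    print("COPY mytable (col_a, col_b, col_c) FROM stdin;")
    for row in csv.reader(open(path)):
        # COPY's default text format is tab-delimited; backslashes,
        # tabs, and newlines in the data must be escaped, and NULLs
        # written as \N (here, empty fields are assumed to be NULL).
        cols = []
        for col in row:
            if col == "":
                cols.append(r"\N")
            else:
                col = col.replace("\\", "\\\\")
                col = col.replace("\t", "\\t").replace("\n", "\\n")
                cols.append(col)
        print("\t".join(cols))
    print("\\.")  # end-of-data marker for COPY

if __name__ == "__main__":
    emit_copy(sys.argv[1])

COPY tends to be so much faster because the server parses one statement
and streams the rows, instead of parsing and planning millions of
individual INSERTs.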

- J.

Joel BURTON | joel@joelburton.com | joelburton.com | aim: wjoelburton
Knowledge Management & Technology Consultant

