Re: Inserting streamed data

From: Doug McNaught
Subject: Re: Inserting streamed data
Date:
Msg-id: m34rb2bhdo.fsf@varsoon.wireboard.com
In reply to: Inserting streamed data  (Kevin Old <kold@carolina.rr.com>)
List: pgsql-general
Kevin Old <kold@carolina.rr.com> writes:

> I have data that is streamed to my server and stored in a text file.  I
> need to get that data into my database as fast as possible.  There are
> approximately 160,000 rows in this text file.  I understand I can use
> the COPY command to insert large chunks of data from a text file, but I
> can't use it in this situation.  Each record in the text file has 502
> "fields".  I pull out 50 of those.  I haven't found a way to manipulate
> the COPY command to pull out the values I need.  So that solution would
> be out.
>
> I have a perl script that goes through the file and pulls out the 50
> fields, then inserts them into the database, but it seems to be very
> slow.  I think I just need some minor performance tuning, but don't know
> which variables to set in the postgresql.conf file that would help with
> the speed of the inserts.

First: are you batching up multiple INSERTS in a transaction?  If you
don't, it will be very slow indeed.
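
A rough sketch of that batching with DBI and DBD::Pg (the table name,
column names, and the '|' delimiter are placeholders, not anything from
your actual setup):

    use DBI;

    # AutoCommit => 0 holds all the inserts in one transaction until
    # the explicit commit at the end.
    my $dbh = DBI->connect('dbi:Pg:dbname=mydb', 'user', 'pass',
                           { AutoCommit => 0, RaiseError => 1 });

    # Prepare once, execute once per row.
    my $sth = $dbh->prepare(
        'INSERT INTO stream_data (field1, field2) VALUES (?, ?)');

    open my $fh, '<', 'stream.txt' or die "open: $!";
    while (my $line = <$fh>) {
        chomp $line;
        my @f = split /\|/, $line;   # 502 fields per record
        $sth->execute(@f[0, 1]);     # insert only the fields you need
    }
    close $fh;

    $dbh->commit;                    # one commit for all 160,000 rows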

Second, why not have the Perl script pull out the fields you want,
paste them together and feed them to COPY?  That should eliminate the
parse overhead of multiple INSERTS.
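
Recent DBD::Pg versions let the script feed COPY directly over the
existing connection via pg_putcopydata/pg_putcopyend, so no temporary
file is needed.  A rough sketch, with the same placeholder names as
above:

    use DBI;

    my $dbh = DBI->connect('dbi:Pg:dbname=mydb', 'user', 'pass',
                           { RaiseError => 1 });

    # Start a COPY, then stream tab-separated rows into it; the
    # server parses COPY data far more cheaply than 160,000 INSERTs.
    $dbh->do('COPY stream_data (field1, field2) FROM STDIN');

    open my $fh, '<', 'stream.txt' or die "open: $!";
    while (my $line = <$fh>) {
        chomp $line;
        my @f = split /\|/, $line;
        $dbh->pg_putcopydata(join("\t", @f[0, 1]) . "\n");
    }
    close $fh;

    $dbh->pg_putcopyend;   # tell the server the COPY data is complete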

-Doug

In the pgsql-general list, by message date:

Previous message
From: Chris Gamache
Message: Creating a unique identifier...

Next message
From: "David Blood"
Message: Re: Inserting streamed data