Thread: Loading lots of data in a SQL command


Loading lots of data in a SQL command

From: frank church
Date:
I am loading lots of data via SQL into a database, and wrapping it in
transactions speeds it up.

However, this fails a number of times. The query results are logged, so it is
easy for me to find problem records.

However, a single failure causes the whole transaction to fail.

Is there a setting or feature which allows the same performance as
transactions without causing the whole process to fail, like a delayed
update or write mechanism of some sort?

It is something I would like to set in that particular data load.


Frank



----------------------------------------------------------------
This message was sent using IMP, the Internet Messaging Program.



Re: Loading lots of data in a SQL command

From: Richard Huxton
Date:
frank church wrote:
> I am loading lots of data via SQL into a database, and wrapping it in
> transactions speeds it up.
> 
> However, this fails a number of times. The query results are logged, so it
> is easy for me to find problem records.
> 
> However, a single failure causes the whole transaction to fail.
> 
> Is there a setting or feature which allows the same performance as
> transactions without causing the whole process to fail, like a delayed
> update or write mechanism of some sort?

Not as it stands. I tend to use a small Perl wrapper myself that loads
in batches of, e.g., 10,000 rows and, if there is an error, deals with it
separately.
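The batching approach Richard describes can be sketched roughly as follows. This is not his Perl wrapper; it is a hypothetical Python illustration, using SQLite in place of PostgreSQL and an assumed `items` table, purely to show the pattern: insert rows in transactional batches, and when a batch fails, roll it back and retry its rows one at a time so only the genuinely bad rows are skipped.

```python
import sqlite3


def load_in_batches(conn, rows, batch_size=10000):
    """Insert rows in transactional batches. If a batch fails, it is
    rolled back and its rows are retried individually, so a single bad
    row no longer sinks the whole load. Returns the rows that failed."""
    bad_rows = []
    cur = conn.cursor()
    for start in range(0, len(rows), batch_size):
        batch = rows[start:start + batch_size]
        try:
            with conn:  # one transaction per batch (commit or rollback)
                cur.executemany(
                    "INSERT INTO items (id, name) VALUES (?, ?)", batch)
        except sqlite3.DatabaseError:
            # The whole batch was rolled back; isolate the problem rows.
            for row in batch:
                try:
                    with conn:  # one transaction per retried row
                        cur.execute(
                            "INSERT INTO items (id, name) VALUES (?, ?)", row)
                except sqlite3.DatabaseError:
                    bad_rows.append(row)
    return bad_rows


conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE items (id INTEGER PRIMARY KEY, name TEXT)")
# The duplicate id 2 makes its batch fail; only that row is skipped.
rows = [(1, "a"), (2, "b"), (2, "dup"), (3, "c")]
failed = load_in_batches(conn, rows, batch_size=2)
```

Larger batches keep most of the transaction speed-up; the per-row fallback only pays its cost on the rare failing batch.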

I seem to recall it being discussed as a built-in feature recently,
though, so there might be someone working on it for a future version.

> It is something I would like to set in that particular data load.

You might find that the "pgloader" project meets your needs exactly:
http://pgfoundry.org/projects/pgloader/

-- 
  Richard Huxton
  Archonet Ltd