Discussion: Bulk data insertion


Bulk data insertion

From:
Jonathan Daugherty
Date:
Hello,

I have a PL/PgSQL function that I need to call with some ARRAY
parameters.  These array values are very large -- typically thousands of
elements.  Each element is a 4-element array.  This function is called
to do some sanity checking on the array data and use the individual
elements to do inserts where appropriate.

The problem is that I don't want to spend a lot of time and memory
building such a query (in C).  I would like to know if there is a way to
take this huge chunk of data and get it into the database in a less
memory-intensive way.  I suppose I could use COPY to put the data into a
table with triggers that would do the checks on the data, but it seems
inelegant and I'd like to know if there's a better way.

Thoughts?  Thanks for your time.

--
   Jonathan Daugherty
   Command Prompt, Inc. - http://www.commandprompt.com/
   PostgreSQL Replication & Support Services, (503) 667-4564

Re: Bulk data insertion

From:
Tom Lane
Date:
Jonathan Daugherty <jdaugherty@commandprompt.com> writes:
> The problem is that I don't want to spend a lot of time and memory
> building such a query (in C).  I would like to know if there is a way to
> take this huge chunk of data and get it into the database in a less
> memory-intensive way.  I suppose I could use COPY to put the data into a
> table with triggers that would do the checks on the data, but it seems
> inelegant and I'd like to know if there's a better way.

Actually I'd say that is the elegant way.  SQL is fundamentally a
set-oriented (table-oriented) language, and forcing it to do things in
an array fashion is just misusing the tool.
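A minimal sketch of the set-oriented approach described above: stream the rows into a staging table with COPY, then do the sanity checks and the final insert as one set-based statement. All table and column names here are illustrative, not from the thread, and the checks are placeholders for whatever validation the function actually performs.

```sql
-- Hypothetical staging table matching the 4-element rows
CREATE TEMP TABLE staging (a integer, b integer, c integer, d integer);

-- Stream the bulk data from the client; from C/libpq this is
-- driven with PQputCopyData rather than building a huge query string
COPY staging FROM STDIN;

-- Set-based sanity checking and insertion in a single statement,
-- instead of looping over thousands of array elements
INSERT INTO target (a, b, c, d)
SELECT a, b, c, d
FROM staging
WHERE a IS NOT NULL
  AND b > 0;        -- illustrative checks only
```

Per-row validation could equally live in a BEFORE INSERT trigger on the staging table, as the original poster suggested; either way the data moves as a set rather than as an array parameter.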

            regards, tom lane