Bulk data insertion

From: Jonathan Daugherty
Subject: Bulk data insertion
Date:
Msg-id: 41A7AECE.5020706@commandprompt.com
Replies: Re: Bulk data insertion
List: pgsql-general
Hello,

I have a PL/PgSQL function that I need to call with some ARRAY
parameters.  These array values are very large -- typically thousands of
elements.  Each element is a 4-element array.  This function is called
to do some sanity checking on the array data and use the individual
elements to do inserts where appropriate.

The problem is that I don't want to spend a lot of time and memory
building such a query (in C).  I would like to know if there is a way to
take this huge chunk of data and get it into the database in a less
memory-intensive way.  I suppose I could use COPY to put the data into a
table with triggers that would do the checks on the data, but it seems
inelegant and I'd like to know if there's a better way.
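
For what it's worth, the COPY-plus-trigger route mentioned above could look
roughly like this; the staging table, trigger, and check are made up for
illustration:

  CREATE TABLE staging (col1 text, col2 text, col3 text, col4 text);

  CREATE OR REPLACE FUNCTION staging_check() RETURNS trigger AS $$
  BEGIN
      -- Sanity check (illustrative): drop rows with a missing key.
      IF NEW.col1 IS NULL THEN
          RETURN NULL;
      END IF;
      INSERT INTO target_table (col1, col2, col3, col4)
      VALUES (NEW.col1, NEW.col2, NEW.col3, NEW.col4);
      RETURN NULL;  -- returning NULL keeps the staging table itself empty
  END;
  $$ LANGUAGE plpgsql;

  CREATE TRIGGER staging_filter BEFORE INSERT ON staging
      FOR EACH ROW EXECUTE PROCEDURE staging_check();

  -- The client then streams rows instead of building one giant query:
  COPY staging FROM STDIN;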

Thoughts?  Thanks for your time.

--
   Jonathan Daugherty
   Command Prompt, Inc. - http://www.commandprompt.com/
   PostgreSQL Replication & Support Services, (503) 667-4564
