reducing IO and memory usage: sending the content of a table to multiple files

From: Ivan Sergio Borgonovo
Subject: reducing IO and memory usage: sending the content of a table to multiple files
Date:
Msg-id: 20090402112002.1e7b5e43@dawn.webthatworks.it
Responses: Re: reducing IO and memory usage: sending the content of a table to multiple files  (Sam Mason <sam@samason.me.uk>)
List: pgsql-general
This is the workflow I have in mind:

1a) take out *all* the data from a table in chunks (M records per
file, or one big file?) (\copy?, or from inside a scripting language?)

2a) process each file with awk to produce N files that are very
similar to each other (essentially turning them into very simple XML)
3a) gzip them

2b) or use a scripting language to process and gzip them in one pass,
saving a bit of disk IO (a rough sketch of this follows below)
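
To make 1a/2a/2b/3a concrete, this is roughly what I had in mind (untested sketch, Python + psycopg2; the DSN, table, column names and chunk size are made up): a named (server-side) cursor so the whole table is never held in client memory, each row turned into trivial XML and written straight into gzipped chunk files, so nothing uncompressed ever touches the disk.

import gzip
import psycopg2
from xml.sax.saxutils import escape

ROWS_PER_FILE = 500_000          # "M records for each file" -- pick to taste
FETCH_BATCH = 10_000             # rows per network round trip

conn = psycopg2.connect("dbname=mydb")       # hypothetical DSN
cur = conn.cursor(name="dump_cur")           # named => server-side cursor
cur.itersize = FETCH_BATCH
cur.execute("SELECT a, b, c FROM verylargetable")

file_no = 0
rows_in_file = ROWS_PER_FILE                 # force opening the first file
out = None
for a, b, c in cur:
    if rows_in_file >= ROWS_PER_FILE:
        if out is not None:
            out.close()
        out = gzip.open("chunk_%04d.xml.gz" % file_no, "wt", encoding="utf-8")
        file_no += 1
        rows_in_file = 0
    # "very simple xml", gzipped in the same pass: no temporary plain files
    out.write("<row><a>%s</a><b>%s</b><c>%s</c></row>\n"
              % (escape(str(a)), escape(str(b)), escape(str(c))))
    rows_in_file += 1
if out is not None:
    out.close()
conn.close()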

Does PostgreSQL offer any contrib module, feature, or technique to
save some IO (and maybe disk space for temporary results)?
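
Or would plain COPY already cover this? COPY (SELECT ...) TO STDOUT (which is what psql's \copy uses under the hood) streams the rows without the server writing any temporary result, and the client could compress on the fly, something like this (untested psycopg2 sketch, made-up names):

import gzip
import psycopg2

conn = psycopg2.connect("dbname=mydb")       # hypothetical DSN
cur = conn.cursor()

# COPY ... TO STDOUT streams rows to the given file-like object instead of
# materialising them client-side; here they go straight into a gzipped
# file, so no uncompressed temporary results hit the disk.
with gzip.open("verylargetable.copy.gz", "wt", encoding="utf-8") as f:
    cur.copy_expert("COPY (SELECT a, b, c FROM verylargetable) TO STDOUT", f)

conn.close()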

Are there any memory usage implications in doing a:
pg_query("select a,b,c from verylargetable; --no where clause");
vs. the \copy equivalent, and is there any way to avoid them?
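
My understanding is that a plain query like the above gets buffered entirely on the client before the first row is available, while COPY and declared cursors stream. Would declaring a cursor and fetching in batches be the way to avoid it? Something like this (again an untested sketch, Python + psycopg2, made-up names):

import psycopg2

conn = psycopg2.connect("dbname=mydb")       # hypothetical DSN
cur = conn.cursor()

# cur.execute("SELECT a, b, c FROM verylargetable") would make the driver
# buffer the whole result set in client RAM before returning a single row.
# Declaring a cursor and FETCHing in batches keeps only one batch around.
cur.execute("DECLARE big_cur NO SCROLL CURSOR FOR "
            "SELECT a, b, c FROM verylargetable")
while True:
    cur.execute("FETCH FORWARD 10000 FROM big_cur")
    rows = cur.fetchall()
    if not rows:
        break
    for row in rows:
        pass                                  # process / write the row here
cur.execute("CLOSE big_cur")
conn.commit()
conn.close()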

thanks

--
Ivan Sergio Borgonovo
http://www.webthatworks.it

