Re: [ADMIN] Processing very large TEXT columns (300MB+) using C/libpq

From: Cory Nemelka
Subject: Re: [ADMIN] Processing very large TEXT columns (300MB+) using C/libpq
Date:
Msg-id: CAMe5Gn0C5r75WLfh1cmUXYr23bSZ+kaddo6Z8HwY0zsUsz1+UA@mail.gmail.com
In reply to: Re: [ADMIN] Processing very large TEXT columns (300MB+) using C/libpq (Cory Nemelka <cnemelka@gmail.com>)
Responses: Re: [ADMIN] Processing very large TEXT columns (300MB+) using C/libpq (Geoff Winkless <pgsqladmin@geoff.dj>)
List: pgsql-admin
All I am doing is iterating through the characters, so I know it isn't my code.

--cnemelka
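
For illustration, here is a minimal sketch of per-character iteration over a value returned by libpq. The table and column names ("mytable", "payload") are placeholders, not from this thread, and the sketch is not a claim about the poster's code; it only shows the usual pattern (compute the length once via PQgetlength) and, in a comment, the classic pitfall that makes a scan of a multi-megabyte value quadratic.

/* Sketch only: iterate over a large TEXT value fetched with libpq.
 * Placeholder names: "mytable", "payload". Build with: cc iterate.c -lpq */
#include <stdio.h>
#include <libpq-fe.h>

int main(void)
{
    PGconn *conn = PQconnectdb("");          /* connection params come from PG* environment */
    if (PQstatus(conn) != CONNECTION_OK) {
        fprintf(stderr, "connect failed: %s", PQerrorMessage(conn));
        return 1;
    }

    PGresult *res = PQexec(conn, "SELECT payload FROM mytable WHERE id = 1");
    if (PQresultStatus(res) != PGRES_TUPLES_OK) {
        fprintf(stderr, "query failed: %s", PQerrorMessage(conn));
        PQclear(res);
        PQfinish(conn);
        return 1;
    }

    const char *val = PQgetvalue(res, 0, 0); /* NUL-terminated text value */
    int len = PQgetlength(res, 0, 0);        /* length in bytes, no strlen() needed */

    /* One pass, length computed once. */
    long letters = 0;
    for (int i = 0; i < len; i++)
        if (val[i] >= 'a' && val[i] <= 'z')
            letters++;

    /* Pattern to avoid: strlen(val) in the loop condition can be
     * re-evaluated on every iteration, turning the scan into O(n^2):
     *   for (size_t i = 0; i < strlen(val); i++) ...                  */

    printf("%d bytes scanned, %ld lowercase letters\n", len, letters);
    PQclear(res);
    PQfinish(conn);
    return 0;
}
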

On Fri, Oct 20, 2017 at 9:14 AM, Cory Nemelka <cnemelka@gmail.com> wrote:
Yes, but I should be able to read them much faster. The psql client can display an 11MB column in a little over a minute, while in C using the libpq library, it takes over an hour.

Does anyone have experience with the same issue who could help me resolve it?

--cnemelka
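
To narrow down where the hour goes, one hedged sketch is to time the fetch alone, with no per-character processing at all, and compare that against psql's minute for the same value. Connection parameters, table, and column names below are placeholders.

/* Sketch: time just the libpq fetch of one large TEXT value.
 * Placeholder names: "mytable", "payload". Build with: cc fetch_time.c -lpq */
#include <stdio.h>
#include <time.h>
#include <libpq-fe.h>

static double now_sec(void)
{
    struct timespec ts;
    clock_gettime(CLOCK_MONOTONIC, &ts);
    return ts.tv_sec + ts.tv_nsec / 1e9;
}

int main(void)
{
    PGconn *conn = PQconnectdb("");   /* uses PG* environment variables */
    if (PQstatus(conn) != CONNECTION_OK) {
        fprintf(stderr, "connect: %s", PQerrorMessage(conn));
        return 1;
    }

    double t0 = now_sec();
    PGresult *res = PQexec(conn, "SELECT payload FROM mytable WHERE id = 1");
    double t1 = now_sec();

    if (PQresultStatus(res) == PGRES_TUPLES_OK)
        printf("fetched %d bytes in %.2f s\n", PQgetlength(res, 0, 0), t1 - t0);
    else
        fprintf(stderr, "query: %s", PQerrorMessage(conn));

    PQclear(res);
    PQfinish(conn);
    return 0;
}

If the fetch itself is fast and only the subsequent processing is slow, the problem is in the client-side loop rather than in libpq or the server.
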

On Thu, Oct 19, 2017 at 5:20 PM, Aldo Sarmiento <aldo@bigpurpledot.com> wrote:
I believe large columns get put into a TOAST table. The maximum page size is 8 kB, so a value of that size is split across many out-of-line chunks that have to be fetched and reassembled: https://www.postgresql.org/docs/9.5/static/storage-toast.html
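
One way to work with TOAST rather than against it, sketched here under assumptions: if the column uses EXTERNAL storage (out of line, uncompressed, via ALTER TABLE ... SET STORAGE EXTERNAL), substring() can read only the TOAST chunks a slice covers, so the value can be processed in pieces instead of being pulled down in one 300MB result. Table and column names are placeholders.

/* Sketch: read a huge TEXT value in 1 MB slices with substring().
 * Placeholder names: "mytable", "payload". Works best with EXTERNAL storage,
 * where each slice touches only the TOAST chunks it covers.
 * Build with: cc slices.c -lpq */
#include <stdio.h>
#include <libpq-fe.h>

int main(void)
{
    PGconn *conn = PQconnectdb("");
    if (PQstatus(conn) != CONNECTION_OK) {
        fprintf(stderr, "connect: %s", PQerrorMessage(conn));
        return 1;
    }

    const long slice = 1024 * 1024;       /* 1 MB of characters per round trip */
    long offset = 1;                      /* substring() positions are 1-based */
    long total = 0;

    for (;;) {
        char off[32], len[32];
        snprintf(off, sizeof off, "%ld", offset);
        snprintf(len, sizeof len, "%ld", slice);
        const char *params[2] = { off, len };

        PGresult *res = PQexecParams(conn,
            "SELECT substring(payload FROM $1::int FOR $2::int) "
            "FROM mytable WHERE id = 1",
            2, NULL, params, NULL, NULL, 0);

        if (PQresultStatus(res) != PGRES_TUPLES_OK) {
            fprintf(stderr, "query: %s", PQerrorMessage(conn));
            PQclear(res);
            break;
        }

        int got = PQgetlength(res, 0, 0); /* bytes in this slice */
        /* ... process PQgetvalue(res, 0, 0) here, one slice at a time ... */
        total += got;
        PQclear(res);

        if (got < slice)                  /* short or empty slice: past the end */
            break;
        offset += slice;
    }

    printf("read %ld bytes in slices\n", total);
    PQfinish(conn);
    return 0;
}
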


On Thu, Oct 19, 2017 at 2:03 PM, Cory Nemelka <cnemelka@gmail.com> wrote:
I have been getting very poor performance using libpq to process very large TEXT columns (300MB+). I suspect it is I/O related but can't be sure.

Has anyone had experience with the same issue who could help me resolve it?

--cnemelka
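
To test the I/O suspicion, one possible sketch is to ask the server itself: EXPLAIN (ANALYZE, BUFFERS) reports how many buffers were hit in cache versus read from disk, and wrapping the column in md5() forces the whole value to be read and detoasted on the server so that TOAST I/O shows up in those numbers. Table and column names are placeholders.

/* Sketch: server-side timing and buffer counts for reading one large value.
 * Placeholder names: "mytable", "payload". Build with: cc explain.c -lpq */
#include <stdio.h>
#include <libpq-fe.h>

int main(void)
{
    PGconn *conn = PQconnectdb("");
    if (PQstatus(conn) != CONNECTION_OK) {
        fprintf(stderr, "connect: %s", PQerrorMessage(conn));
        return 1;
    }

    /* md5() makes the server read and detoast the entire value. */
    PGresult *res = PQexec(conn,
        "EXPLAIN (ANALYZE, BUFFERS) SELECT md5(payload) FROM mytable WHERE id = 1");

    if (PQresultStatus(res) == PGRES_TUPLES_OK) {
        for (int i = 0; i < PQntuples(res); i++)
            printf("%s\n", PQgetvalue(res, i, 0));   /* one plan line per row */
    } else {
        fprintf(stderr, "explain: %s", PQerrorMessage(conn));
    }

    PQclear(res);
    PQfinish(conn);
    return 0;
}

If the reported server-side time is small, the bottleneck is in transfer or in the client-side processing rather than disk I/O.
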


