Re: Scanning a large binary field

From Kynn Jones
Subject Re: Scanning a large binary field
Date
Msg-id c2350ba40903151420v7a31368cxd2b8afefa46df7d3@mail.gmail.com
In reply to Re: Scanning a large binary field  (John R Pierce <pierce@hogranch.com>)
Responses Re: Scanning a large binary field  (John R Pierce <pierce@hogranch.com>)
List pgsql-general


On Sun, Mar 15, 2009 at 5:06 PM, John R Pierce <pierce@hogranch.com> wrote:
> Kynn Jones wrote:
>> I have a C program that reads a large binary file and uses the information it reads, plus some user-supplied arguments, to generate an in-memory data structure that is used during the remainder of the program's execution.  I would like to adapt this code so that it gets the original binary data from a Pg database rather than from a file.
>>
>> One very nice feature of the original scheme is that the file was read piecemeal, so its full content (about 0.2 GB) was never in memory all at once, which kept the program's memory footprint nice and small.
>>
>> Is there any way to replicate this small memory footprint if the program reads the binary data from a Pg DB instead of from a file?
>
> Is this binary data in any way record- or table-structured, such that it could be stored as multiple rows and perhaps fields?  If not, why would you want to put a 200 MB blob of amorphous data into a relational database?

That's a fair question.  The program in question already gets most of the external data it needs from the relational database; the only exception is these large amorphous blobs, as you describe them.  My only reason for wanting to put the blobs in the DB as well is to consolidate all of the program's external data sources.
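
For what it's worth, the kind of chunked read I'm hoping to reproduce would look roughly like the sketch below, assuming the blob were stored as a large object and pulled through libpq's lo_* calls.  This is only an illustration, not the actual program: the connection string, the example OID, and process_chunk() are placeholders.

    #include <stdio.h>
    #include <libpq-fe.h>
    #include <libpq/libpq-fs.h>     /* INV_READ */

    #define CHUNK_SIZE 65536

    int main(void)
    {
        /* Placeholder connection string, just for illustration. */
        PGconn *conn = PQconnectdb("dbname=mydb");
        if (PQstatus(conn) != CONNECTION_OK) {
            fprintf(stderr, "connection failed: %s", PQerrorMessage(conn));
            return 1;
        }

        Oid  blob_oid = 16385;      /* OID of the stored blob (example value) */
        char buf[CHUNK_SIZE];
        int  fd, nread;

        /* Large-object reads must happen inside a transaction. */
        PQclear(PQexec(conn, "BEGIN"));

        fd = lo_open(conn, blob_oid, INV_READ);
        if (fd < 0) {
            fprintf(stderr, "lo_open failed: %s", PQerrorMessage(conn));
            PQfinish(conn);
            return 1;
        }

        /* Pull the blob down CHUNK_SIZE bytes at a time, so only one
           chunk is ever in memory -- analogous to fread() on the file. */
        while ((nread = lo_read(conn, fd, buf, CHUNK_SIZE)) > 0) {
            /* process_chunk(buf, nread);  -- hand each chunk to the existing parser */
        }

        lo_close(conn, fd);
        PQclear(PQexec(conn, "COMMIT"));
        PQfinish(conn);
        return 0;
    }

(If the data went into a plain bytea column instead, a similar effect should be possible by looping over SELECT substring(datacol from <offset> for <length>), at the cost of one query round-trip per chunk.)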

Kynn
