Using BLOBs with PostgreSQL

From: Tim Kientzle
Subject: Using BLOBs with PostgreSQL
Date:
Msg-id: 39DFA932.31834C8D@acm.org
Responses: Re: Using BLOBs with PostgreSQL  ("Martin A. Marques" <martin@math.unl.edu.ar>)
List: pgsql-general
I'm evaluating a couple of different databases for use as
the back-end to a web-based publishing system that's currently
being developed in Java and Perl.

I want to keep _all_ of the data in the database, to
simplify future replication and data management.  That
includes such data as GIF images, large HTML files,
even multi-megabyte downloadable software archives.

I've been using MySQL for initial development; it has pretty
clean and easy-to-use BLOB support.  You just declare a BLOB
column type, then read and write arbitrarily large chunks of data.
In Perl, BLOB columns work just like varchar columns; in JDBC,
the getBinaryStream()/setBinaryStream() functions provide support
for streaming large data objects.
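For context, the JDBC streaming pattern described above looks roughly like this. This is a minimal sketch, not code from either database's documentation; the table name `files` and its columns are hypothetical:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

public class BlobSketch {

    // Insert a binary payload via setBinaryStream(): the driver reads the
    // given number of bytes from the stream, so embedded null bytes never
    // pass through the SQL parser as text.
    static void store(Connection conn, String name, byte[] data) throws Exception {
        try (PreparedStatement ps = conn.prepareStatement(
                "INSERT INTO files (name, content) VALUES (?, ?)")) {
            ps.setString(1, name);
            ps.setBinaryStream(2, new ByteArrayInputStream(data), data.length);
            ps.executeUpdate();
        }
    }

    // Fetch the payload back with getBinaryStream() and drain it.
    static byte[] load(Connection conn, String name) throws Exception {
        try (PreparedStatement ps = conn.prepareStatement(
                "SELECT content FROM files WHERE name = ?")) {
            ps.setString(1, name);
            try (ResultSet rs = ps.executeQuery()) {
                rs.next();
                return drain(rs.getBinaryStream(1));
            }
        }
    }

    // Copy an InputStream into a byte array in fixed-size chunks, so a
    // multi-megabyte object never has to live in one String.
    static byte[] drain(InputStream in) throws IOException {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        byte[] buf = new byte[8192];
        int n;
        while ((n = in.read(buf)) != -1) {
            out.write(buf, 0, n);
        }
        return out.toByteArray();
    }
}
```

The point of the streaming calls is that neither the application nor the driver needs to hold or escape the whole object as a string, which is exactly what breaks when a parser chokes on null characters.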

How well-supported is this functionality in PostgreSQL?
I did some early experimenting with PG, but couldn't
find any column type that would accept binary data
(apparently PG's parser chokes on null characters?).

I've heard about TOAST, but have no idea what it really
is, how to use it, or how well it performs.  I'm leery
of database-specific APIs.

            - Tim Kientzle
