Discussion: JDBC + large objects problem


JDBC + large objects problem

From:
Patrick Goodwill
Date:
to all:

I'm trying to use JDBC and BLOBS to store large amounts of text in a
database.  I get a strange error when I try to use the conventional JDBC
interface... it comes out with an SQLException of:

"InputStream as parameter not supported"

for the code:

        Connection conn = pool.getConnection();
        PreparedStatement pstmt = conn.prepareStatement(
                "INSERT INTO t" + book
                + "_data (author_id, title, text, type) VALUES (?, ?, ?, ?)");
        pstmt.setInt(1, userId);
        pstmt.setString(2, title);
        InputStream textStream = stringToStream(text);
        pstmt.setBinaryStream(3, textStream, text.length());
        pstmt.setInt(4, type);
        pstmt.executeUpdate();
        pstmt.close();


... with some helper functions....

    private InputStream stringToStream(String string) {
        byte[] bytes = string.getBytes();
        return new ByteArrayInputStream(bytes);
    }

    private String streamToString(InputStream stream) {
        try {
            // available() only reports bytes readable without blocking; for a
            // ByteArrayInputStream this happens to equal the full length.
            int length = stream.available();
            byte[] bytes = new byte[length];
            stream.read(bytes);
            return new String(bytes);
        } catch (IOException e) {
            System.out.println("No Stream");
        }
        return null;
    }

with an abbreviated schema of....

>> \d t1_data
                               Table "t1_data"
 Attribute |  Type   |                       Modifier
-----------+---------+-------------------------------------------------------
 data_id   | integer | not null default nextval('t1_data_data_id_seq'::text)
 author_id | integer |
 title     | text    |
 text      | oid     |
 type      | integer |
 time      | time    |
Index: t1_data_pkey


.... using PostgreSQL 7.0 and the newest JDBC driver from retep.org.uk


any ideas?

-Patrick.


Re: JDBC + large objects problem

From:
"Peter Mount"
Date:
Please read the archives. I've lost track of how many times I've recently
answered this question. The error message states that streams are not
supported, but there are other conventional JDBC methods to access BLOBs.
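
(For the archive, a minimal sketch of what those other methods can look like,
using only the standard PreparedStatement.setBytes() and ResultSet.getBytes()
calls against the t1_data schema quoted above. The class and method names are
purely illustrative, and whether the 7.0-era driver maps a byte[] onto the oid
column this way is an assumption to verify rather than something confirmed in
this thread. Note also that PostgreSQL large objects generally have to be
accessed inside a transaction, i.e. with autocommit switched off.)

    import java.sql.Connection;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;
    import java.sql.SQLException;

    public class LargeObjectSketch {

        // Store the text by passing a byte[] instead of an InputStream.
        // Assumes the caller has already run conn.setAutoCommit(false).
        public void insertText(Connection conn, int userId, String title,
                               String text, int type) throws SQLException {
            PreparedStatement pstmt = conn.prepareStatement(
                    "INSERT INTO t1_data (author_id, title, text, type)"
                    + " VALUES (?, ?, ?, ?)");
            pstmt.setInt(1, userId);
            pstmt.setString(2, title);
            pstmt.setBytes(3, text.getBytes());   // byte[] rather than a stream
            pstmt.setInt(4, type);
            pstmt.executeUpdate();
            pstmt.close();
        }

        // Read the stored text back as bytes and rebuild the String.
        public String readText(Connection conn, int dataId) throws SQLException {
            PreparedStatement pstmt = conn.prepareStatement(
                    "SELECT text FROM t1_data WHERE data_id = ?");
            pstmt.setInt(1, dataId);
            ResultSet rs = pstmt.executeQuery();
            String result = null;
            if (rs.next()) {
                byte[] bytes = rs.getBytes(1);
                if (bytes != null) {
                    result = new String(bytes);
                }
            }
            rs.close();
            pstmt.close();
            return result;
        }
    }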

I'm spending more time writing the same answers than implementing
features... ;-(

Peter

--
Peter T Mount  peter@retep.org.uk, me@petermount.com
Homepage: http://www.retep.org.uk Contact details: http://petermount.com
PostgreSQL JDBC: http://www.retep.org.uk/postgres/
Java PDF Generator: http://www.retep.org.uk/pdf/
----- Original Message -----
From: "Patrick Goodwill" <goodwill@cheese.stanford.edu>
To: <pgsql-general@postgresql.org>
Sent: Friday, August 25, 2000 2:35 PM
Subject: [GENERAL] JDBC + large objects problem

