Thread: Concurrent read and write access of LargeObject via getBlob() can raise exception


Concurrent read and write access of LargeObject via getBlob() can raise exception

From: Balázs Zsoldos
Date:
Hi,

I created a table with the following fields:
  • blob_id: bigint / primary key, auto increment
  • blob: oid / a pointer to a large object
I created a trigger that unlinks the largeobject if a record is deleted from this table.
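The setup described above could look roughly like this in JDBC. The table name, trigger name, and connection details are illustrative assumptions, not the poster's actual code:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

public class BlobSchemaSetup {
    public static void main(String[] args) throws Exception {
        // Placeholder connection details.
        try (Connection conn = DriverManager.getConnection(
                "jdbc:postgresql://localhost:5432/mydb", "user", "password");
             Statement st = conn.createStatement()) {
            // blob_id: bigint primary key with auto increment; blob: oid.
            st.execute("CREATE TABLE blob_table ("
                    + " blob_id bigserial PRIMARY KEY,"
                    + " blob oid)");
            // Trigger function: unlink the large object when its row is deleted.
            st.execute("CREATE FUNCTION unlink_blob() RETURNS trigger AS $$"
                    + " BEGIN PERFORM lo_unlink(OLD.blob); RETURN OLD; END;"
                    + " $$ LANGUAGE plpgsql");
            st.execute("CREATE TRIGGER blob_table_unlink AFTER DELETE ON blob_table"
                    + " FOR EACH ROW EXECUTE PROCEDURE unlink_blob()");
        }
    }
}
```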

If I
  • select a record from my table and get the ResultSet instance
  • in parallel, I delete the blob within another transaction
  • call resultSet.getBlob(1).getBinaryStream();
I get the following exception:

Caused by: org.postgresql.util.PSQLException: ERROR: large object 97664 does not exist
        at org.postgresql.core.v3.QueryExecutorImpl.receiveErrorResponse(QueryExecutorImpl.java:2270)
        at org.postgresql.core.v3.QueryExecutorImpl.receiveFastpathResult(QueryExecutorImpl.java:672)
        at org.postgresql.core.v3.QueryExecutorImpl.fastpathCall(QueryExecutorImpl.java:501)
        at org.postgresql.fastpath.Fastpath.fastpath(Fastpath.java:109)
        at org.postgresql.fastpath.Fastpath.fastpath(Fastpath.java:156)
        at org.postgresql.fastpath.Fastpath.getInteger(Fastpath.java:168)
        at org.postgresql.largeobject.LargeObject.<init>(LargeObject.java:106)
        at org.postgresql.largeobject.LargeObject.<init>(LargeObject.java:123)
        at org.postgresql.largeobject.LargeObject.copy(LargeObject.java:128)
        at org.postgresql.jdbc4.AbstractJdbc4Blob.getBinaryStream(AbstractJdbc4Blob.java:26)
        at org.everit.blobstore.jdbc.internal.StreamBlobChannel.read(StreamBlobChannel.java:97)
        at org.everit.blobstore.jdbc.internal.JdbcBlobReader.read(JdbcBlobReader.java:128)

To me this means that I cannot be sure that, between selecting a record from my table and reading the actual blob content, the content is still the same as when I selected the record.

I guess I can only use the table safely if I select the record with FOR SHARE.
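A sketch of that FOR SHARE approach, again with an assumed `blob_table` and placeholder connection details. The row-level share lock makes a concurrent DELETE (and thus the unlink trigger) wait until this transaction commits:

```java
import java.io.InputStream;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

public class ForShareBlobRead {
    public static void main(String[] args) throws Exception {
        try (Connection conn = DriverManager.getConnection(
                "jdbc:postgresql://localhost:5432/mydb", "user", "password")) {
            // Large objects must be accessed inside a transaction.
            conn.setAutoCommit(false);
            // FOR SHARE locks the row; a concurrent DELETE blocks until commit.
            try (PreparedStatement ps = conn.prepareStatement(
                    "SELECT blob FROM blob_table WHERE blob_id = ? FOR SHARE")) {
                ps.setLong(1, 42L);
                try (ResultSet rs = ps.executeQuery()) {
                    if (rs.next()) {
                        try (InputStream in = rs.getBlob(1).getBinaryStream()) {
                            // consume the stream ...
                        }
                    }
                }
            }
            conn.commit(); // releases the row lock
        }
    }
}
```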

Regards,
Balázs Zsoldos

Re: Concurrent read and write access of LargeObject via getBlob() can raise exception

From: Albe Laurenz
Date:
Balázs Zsoldos wrote:
> I created a table with the following fields:
> 
> *    blob_id: bigint / primary key, auto increment
> *    blob: oid / a pointer to a large object
> 
> I created a trigger that unlinks the largeobject if a record is deleted from this table.
> 
> If I
> 
> *    select a record from my table and get the ResultSet instance
> *    in parallel, I delete the blob within another transaction
> *    call resultSet.getBlob(1).getBinaryStream();
> 
> I get the following exception:
> 
> Caused by: org.postgresql.util.PSQLException: ERROR: large object 97664 does not exist
> 
> To me this means that I cannot be sure that, between selecting a record from my table and
> reading the actual blob content, the content is still the same as when I selected the record.
> 
> I guess I can only use the table safely if I select the record with FOR SHARE.

Another, maybe better, option would be to start a transaction with isolation level
REPEATABLE READ and do your SELECT within that transaction.

That way you would not block other sessions, but the large object would still be visible
to you even if a concurrent transaction deleted it.
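A minimal JDBC sketch of that approach (the table name and connection details are assumptions carried over from the original post):

```java
import java.io.InputStream;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

public class RepeatableReadBlobRead {
    public static void main(String[] args) throws Exception {
        try (Connection conn = DriverManager.getConnection(
                "jdbc:postgresql://localhost:5432/mydb", "user", "password")) {
            // Under REPEATABLE READ, the snapshot taken by the first query
            // stays valid for the whole transaction, so the large object
            // remains readable even if another transaction deletes it.
            conn.setTransactionIsolation(Connection.TRANSACTION_REPEATABLE_READ);
            conn.setAutoCommit(false); // large objects require a transaction
            try (PreparedStatement ps = conn.prepareStatement(
                    "SELECT blob FROM blob_table WHERE blob_id = ?")) {
                ps.setLong(1, 42L);
                try (ResultSet rs = ps.executeQuery()) {
                    if (rs.next()) {
                        try (InputStream in = rs.getBlob(1).getBinaryStream()) {
                            // consume the stream ...
                        }
                    }
                }
            }
            conn.commit();
        }
    }
}
```

Unlike the FOR SHARE variant, this does not block the deleting transaction; it only guarantees that this reader sees a consistent snapshot.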

Yours,
Laurenz Albe