Re: Problem retrieving large records (bytea) data from a table

From: pasman pasmański
Subject: Re: Problem retrieving large records (bytea) data from a table
Date:
Msg-id: CAOWY8=ZMJO_GQjk-+MsqKecD-e_XBnXHmwaDDboF+4Yd+3JSTA@mail.gmail.com
In response to: Problem retrieving large records (bytea) data from a table  (jtkells@verizon.net)
Responses: Re: Problem retrieving large records (bytea) data from a table  (Achilleas Mantzios <achill@matrix.gatewaynet.com>)
List: pgsql-admin
You could take a backup of this table, then search for the oversized
documents with UltraEdit and remove them.
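
To locate the oversized rows first, something along these lines might
help (just a sketch, using the doc_table/doc_data names from the report
quoted below; the 100 MB cutoff is an arbitrary assumption):

    -- report rows whose stored document exceeds ~100 MB
    SELECT id, file_n, octet_length(doc_data) AS doc_bytes
    FROM doc_table
    WHERE octet_length(doc_data) > 100 * 1024 * 1024
    ORDER BY doc_bytes DESC;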

2011/7/5, jtkells@verizon.net <jtkells@verizon.net>:
> I am having a hang condition every time I try to retrieve large
> (bytea) records from a table.
> The OS is Solaris 5.11 (snv_134, i86pc) with 4 GB of memory, running
> PostgreSQL 8.4.3 with a standard postgresql.conf file (nothing has
> been changed).
> I have the following table called doc_table
>     Column    |              Type              |                              Modifiers                               | Storage  | Description
> --------------+--------------------------------+-----------------------------------------------------------------------+----------+-------------
>  id           | numeric                        | not null                                                              | main     |
>  file_n       | character varying(4000)        |                                                                       | extended |
>  create_date  | timestamp(6) without time zone | not null default (clock_timestamp())::timestamp(0) without time zone | plain    |
>  desc         | character varying(4000)        |                                                                       | extended |
>  doc_cc       | character varying(120)         | not null                                                              | extended |
>  by           | numeric                        | not null                                                              | main     |
>  doc_data     | bytea                          |                                                                       | extended |
>  mime_type_id | character varying(16)          | not null                                                              | extended |
>  doc_src      | text                           |                                                                       | extended |
>  doc_stat     | character varying(512)         | not null default 'ACTIVE'::character varying                          | extended |
> Indexes:
>    "documents_pk" PRIMARY KEY, btree (document_id)
>
>
> A while ago some developers inserted several records with documents
> (stored in doc_data) that were around 400 - 450 MB each. Now when you
> do a select * (all) from this table you get a hang and the system
> becomes unresponsive. Prior to these inserts, a select * (all, no
> where clause) worked. I'm also told a select * from doc_table where
> id = xxx still works. I haven't seen any error messages in the
> postgresql log files.
> So I'm not sure how to find these bad records or why I am getting a
> hang. Since this postgresql is running with the default config files,
> could I be running out of a resource? If so, I'm not sure how, or by
> how much, to increase these resources to fix the problem, since I
> have very little memory on this system. Does anyone have any ideas
> why I am getting a hang? Thanks
>
> --
> Sent via pgsql-admin mailing list (pgsql-admin@postgresql.org)
> To make changes to your subscription:
> http://www.postgresql.org/mailpref/pgsql-admin
>
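
Until the oversized documents are deleted, another option (again only a
sketch, assuming the column names above; the id value is hypothetical)
is to leave doc_data out of the select list, or pull it down in slices
with substring() instead of one huge row:

    -- list rows without sending the document bodies to the client
    SELECT id, file_n, mime_type_id, octet_length(doc_data) AS doc_bytes
    FROM doc_table;

    -- fetch one document in ~10 MB slices rather than a single large result
    SELECT substring(doc_data FROM 1 FOR 10 * 1024 * 1024)
    FROM doc_table
    WHERE id = 123;  -- hypothetical id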


--
------------
pasman

In the pgsql-admin list, by date:

Previous message
From: Cédric Villemain
Date:
Subject: Re: 9.0.4 Data corruption issue

Next message
From: Thomas Kellerer
Date:
Subject: Re: Problem retrieving large records (bytea) data from a table