Re: Trading off large objects (arrays, large strings, large tables) for timeseries

From: Antonios Christofides
Subject: Re: Trading off large objects (arrays, large strings, large tables) for timeseries
Date:
Msg-id: 20050216090400.GA3131@itia.ntua.gr
In reply to: Re: Trading off large objects (arrays, large strings, large tables) for timeseries  (Tom Lane <tgl@sss.pgh.pa.us>)
Responses: Re: Trading off large objects (arrays, large strings, large tables) for timeseries
Lists: pgsql-general
Tom Lane wrote:
> Antonios Christofides <anthony@itia.ntua.gr> writes:
> >     Why 25 seconds for appending an element?
>
> Would you give us a specific test case, rather than a vague description
> of what you're doing?

OK, sorry, here it is (run on another machine, so the times are
different: 8.0.1 on a PIV 1.6 GHz, 512 MB RAM, Debian woody, kernel 2.4.18):

CREATE TABLE test(id integer not null primary key, records text[]);

INSERT INTO test(id, records) VALUES (1,
'{"1993-09-30 13:20,182,",
"1993-09-30 13:30,208,",
"1993-09-30 13:51,203,",
[snipping around 2 million rows]
"2057-02-13 02:31,155,",
"2099-12-08 10:39,198,"}');

[Took 60 seconds]
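For reference, a literal of that shape can be generated by a small script along these lines (a sketch: the element count, start time, step, and values below are made up for illustration; only the `"timestamp,value,"` element format comes from the statement above):

```python
# Sketch: build an INSERT like the one above for n synthetic records.
# Each array element is a quoted "timestamp,value," string, matching
# the format of the (snipped) two-million-element literal.
from datetime import datetime, timedelta

def make_insert(n, start=datetime(1993, 9, 30, 13, 20), step_minutes=10):
    elems = []
    t = start
    for i in range(n):
        # Hypothetical values; the real data is not reproduced here.
        elems.append('"%s,%d,"' % (t.strftime("%Y-%m-%d %H:%M"), 180 + i % 30))
        t += timedelta(minutes=step_minutes)
    return "INSERT INTO test(id, records) VALUES (1, '{%s}');" % ",".join(elems)

sql = make_insert(3)
print(sql)
```

Feeding the output of `make_insert(2000006)` to psql should reproduce a load of roughly the size discussed here.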

SELECT array_dims(records) FROM test;
 array_dims
-------------
 [1:2000006]
(1 row)

UPDATE test SET records[2000007] = 'hello, world!';

[11 seconds]

UPDATE test SET records[1000000] = 'hello, world!';

[15 seconds (but the difference may be because of system load - I
don't have a completely idle machine available right now)]

I thought the two above UPDATE commands would be instant.
