performance

From: Tom Allison
Subject: performance
Date:
Msg-id: 43F3D0E2.3010200@tacocat.net
List: pgsql-novice
Probably not a rare topic...

I made a little application last night to run a test (a rough sketch of the harness follows below):

Given a table with a few fields (3):
Insert at least 10 million rows; not a problem.
Randomly update rows, one at a time, as fast as a Perl script will allow, for as many
changes as I can manage, up to a total of 1 billion transactions.
Each update is committed immediately.
Disk is a single EIDE 7400 RPM hard drive with EXT3 journaling.
I have 1GB RAM on some kind of 32-bit AMD CPU (2.? GHz)
PostgreSQL version 7.4 (can't upgrade yet...)
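
For reference, a minimal sketch of what such a harness might look like, assuming
DBD::Pg and a hypothetical three-column table named "stress" (the actual schema
wasn't posted):

#!/usr/bin/perl
# Rough sketch of the test harness described above. Table name, column
# names, and types are guesses; the original schema was not posted.
use strict;
use warnings;
use DBI;

# AutoCommit => 1 reproduces the one-commit-per-update pattern.
my $dbh = DBI->connect('dbi:Pg:dbname=test', 'postgres', '',
                       { AutoCommit => 1, RaiseError => 1 });

my $rows = 10_000_000;    # minimum row count from the test description

$dbh->do('CREATE TABLE stress (id integer PRIMARY KEY, val integer, note text)');

# Batch the initial load in a single transaction so it finishes quickly.
$dbh->begin_work;
my $ins = $dbh->prepare('INSERT INTO stress (id, val, note) VALUES (?, ?, ?)');
$ins->execute($_, int(rand 1000), "row $_") for 1 .. $rows;
$dbh->commit;

# Random single-row updates, each one its own transaction, up to the
# 1 billion ceiling mentioned above.
my $upd = $dbh->prepare('UPDATE stress SET val = ? WHERE id = ?');
for (1 .. 1_000_000_000) {
    $upd->execute(int(rand 1000), 1 + int(rand $rows));
}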

I got through about 1.25 million transactions before the I/O was just clobbered,
so much so that I couldn't log in to the machine to kill the client job.

I have no intention of turning this machine into a hard-core server, yet.
But can someone identify what might be considered the top 5 things to do to keep
the system from falling over? I'm willing to take some performance hits to keep
the system running, but considering how many parameters there are to fiddle with,
I'm reluctant to just start twirling knobs without knowing what I'm doing.
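
Not an answer from this thread, but for context, these are the postgresql.conf
knobs people usually reach for first on a 7.4 box with a write-heavy load; the
values are illustrative starting points for roughly 1GB of RAM, not
recommendations:

# 7.4 counts shared_buffers in 8 kB buffers, so 8192 is about 64 MB.
shared_buffers = 8192
# More WAL segments between checkpoints spreads out checkpoint I/O spikes.
checkpoint_segments = 8
wal_buffers = 16
# A hint about the OS page cache, in 8 kB pages; 65536 is about 512 MB.
effective_cache_size = 65536

One thing that bites this particular workload: 7.4 has no built-in autovacuum,
so millions of single-row updates pile up dead tuples until the table bloats,
which may well be part of what clobbered the I/O here. Scheduling regular
VACUUM runs (or the contrib pg_autovacuum daemon) is usually worth sorting out
before touching the settings above.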
