Performance issues when the number of records is around 10 Million

From: venu madhav
Subject: Performance issues when the number of records is around 10 Million
Date:
Msg-id: c7bd39d5-6f71-449b-a97b-2b8d7f9a7171@g5g2000pre.googlegroups.com
Responses: Re: Performance issues when the number of records are around 10 Million  (Sergey Konoplev <gray.ru@gmail.com>)
           Re: Performance issues when the number of records are around 10 Million  ("A. Kretschmer" <andreas.kretschmer@schollglas.com>)
List: pgsql-general
Hi all,
       In my database application, I have a table that can grow to around
10 million records, with insertions arriving at up to 100 per second at
peak times. I have configured Postgres to do auto vacuum on an hourly
basis. I have a CGI-based frontend GUI application that displays the data
from the database.
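
As a sanity check on the hourly vacuuming (not part of the original post, and assuming a PostgreSQL version where pg_stat_user_tables reports per-table row counts), one could compare live and dead tuples and the last vacuum times for the table in question:

-- Hedged check: "event" is the table from the query further below.
SELECT relname, n_live_tup, n_dead_tup, last_vacuum, last_autovacuum
FROM pg_stat_user_tables
WHERE relname = 'event';
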
        When I try to get the last twenty records from the database, the
operation takes around 10-15 minutes. This is the query that is used:

SELECT e.cid, timestamp, s.sig_class, s.sig_priority, s.sig_name,
       e.sniff_ip, e.sniff_channel, s.sig_config, e.wifi_addr_1,
       e.wifi_addr_2, e.view_status, bssid
FROM event e, signature s
WHERE s.sig_id = e.signature
  AND e.timestamp >= '1270449180'
  AND e.timestamp < '1273473180'
ORDER BY e.cid DESC, e.cid DESC
LIMIT 21 OFFSET 10539780;
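
An OFFSET of 10539780 forces the server to produce and discard over ten million rows before returning the final 21, which is a likely contributor to the long run time. For comparison only (not part of the original post), a keyset-style rewrite of the same query might look like the sketch below. It assumes e.cid is the paging key, that the lowest cid from the previously displayed page is available as the placeholder :last_cid, that an index on event (cid) exists, and that the unqualified timestamp and bssid columns belong to the event table:

SELECT e.cid, e.timestamp, s.sig_class, s.sig_priority, s.sig_name,
       e.sniff_ip, e.sniff_channel, s.sig_config, e.wifi_addr_1,
       e.wifi_addr_2, e.view_status, e.bssid
FROM event e
JOIN signature s ON s.sig_id = e.signature
WHERE e.timestamp >= '1270449180'
  AND e.timestamp < '1273473180'
  AND e.cid < :last_cid   -- seek predicate in place of OFFSET 10539780
ORDER BY e.cid DESC
LIMIT 21;

Running EXPLAIN ANALYZE on both the original and the rewritten query would show where the 10-15 minutes are actually spent.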

Can anyone suggest a better solution to improve the performance?
Please let me know if you have any further queries.


Thank you,
Venu
