Optimizing tuning and table design for large analytics DB
| From | Rob W |
|---|---|
| Subject | Optimizing tuning and table design for large analytics DB |
| Date | |
| Msg-id | 952321.37402.qm@web34607.mail.mud.yahoo.com |
| Replies | Re: Optimizing tuning and table design for large analytics DB |
| List | pgsql-general |
Can anyone point me towards good articles or books that would help a PostgreSQL novice (i.e. me) learn the optimal approaches to setting up a DB for analytics? In this particular case, I need to efficiently analyze approximately 300 million system log events (i.e. time-series data). It's log data, so it's only appended to the table, not inserted, and is never modified. Only 90 days' worth of data will be retained, so old records need to be deleted periodically. Query performance will only be important for small subsets of the data (e.g. when analyzing a week or day's worth of data); the rest of the reports will be run in batch mode. There will likely only be one user at a time doing ad-hoc queries.

This is a follow-up to the earlier suggestions that PostgreSQL will handle the volumes of data I plan to work with, so I figured I'd give it a shot.

Rob
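[Editor's note: the workload described above (append-only time-series data, a rolling 90-day retention window, ad-hoc queries over small time ranges) is commonly handled in PostgreSQL with time-based partitioning, where dropping an old partition replaces bulk DELETEs. A minimal sketch using declarative range partitioning (available in PostgreSQL 10+; table and column names here are hypothetical):]

```sql
-- Hypothetical log-events table, range-partitioned by day.
CREATE TABLE events (
    logged_at  timestamptz NOT NULL,
    source     text,
    message    text
) PARTITION BY RANGE (logged_at);

-- One partition per day, created ahead of the data arriving.
CREATE TABLE events_2024_01_01 PARTITION OF events
    FOR VALUES FROM ('2024-01-01') TO ('2024-01-02');

-- An index on the timestamp supports the day/week ad-hoc queries.
CREATE INDEX ON events_2024_01_01 (logged_at);

-- Enforcing the 90-day window: dropping a whole partition is a fast
-- metadata operation, unlike DELETE plus VACUUM on a 300M-row table.
DROP TABLE events_2024_01_01;
```

[Queries constrained to a time range touch only the matching partitions (partition pruning), which keeps small-range analysis fast even as the full table grows.]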