Two possibilities I might try:
1) a view:
create view foo as select * from x order by y limit WHATEVER;
Then, periodically do a physical cleaning of the table.
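A minimal sketch of this approach, assuming a hypothetical table x with a timestamp column y and a 1000-row cap (all names and the cap are made up for illustration):

```sql
-- Hypothetical table: rows arrive with a timestamp.
CREATE TABLE x (
    id      serial PRIMARY KEY,
    y       timestamp NOT NULL DEFAULT now(),
    payload text
);

-- The view exposes only the newest 1000 rows; the base table
-- keeps growing until it is physically cleaned.
CREATE VIEW foo AS
    SELECT * FROM x ORDER BY y DESC LIMIT 1000;

-- Periodic physical cleaning (e.g. run from cron): delete
-- everything older than the newest 1000 rows, then reclaim space.
DELETE FROM x
 WHERE id NOT IN (SELECT id FROM x ORDER BY y DESC LIMIT 1000);
VACUUM x;
```

The upside is that inserts stay cheap; the downside is that the table can be well over its nominal size between cleanings.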
2) Fake records
Have an extra boolean field called "fake", plus a date_inserted field.
Initialize the table with however many fake rows you need, and create a
view which filters out the fakes. Then, for every insert, simply remove
the oldest remaining record, keeping the row count constant.
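A sketch of the fake-records idea, with an assumed 1000-row capacity (table and view names are invented; generate_series is used only for brevity when pre-filling):

```sql
-- Hypothetical ring-buffer table, pre-filled with "fake" rows.
CREATE TABLE events (
    id            serial PRIMARY KEY,
    date_inserted timestamp NOT NULL DEFAULT now(),
    fake          boolean   NOT NULL DEFAULT false,
    payload       text
);

-- Pre-fill with as many fakes as the table should ever hold.
INSERT INTO events (fake)
    SELECT true FROM generate_series(1, 1000);

-- The view hides the fakes.
CREATE VIEW live_events AS
    SELECT * FROM events WHERE NOT fake;

-- Pair each insert with a delete of the oldest row, so the
-- total row count never changes.
BEGIN;
INSERT INTO events (payload) VALUES ('new entry');
DELETE FROM events
 WHERE id = (SELECT id FROM events
              ORDER BY date_inserted, id LIMIT 1);
COMMIT;
```

Because the table never grows, the planner's statistics and the on-disk size stay stable, at the cost of one extra DELETE per INSERT.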
Jon
On Wed, 16 Jul 2003, Viorel Dragomir wrote:
>
> ----- Original Message -----
> From: "Nigel J. Andrews" <nandrews@investsystems.co.uk>
> To: "Kirill Ponazdyr" <softlist@codeangels.com>
> Cc: "pg_general" <pgsql-general@postgresql.org>
> Sent: Wednesday, July 16, 2003 7:06 PM
> Subject: Re: [GENERAL] Postgresql "FIFO" Tables, How-To ?
>
>
> > On Wed, 16 Jul 2003, Kirill Ponazdyr wrote:
> >
> > > Hello,
> > >
> > > We are currently working on a project where we need to limit number of
> > > records in a table to a certain number. As soon as the number has been
> > > reached, for each new row the oldest row should be deleted (Kinda FIFO),
> > > thus keeping a total number of rows at predefined number.
> > >
> > > The actual limits would be anywhere from 250k to 10mil rows per table.
> > >
> > > It would be great if this could be achieved by the RDBMS engine itself;
> > > does Postgres support this kind of table? And if not, what would be the
> > > most elegant solution to achieve our goal, in your opinion?
> > >
> >
> > An after insert trigger springs to mind.
> >
>
> I see that the tables are quite big, and I think a procedure launched by
> cron at certain times to truncate the tables is a better solution.
> If the server performs well with the trigger, then create the trigger;
> otherwise...
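The after-insert trigger suggested above could be sketched in PL/pgSQL roughly like this (table name, function name, and the 1000-row cap are all assumptions, not anything from the thread):

```sql
CREATE TABLE fifo (
    id      serial PRIMARY KEY,
    ts      timestamp NOT NULL DEFAULT now(),
    payload text
);

-- After each insert, delete the oldest rows beyond the cap.
CREATE FUNCTION trim_fifo() RETURNS trigger AS $$
BEGIN
    DELETE FROM fifo
     WHERE id IN (SELECT id FROM fifo
                   ORDER BY id
                   LIMIT greatest(
                       (SELECT count(*) FROM fifo) - 1000, 0));
    RETURN NULL;
END;
$$ LANGUAGE plpgsql;

CREATE TRIGGER fifo_trim AFTER INSERT ON fifo
    FOR EACH ROW EXECUTE PROCEDURE trim_fifo();
```

Note that the count(*) on every insert is expensive at the 250k-10M row scale mentioned earlier, which is exactly why a cron-driven cleanup may be preferable if insert volume is high.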
>
>
> ---------------------------(end of broadcast)---------------------------
> TIP 8: explain analyze is your friend
>