Hi fellow Postgressers
I have a relatively small database: the data itself is about 1.5 GB. After I made some changes to indexed values over the weekend, I decided it was time to run a vacuum, since the database had grown to 10 GB on disk, which was just unreasonable.
At 7:13 this otherwise fine morning I started the vacuum, hoping for a run time of maybe two hours at most. Well, it is 2:20 PM now and it's still ticking. So far it has created 1411 files in $PGDATA/pg_xlog (which, since last night, resides on a different drive from the database). Each file is 16 MB in size, and if this continues for another three hours or so, I'm going to run out of disk space.
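Just to put a number on it, here is the back-of-envelope arithmetic for the WAL volume those segment files represent (the 1411 count and 16 MB segment size are the figures from above):

```shell
# WAL written so far: 1411 pg_xlog segment files at 16 MB each
files=1411
segment_mb=16
total_mb=$((files * segment_mb))
echo "${total_mb} MB (~$((total_mb / 1024)) GB) of WAL so far"
```

That is over 22 GB of transaction log for a vacuum of a 1.5 GB dataset.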
If I had instead done a
dump all data
drop db
create new db
restore schema (create tables/indexes/sequences)
load all data
I would have been done in about 2 hours.
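For what it's worth, that cycle could be scripted roughly as below. The database name "mydb" and the custom-format pg_dump flags are placeholders of mine, not anything from the actual setup; DRYRUN=echo just prints the commands so you can review them before running for real.

```shell
# Sketch of the dump/drop/create/restore cycle described above.
# "mydb" is a placeholder database name; adjust to taste.
DB=mydb
DRYRUN=echo   # unset (DRYRUN=) to actually execute the commands

$DRYRUN pg_dump -Fc -f "$DB.dump" "$DB"   # dump schema + data
$DRYRUN dropdb "$DB"                      # drop the bloated db
$DRYRUN createdb "$DB"                    # create a fresh one
$DRYRUN pg_restore -d "$DB" "$DB.dump"    # restore schema and load data
```

A custom-format dump (-Fc) carries the schema, data, and index definitions in one file, so the separate "restore schema" and "load data" steps collapse into the single pg_restore.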
Can somebody explain to me what this vacuum is doing with all these files? I suppose this transaction-logging stuff needs some serious looking into/rewriting.
Is there a way to switch this stuff off altogether?
Best regards,
Chris