Hello,
I have a really weird problem: queries against a very large table are failing with a strange error. Case in point:
dqfull=# vacuum freeze mytable;
ERROR: could not access status of transaction 538989714
DETAIL: could not open file "/srv/db/postgresql/pg_clog/0202": No such file or directory
WTF? The only activity this table has seen is a bulk import of ~40M rows. Is there a way to repair the clog info and make Postgres think all transactions on the table have committed? (Note: I only tried the VACUUM FREEZE after other commands were already failing.) What's a good strategy for fixing this table? I'd prefer not to reload it, since that would take over a day.
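One idea I'm tempted to try, and would appreciate a sanity check on: recreate the missing segment as all zero bytes so the status lookups stop erroring out. My understanding (please correct me if this is wrong or dangerous) is that each clog segment is 32 pages of 8 kB, i.e. 256 kB, and that zeroed status bits just read as "in progress". Something like:

```shell
# Sketch of the zero-fill workaround -- NOT tested here.
# "0202" is the segment name from the error message; the path is from my
# install. I'd stop the server and back up pg_clog before touching anything.
dd if=/dev/zero of=/srv/db/postgresql/pg_clog/0202 bs=256k count=1
```

Would a VACUUM FREEZE after that clean things up, or am I just trading one kind of corruption for another?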
Logan Bowers