I tried using a normal table to store the stats information, but several acrobatic hacks were needed to keep performance acceptable.
I guess it is not really necessary to synchronize the stats into a physical table immediately. I would suggest keeping all the data in memory and having a job that periodically dumps snapshots into physical tables (with WAL, etc.). That way, one could compute the database workload as the difference between two given snapshots. Taking snapshots every 15 minutes does not look like a performance killer to me, and losing the last 15 minutes of statistics in the event of a database crash does not seem too bad either.
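A minimal sketch of the idea, with hypothetical names (`StatsCollector`, `bump`, `workload`) chosen for illustration: counters are updated in memory on the hot path, a periodic job copies them out as a snapshot (standing in for a WAL-logged insert into a snapshot table), and the workload over an interval is the per-counter difference between two snapshots.

```python
import threading
import time


class StatsCollector:
    """In-memory stats with periodic snapshots; workload = diff of snapshots."""

    def __init__(self, snapshot_interval=900):  # 15 minutes, as suggested above
        self._lock = threading.Lock()
        self._counters = {}    # live, in-memory counters (hot path)
        self._snapshots = []   # (timestamp, counters); stands in for a physical table
        self._interval = snapshot_interval

    def bump(self, key, amount=1):
        """Update a counter in memory only; no table write on the hot path."""
        with self._lock:
            self._counters[key] = self._counters.get(key, 0) + amount

    def snapshot(self):
        """Copy the current counters out.

        In the real system this would be the periodic job's WAL-logged
        insert into the snapshot table.
        """
        with self._lock:
            snap = dict(self._counters)
        self._snapshots.append((time.time(), snap))
        return snap

    @staticmethod
    def workload(older, newer):
        """Workload between two snapshots: per-counter difference."""
        return {k: newer.get(k, 0) - older.get(k, 0) for k in newer}


# Usage: two snapshots 15 minutes apart would bracket the interval's activity.
stats = StatsCollector()
stats.bump("tuples_read", 10)
before = stats.snapshot()
stats.bump("tuples_read", 5)
stats.bump("tuples_written", 2)
after = stats.snapshot()
print(StatsCollector.workload(before, after))
# → {'tuples_read': 5, 'tuples_written': 2}
```

If the process crashes between snapshots, only the counters accumulated since the last dump are lost, which matches the at-most-15-minutes loss described above.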