I was looking at our analytics data and saw that the vast majority of inbound traffic to the docs hits the 9.1 version. We've known this has been an issue for years and have tried various remedies, clearly none of which are working.
Should we try an experiment for a couple of months in which we simply block anything that matches \/docs\/((\d+)|(\d+\.\d+))\/ in robots.txt? It's a much more drastic option, but it might at least push Google toward indexing the latest doc version with the highest priority.
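For reference, the pattern needs a couple of tweaks: the dot in the minor-version branch should be escaped (an unescaped . matches any character, so /docs/9x1/ would match too), and \d+ on both sides covers versions like 10 or 9.10. A quick sanity check, with the sample paths invented for illustration:

```python
import re

# Versioned doc paths like /docs/9/, /docs/9.1/, /docs/10.12/,
# but not /docs/latest/ or /docs/9x1/.
versioned = re.compile(r"^/docs/\d+(\.\d+)?/")

samples = ["/docs/9.1/overview", "/docs/10/", "/docs/latest/", "/docs/9x1/"]
matches = [p for p in samples if versioned.match(p)]
print(matches)  # ['/docs/9.1/overview', '/docs/10/']
```

One caveat for the experiment itself: robots.txt has no regex support (RFC 9309 only defines path prefixes, plus the * and $ wildcards that Google honours), so the pattern would have to be translated into explicit Disallow prefixes, one per version, or a wildcard rule that still leaves the latest docs crawlable.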