Re: robots.txt on git.postgresql.org

From: Greg Stark
Subject: Re: robots.txt on git.postgresql.org
Date:
Msg-id: CAM-w4HPdUbND-qA8ho1EB-wvj+tXcX=0H_6JtQNbkd_UZsDmHw@mail.gmail.com
In response to: Re: robots.txt on git.postgresql.org  (Magnus Hagander <magnus@hagander.net>)
Responses: Re: robots.txt on git.postgresql.org  (Andres Freund <andres@2ndquadrant.com>)
Re: robots.txt on git.postgresql.org  (Magnus Hagander <magnus@hagander.net>)
List: pgsql-hackers
On Wed, Jul 10, 2013 at 9:36 AM, Magnus Hagander <magnus@hagander.net> wrote:
> We already run this, that's what we did to make it survive at all. The
> problem is there are so many thousands of different URLs you can get
> to on that site, and google indexes them all by default.

There's also https://support.google.com/webmasters/answer/48620?hl=en
which lets us control how fast the Google crawler crawls. I think it's
adaptive, though, so if the pages are slow it should crawl slowly.

-- 
greg


