On Tue, Jul 9, 2013 at 5:30 PM, Andres Freund <and...@2ndquadrant.com> wrote:
> On 2013-07-09 16:24:42 +0100, Greg Stark wrote:
>> I note that git.postgresql.org's robots.txt refuses permission to crawl
>> the git repository:
>>
>> http://git.postgresql.org/robots.txt
>>
>> User-agent: *
>> Disallow: /
>>
>> I'm curious what motivates this. It's certainly useful to be able to
>> search for commits.
>
> Gitweb is horribly slow. I don't think anybody with a bigger git repo
> using gitweb can afford to let all the crawlers go through it.
Yes, this is the reason it's been blocked. That machine basically died every time google or bing or baidu or those hit it, giving horrible response times and timeouts for actual users. We might be able to do something better about that now that we can do better rate limiting, but it's like playing whack-a-mole. The basic software is just fantastically slow.

--
 Magnus Hagander
 Me: http://www.hagander.net/
 Work: http://www.redpill-linpro.com/
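
[Editor's illustration, not part of the thread: a minimal sketch of the kind of per-IP rate limiting mentioned above, assuming an nginx frontend in front of gitweb. The thread does not say what git.postgresql.org actually runs; the zone name, rates, and layout below are made up.]

    # Hypothetical nginx snippet: throttle each client IP to a few
    # requests per second with a small burst allowance, so a crawler
    # cannot saturate gitweb's expensive pages.

    # limit_req_zone belongs in the http {} context.
    limit_req_zone $binary_remote_addr zone=gitweb:10m rate=2r/s;

    server {
        listen 80;
        server_name git.postgresql.org;

        location / {
            limit_req zone=gitweb burst=5 nodelay;
            # ... proxy or CGI handoff to gitweb would go here ...
        }
    }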