On Wed, Jul 10, 2013 at 9:36 AM, Magnus Hagander wrote:
> We already run this, that's what we did to make it survive at all. The
> problem is there are so many thousands of different URLs you can get
> to on that site, and google indexes them all by default.

There's also https://support.google.co…
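
For illustration only, not something proposed in the thread: a gitweb
robots.txt does not have to be all-or-nothing. Assuming gitweb's usual
a= action parameter, and the * wildcard extension that Googlebot and the
other large crawlers honor, the expensive and effectively unbounded page
types can be fenced off while commit pages stay indexable:

    User-agent: *
    Disallow: /*a=snapshot
    Disallow: /*a=blame
    Disallow: /*a=search
    Disallow: /*a=history

Hash searches would keep working because commit and log pages remain
crawlable; only the archive-generation and whole-file-annotation URLs
that multiply without bound are blocked.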
Magnus Hagander writes:
> Oh, and we need stable wheezy packages for them, or we'll be paying
> even more in maintenance. AFAICT, there aren't any for cgit, but maybe
> I'm searching for the wrong thing.

Seems to be a loser on that front too.

--
Dimitri Fontaine
http://2ndQuadrant.fr
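
As a quick check of the packaging question (a hypothetical session, not
output from the thread), the Debian archive can be queried directly,
e.g. with rmadison from the devscripts package:

    $ rmadison cgit
    $ apt-cache search cgit

No wheezy row from rmadison would mean the package has to come from
backports or be built and carried locally, which is exactly the extra
maintenance being weighed here.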
Andres Freund writes:
> Gitweb is horribly slow. I don't think anybody with a bigger git repo
> using gitweb can afford to let all the crawlers go through it.

What's blocking alternatives to be considered? I already did mention
cgit, which has the advantage to clearly show the latest patch on all
the active branches in its default view, which would match our b…
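
cgit also brings its own page cache, which bears directly on the
crawler problem. A minimal cgitrc sketch, with paths and TTLs as
illustrative assumptions only:

    # /etc/cgitrc -- illustrative values, not a recommended configuration
    cache-root=/var/cache/cgit
    cache-size=1000        # max number of cached pages; 0 disables caching
    cache-dynamic-ttl=5    # minutes to cache pages that track a moving ref
    cache-static-ttl=-1    # pages keyed to a fixed SHA-1 never expire
    scan-path=/srv/git     # auto-discover repositories below this directory

A repeated crawler hit on an already-rendered page is then served from
disk instead of re-running the backend.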
On 2013-07-09 16:24:42 +0100, Greg Stark wrote:
> I note that git.postgresql.org's robots.txt refuses permission to crawl
> the git repository:
>
> http://git.postgresql.org/robots.txt
>
> User-agent: *
> Disallow: /
>
> I'm curious what motivates this. It's certainly useful to be able to
> search for commits.

Gitweb is horribly slow. I don't think anybody with a bigger git repo
using gitweb can afford to let all the crawlers go through it.
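
Short of banning crawlers outright, the usual way to let a slow CGI
survive them is a front-end cache plus a per-client rate limit. A
sketch in nginx terms, every name and number invented for illustration:

    # illustrative nginx front end shielding a slow gitweb backend
    proxy_cache_path /var/cache/nginx/gitweb keys_zone=gitweb:10m max_size=1g;
    limit_req_zone $binary_remote_addr zone=perclient:10m rate=1r/s;

    server {
        listen 80;
        server_name git.example.org;           # placeholder host

        location / {
            limit_req zone=perclient burst=5;  # throttle aggressive clients
            proxy_cache gitweb;
            proxy_cache_valid 200 5m;          # reuse rendered pages briefly
            proxy_pass http://127.0.0.1:8080;  # gitweb served behind nginx
        }
    }

Whether Magnus's "we already run this" above refers to this kind of
front end is not stated in the thread; it is only the common pattern
for keeping a gitweb instance alive under crawler load.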
I note that git.postgresql.org's robots.txt refuses permission to crawl
the git repository:

http://git.postgresql.org/robots.txt

User-agent: *
Disallow: /

I'm curious what motivates this. It's certainly useful to be able to
search for commits. I frequently type git commit hashes into Google to
f…
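
The use case Greg describes also has a purely local fallback, plain git
with a placeholder hash:

    $ git show 1f2d3c4                     # inspect a hash found elsewhere
    $ git log --all --grep='robots.txt'    # search commit messages
    $ git describe --contains 1f2d3c4      # first tag containing the commit

That covers looking up a known hash, but not the reverse direction of
landing on a commit page from a web search, which is what crawler
access buys.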