Re: Ask Jeeves Crawler access to Debian

2004-06-18 Thread Colin Watson
On Fri, Jun 18, 2004 at 05:31:32PM +0100, MJ Ray wrote: > On 2004-06-18 15:09:30 +0100 Kaushal Kurapati <[EMAIL PROTECTED]> wrote: > > On bugs.debian.org, we notice that there is a "disallow" directive in your robots.txt that blocks our crawler from accessing pages on your site. > I cann

RE: Ask Jeeves Crawler access to Debian

2004-06-18 Thread MJ Ray
On 2004-06-18 18:01:47 +0100 Kaushal Kurapati <[EMAIL PROTECTED]> wrote: [...] > So people come to our site and look for bugs in debian say, then we should be able to direct people to your site content; You could already do this without crawling bugs.debian.org if you wanted, but linking to De

RE: Ask Jeeves Crawler access to Debian

2004-06-18 Thread Kaushal Kurapati
Subject: Re: Ask Jeeves Crawler access to Debian On 2004-06-18 15:09:30 +0100 Kaushal Kurapati <[EMAIL PROTECTED]> wrote: > On bugs.debian.org, we notice that there is a "disallow" directive in your robots.txt that blocks our crawler from accessing pages on your

Re: Ask Jeeves Crawler access to Debian

2004-06-18 Thread MJ Ray
On 2004-06-18 15:09:30 +0100 Kaushal Kurapati <[EMAIL PROTECTED]> wrote: > On bugs.debian.org, we notice that there is a "disallow" directive in your robots.txt that blocks our crawler from accessing pages on your site. I cannot speak for Debian, but I suspect this is because generating the ht

Ask Jeeves Crawler access to Debian

2004-06-18 Thread Kaushal Kurapati
Hello Debian: This is Kaushal Kurapati from Ask Jeeves. I am a Senior Search Product Manager here and wanted to speak to you about a crawler blocking issue. On bugs.debian.org, we notice that there is a "disallow" directive in your robots.txt that blocks our crawler from accessing pages
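
[Editor's note: for readers unfamiliar with the mechanism under discussion, a robots.txt "disallow" rule is a plain-text directive that compliant crawlers check before fetching pages. The actual contents of the bugs.debian.org file are not quoted anywhere in this thread, so the lines below are only an illustrative sketch of what a blanket block looks like, not the real file.]

    # Hypothetical robots.txt shown for illustration only; the real
    # bugs.debian.org rules of 2004 are not reproduced in this thread.
    User-agent: *
    Disallow: /

A site can also scope such a rule to a single crawler by naming its user agent, or to particular paths (for example, dynamically generated pages of the kind MJ Ray alludes to) rather than to the whole site.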