On 2004-06-18 18:01:47 +0100 Kaushal Kurapati
<[EMAIL PROTECTED]> wrote:
> [...] So people come to our site and look for bugs in
> debian say, then we should be able to direct people to your site
> content;

You could already do this without crawling bugs.debian.org if you
wanted, but linking to De[...]
Subject: Re: Ask Jeeves Crawler access to Debian
On 2004-06-18 15:09:30 +0100 Kaushal Kurapati
<[EMAIL PROTECTED]> wrote:
> On bugs.debian.org, we notice that there is a "disallow" directive in
> your robots.txt that blocks our crawler from accessing pages on your
> site.

I cannot speak for Debian, but I suspect this is because generating
the ht[...]
Hello Debian:

This is Kaushal Kurapati from Ask Jeeves. I am a Senior Search Product
Manager here and wanted to speak to you about a crawler blocking issue.
On bugs.debian.org, we notice that there is a "disallow" directive in
your robots.txt that blocks our crawler from accessing pages on your
site.
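[For context: the "disallow" directive discussed in this thread is part of
the Robots Exclusion Protocol. The thread never shows the actual contents
of the bugs.debian.org robots.txt, so the paths below are purely
illustrative, a sketch of the kind of rule being described:

    # /robots.txt at the site root -- example paths, not Debian's real file
    User-agent: *          # applies to every crawler
    Disallow: /cgi-bin/    # refuse crawling of dynamically generated pages

A crawler that honors the protocol fetches /robots.txt before anything
else and skips any URL whose path begins with a Disallow prefix matching
its User-agent, which is why such a rule would keep Ask Jeeves' crawler
out of the bug pages.]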