Package: www.debian.org
Severity: minor

The wiki returns a "200 OK" response for every page, even ones that
do not exist yet. It would be better if it returned a 404 so these
pages do not get crawled or indexed.
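
To illustrate the idea, here is a minimal sketch using a toy WSGI app
with a made-up dict page store (PAGES); the wiki's real code will of
course differ:

    from wsgiref.simple_server import make_server

    PAGES = {'FrontPage': '<h1>FrontPage</h1>'}  # toy page store

    def app(environ, start_response):
        name = environ['PATH_INFO'].lstrip('/')
        if name in PAGES:
            start_response('200 OK', [('Content-Type', 'text/html')])
            return [PAGES[name].encode()]
        # Still serve the helpful "create this page" view, but with a
        # 404 status so robots neither crawl nor index it.
        start_response('404 Not Found', [('Content-Type', 'text/html')])
        return [b'<p>This page does not exist yet. Create it?</p>']

    if __name__ == '__main__':
        make_server('localhost', 8000, app).serve_forever()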

If that is difficult to accomplish, then the page should at least
include a <meta name="robots" content="noindex,nofollow"> tag so that
complying robots will skip it.
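
For illustration, the head of such a page could then look like this
(the title here is made up):

    <head>
      <title>SomePageThatDoesNotExistYet - Debian Wiki</title>
      <meta name="robots" content="noindex,nofollow">
    </head>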

One URL specifically affected by this is wiki.debian.org/robots.txt:
it currently returns a large amount of content that does not conform
to the robots.txt specification at all. It should return either a 404
or valid robots.txt content.

By the way, thanks for the very useful service!


Thijs

