On Wed, Dec 16, 2009 at 12:48:03AM -0800, Mark Polesky wrote:
> This should be trivial to fix. I would do it but I can't
> figure out from the sources how `robots.txt' is generated.
Huh, apparently it's not even in the web branch. Sigh. I can see it here:
http://lilypond.org/robots.txt
and in retrospect it's obvious how to disallow the other versions.

> 1) The only valid locations for blank lines are *above* a
> "User-agent" line and below the last "Disallow" line in a
> single "User-agent" record. Remove all other blank lines.

I didn't know that; interesting!

> 2) Individually disallow *all* directories that are
> immediately below the /doc/ directory EXCEPT the one for
> the current stable release. Ideally this would be
> automated by a script.

To be fixed in the new website; for now we'll just band-aid the old robots.txt.

> User-agent: *
> Disallow: /doc/v1.6/
> Disallow: /doc/v1.8/
> Disallow: /doc/v1.9/

Huh, I didn't realize we kept the old unstable directories around; they're not listed on http://lilypond.org/documentation. Anybody mind if I delete the unstable doc dirs?

> There is an alternative, which may be easier to maintain
> (and thus safer). Maybe there are reasons that this would
> be a bad idea (I don't know), but we could move the current
> stable docs into a new subdirectory of /doc/ (like
> /doc/current/) and move everything else to another
> subdirectory (like /doc/other). Then the robots.txt file
> would only need to be:

Or /doc/archive/? I'll think about it.

Once 2.12.3 is out, and if there are no other emergencies, I'll start working on lilypond.org, the ajax searching, etc. etc. Stick around; we'll talk much more about this in a week or so. :)

Cheers,
- Graham

_______________________________________________
bug-lilypond mailing list
bug-lilypond@gnu.org
http://lists.gnu.org/mailman/listinfo/bug-lilypond
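[Editor's note: the quoted robots.txt that Mark's alternative "would only need to be" appears to have been lost in the archive. Under the layout he describes (current stable in /doc/current/, everything else in /doc/other/), the file could plausibly shrink to something like the following sketch; the directory names are his proposal, not the live site.]

```
User-agent: *
Disallow: /doc/other/
```

The appeal is that this file never changes between releases: publishing a new stable only means swapping what /doc/current/ points at.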