On 10/05/2023 at 19:38, Alfred M. Szmidt wrote:
    Because it registers every single commit to www,

What is "it"?  How is this different from _any_ -commits list we have?

    including to working directories that webmasters have disallowed,
    for instance */po/, /server/staging/, */workshop/, /prep/gnumaint/,
    etc.

Ok, and?

    Please see https://www.gnu.org/robots.txt.

And?

You've not explained the actual problem.  What are you trying to
solve?

"it" is the www-commits list, which registers all changes to the www
directory, including to pages that are not published yet. I suspect most
of the other *-commits lists deal with source code repositories, which
are public anyway.

If you let crawlers index commits touching disallowed directories, you
are defeating the purpose of robots.txt: what was supposed to remain
unpublished is effectively published.
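To make the leak concrete, here is a minimal sketch of how a compliant crawler consults robots.txt, using Python's stdlib parser. The Disallow rules below are illustrative, taken from the directories mentioned above, not a verbatim copy of https://www.gnu.org/robots.txt; the point is that these rules only cover paths under www.gnu.org itself, not a commits archive that republishes the same content elsewhere.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical excerpt modeled on the directories webmasters disallowed.
rules = """\
User-agent: *
Disallow: /server/staging/
Disallow: /prep/gnumaint/
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# A well-behaved crawler skips the disallowed staging area...
print(rp.can_fetch("*", "https://www.gnu.org/server/staging/draft.html"))  # False

# ...but fetches published pages normally.
print(rp.can_fetch("*", "https://www.gnu.org/philosophy/free-sw.html"))  # True
```

Nothing in those rules stops a crawler from fetching the www-commits list archives, which carry the full diffs of the staged pages, so the robots.txt restriction is bypassed.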
