> In sum, this private playground is something webmasters want and need,
    > and search engines should have no business indexing it. Is it possible?

robots.txt will not stop ill-behaved robots from indexing.  Although
Google and DuckDuckGo, to the best of my knowledge, do respect it, it
is no panacea. So I don't think robots.txt solves the problem.
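For what it's worth, the kind of thing robots.txt can express is only a
polite request. A minimal example asking all crawlers to stay out of a
staging area (using the /staging path mentioned later in this thread)
would be something like:

    User-agent: *
    Disallow: /staging/

Well-behaved crawlers honor that; ill-behaved ones simply ignore it,
which is exactly the problem.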

Also, as Alfred said, it feels quite weird to me to try to keep
"robots" away from public pages. If a page is readable by a random
human member of the public, philosophically it seems to me it should
also be readable by robots (resources permitting, which is not the
issue here).

Thus, using a separate and private repo like www-fr, as Therese
suggested, sounds to me like the best solution, both technically and
philosophically. I'm sure it is possible somehow to restrict viewing
the www-fr web pages to www members. That should be much easier than
doing it for a subdirectory (/staging) of a public repo, seems to me.
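Just as an illustration of what I mean (I don't know how the server is
actually configured, and the directory and password-file paths below
are purely hypothetical), restricting a directory to authenticated
members is the sort of thing Apache basic auth handles:

    # Hypothetical: restrict the www-fr pages to authenticated members
    <Directory "/var/www/www-fr">
        AuthType Basic
        AuthName "www members only"
        AuthUserFile /etc/httpd/www-members.passwd
        Require valid-user
    </Directory>

Whether Savannah exposes something equivalent through group
permissions, I can't say, but it shows the restriction is a solved
problem for a whole directory tree.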

My $.02, FWIW ... -k
