Hi Doug,

So does that prevent crawling and browsing, but still allow things to work if I click a link or include a file?


No, it prevents the directory and all the files within it from being served by the webserver at all. Anything else is either half-secure or half-broken.
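For reference, a blanket deny on a directory looks something like this (the path is just an example, and this is the 2.2-style syntax; on 2.4 you would use "Require all denied" instead):

    <Directory /var/www/includes>
        Order allow,deny
        Deny from all
    </Directory>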

I don't understand what you mean by "crawling", "browsing", and "include a file". They're really all the same thing: A client (be it Firefox or GoogleBot) is asking the webserver for something.

If you want to prevent the nice robots from asking for something, you can use a robots.txt file. This will not prevent naughty robots from asking for something.
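A minimal robots.txt at the docroot (the directory name is just an example) looks like:

    User-agent: *
    Disallow: /private/

Nothing enforces it; it is only a request that polite crawlers honor.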

Let's think about this a different way.

Say I have a directory of files that contain my MySQL connection information, queries, etc.

How do I prevent people from browsing the directory, but still allow the files to be used when I include them in a page, say to connect to MySQL? To make it concrete, I mean a setup like the sketch below.
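(The file and variable names here are just examples, and I'm assuming PHP for the include, since that's what I'm using.)

includes/db.inc.php:

    <?php
    // Connection details that should never be served to a browser.
    $db_host = 'localhost';
    $db_user = 'webapp';
    $db_pass = 'secret';
    ?>

page.php:

    <?php
    // What I want: nobody can request db.inc.php directly,
    // but this include should still work.
    include 'includes/db.inc.php';
    $link = mysql_connect($db_host, $db_user, $db_pass);
    ?>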

-Jason


---------------------------------------------------------------------
The official User-To-User support forum of the Apache HTTP Server Project.
See <URL:http://httpd.apache.org/userslist.html> for more info.
To unsubscribe, e-mail: users-unsubscr...@httpd.apache.org
  "   from the digest: users-digest-unsubscr...@httpd.apache.org
For additional commands, e-mail: users-h...@httpd.apache.org
