On Jul 14, 2009, at 2:45 PM, ML wrote:
> Hi Doug,
> So does that prevent crawling and browsing, but still allow it to
> work if I click a link or include a file?
No, it prevents the directory and all files within it from being served
by the webserver at all. Anything else is either half-secure or
half-broken.
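For reference, here is a minimal sketch of that kind of block, assuming
Apache 2.2-style access control; the directory path is a placeholder,
not your actual layout:

    # httpd.conf: refuse to serve this directory and everything under it
    # (hypothetical path -- adjust to your own tree)
    <Directory "/var/www/includes">
        Order allow,deny
        Deny from all
    </Directory>

With that in place Apache returns 403 Forbidden for any request that
maps into the directory, whether it comes from a browser, a crawler, or
a followed link.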
I don't understand what you mean by "crawling", "browsing", and
"include a file". They're really all the same thing: A client (be
it Firefox or GoogleBot) is asking the webserver for something.
If you want to prevent the nice robots from asking for something,
you can use a robots.txt file. This will not prevent naughty robots
from asking for something.
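A robots.txt at the site root that asks well-behaved crawlers to stay
out of a directory looks like this (the /includes/ path is just a
placeholder):

    # robots.txt -- advisory only; polite robots honor it,
    # naughty robots ignore it
    User-agent: *
    Disallow: /includes/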
> Let's think about this a different way.
> Say I have a directory of files that contain my MySQL connection
> information, queries, etc., etc.
> How do I prevent people from browsing the directory but still allow
> the files to be used when I include them in a page, say to connect
> to MySQL?
So in this case, you mean "include" a la PHP's include()?
Just because Apache can't serve the file does not mean that Perl, PHP,
Python, or other things cannot USE the file. Apache's access controls
only determine what Apache will do with the file when the file is
requested by a user.
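As a sketch of that point (the paths, file names, and variable names
here are made up, not anything from your setup), a page Apache serves
normally can include() a file from the denied directory, because PHP
reads it straight off the filesystem rather than asking Apache for it
over HTTP:

    <?php
    // page.php -- served normally by Apache
    // Pull in connection settings from the directory Apache refuses to serve.
    include '/var/www/includes/db-config.php'; // defines $dbHost, $dbUser, $dbPass, $dbName

    $link = mysql_connect($dbHost, $dbUser, $dbPass);
    mysql_select_db($dbName, $link);
    // ... run your queries as usual ...
    ?>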
For example, you could use Apache SSI to include a file that the user
would not be able to see by directly trying to access the file.
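A rough sketch of that, assuming mod_include is enabled and the page is
parsed for includes (e.g. Options +Includes with an .shtml handler); the
file name is a placeholder:

    <!-- page.shtml: the fragment is spliced in server-side, so the
         visitor only ever requests page.shtml, never nav.inc directly -->
    <!--#include file="nav.inc" -->

Whether your Deny rules also apply to the included path depends on how
the include is resolved, so test it against your particular
configuration.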
Doug Bell -- Senior Developer, Plain Black Corp.
[ http://plainblack.com ]
all that groks is