Hi guys,

Just wondering why Google hasn't indexed my website yet. I have
MediaWiki running on OpenBSD 4.3 behind a router, also running
OpenBSD 4.3, which holds the public IP. Do I need to create a robots.txt
file in /var/www and /var/www/mediawiki to let robots know it's OK to
crawl this space? I thought robots.txt was only there to
restrict/limit crawling.
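
For reference, my understanding is that an empty Disallow permits
everything, so an allow-all robots.txt would just be:

```
User-agent: *
Disallow:
```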

Could PF in any way prevent crawling? Just so you know, I'm running two
CARP'd web servers behind two CARP'd routers. I also have a
Redirect 301 in the .htaccess of my Apache root that sends visitors
straight to the mediawiki folder. I'm not sure whether robots pay
attention to a Redirect 301.
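
The rule is roughly the following (hostname is a placeholder, not my
actual domain):

```
# mod_alias redirect from the Apache document root to the wiki
Redirect 301 / http://example.com/mediawiki/
```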

Thanks,
Vivek
