On 17/02/2016 at 01:39, Jason Morris wrote:
> What would be the proper config of robots.txt to allow only Google to crawl a 
> JSPWiki installation and nothing else?

It's all covered in Google's robots.txt documentation:

https://developers.google.com/webmasters/control-crawl-index/docs/robots_txt

However, something like this might work for you:

# robots.txt
# An empty Disallow means "allow everything" for that group.
User-agent: Googlebot
Disallow:

User-agent: Googlebot-Image
Disallow:

User-agent: Googlebot-Mobile
Disallow:

# All other crawlers: block the entire site.
User-agent: *
Disallow: /

Sitemap: http://www.yoursite.com/sitemap.gz
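If you want to sanity-check the rules before deploying them, Python's standard-library robot-exclusion parser can evaluate them locally. This is only a rough check (its first-match group selection is simpler than Google's "most specific group" rule, and the paths below are placeholders), but it confirms the intent of the file above:

```python
# Sketch: verify the robots.txt rules with urllib.robotparser.
# The wiki path used here is a placeholder, not from the original post.
from urllib import robotparser

RULES = """\
User-agent: Googlebot
Disallow:

User-agent: Googlebot-Image
Disallow:

User-agent: Googlebot-Mobile
Disallow:

User-agent: *
Disallow: /
"""

rp = robotparser.RobotFileParser()
rp.parse(RULES.splitlines())

# Google crawlers may fetch anything; everyone else is blocked.
print(rp.can_fetch("Googlebot", "/Wiki.jsp?page=Main"))  # True
print(rp.can_fetch("Bingbot", "/Wiki.jsp?page=Main"))    # False
```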


Roland
-- 
QURU Ltd, London, UK
http://quru.com
