Quoting jd1008 :
Some websites which provide files for download refuse wget.
They send a file robots.txt instead.
In that file I see:
User-agent: *
Disallow: /
Allow: /lang/*
Allow: /$
So, I ask if anyone knows how to use wget to get those files without the use of FF?
wget -e robots=off
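
That option tells wget to ignore the site's robots.txt rules. A sketch of a full invocation (the URL here is a placeholder, not from the original post; the -U flag is an optional extra, since some servers refuse requests based on wget's default User-Agent header rather than via robots.txt):

  wget -e robots=off -U "Mozilla/5.0" https://example.com/lang/somefile.tar.gz

For a recursive mirror, the same -e robots=off applies, e.g. wget -r -e robots=off <url>; without it, recursive wget honors the Disallow: / line and fetches nothing.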
--
users mailing list