Philip wrote:
I'm looking for a tool that spiders a site: starting from a particular URL, it should download every page in the domain linked from that URL (and from the linked pages in turn), creating a local copy that can be browsed offline as static HTML. Is there such a tool for Linux (better still, Debian)? Thanks, Philip
Look for the httrack package. WT
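For what it's worth, a minimal invocation might look like the following. This is only a sketch: `www.example.com` and the output directory are placeholders, and the filter pattern is an assumption about the site layout. HTTrack rewrites links so the mirror works offline as static HTML, which is what was asked for.

```shell
# Install the package on Debian (as root or via sudo):
apt-get install httrack

# Mirror the site into /tmp/mysite, following only links that
# stay within the example.com domain; -v prints progress.
httrack "http://www.example.com/" -O "/tmp/mysite" "+*.example.com/*" -v
```

The `"+*.example.com/*"` filter keeps the spider inside the domain; without it, HTTrack may still fetch some off-site resources. See `man httrack` for depth limits and other options.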