On Sun, Aug 31, 2008 at 05:02:01PM +0100, Philip wrote:
> I'm looking for a tool which spiders a site and downloads every page in
> the domain that it finds linked from a particular URL (and from URLs
> linked within the domain), creating a local site that can be
> manipulated offline as static HTML.
>
> Is there such a tool for Linux (better still, Debian)?
>
wget should do the trick; a sketch of the usual invocation is below.

Philippe
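Untested and off the top of my head, with example.org standing in for the
real start URL, but something along these lines is the usual starting point:

  wget --mirror --convert-links --adjust-extension --page-requisites \
       --no-parent http://www.example.org/

--mirror turns on recursive retrieval with unlimited depth, --convert-links
rewrites the links so the local copy browses offline, --adjust-extension
saves pages with an .html suffix, --page-requisites pulls in images and CSS,
and --no-parent keeps it from climbing above the start directory. Adding
something like --wait=1 is polite if the site isn't yours.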