Tony Godshall wrote:
>
> The point is, I would like to have these interesting sites
> snarfed onto my laptop to read offline and then have the
> URLs for the interesting links queued up in some way.
> I suppose what I need is some kind of trigger from the web
> browser. Maybe an offline web proxy replacement that just
> appends the URL to a file and then puts up a page that says
> "page request queued". When I connect back to the web, it
> could stop the offline proxy and fire up squid or whatever
> normal proxy (or even reconfig the browser for no-proxy?)
> and run through the URL file doing a wget for each URL.
>
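For the queue-and-fetch step you describe, here is a rough sketch in shell (the queue file name ~/url-queue.txt and the one-URL-per-line format are just assumptions for illustration, not anything you specified):

# offline side: the stand-in proxy only needs to append each requested URL
# to the queue file and serve a static "page request queued" page, e.g.
#   echo "$url" >> ~/url-queue.txt

# online side: once reconnected, fetch everything that was queued
while read -r url; do
    # -p pulls in page requisites (images, CSS) so the page reads well offline,
    # -k rewrites links for local viewing, -nc skips files already downloaded
    wget -p -k -nc "$url"
done < ~/url-queue.txt

If you would rather skip the loop, wget can also read the URL list straight from the file with -i ~/url-queue.txt.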
I would use wget:
wget -r -k -H -l X -nc http://google-search-results
where X is how many levels of links you would like to traverse.
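For example, with X=2 it follows links two levels deep from the starting page:

wget -r -k -H -l 2 -nc http://google-search-results

Note that -H lets the recursion span onto the other hosts the links point at, which is what you want for search results but can pull in a lot of data; -nc keeps wget from re-downloading files it already has.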
ksieben
--
ingo dross information/security architecture
[]___¸
######-\
O_-_-_O-\