Hi,

Sorry, I was imprecise: I meant that it shouldn't save the downloaded pages locally.
There probably isn't an existing tool for this, so I should build one myself.
I probably just need a good crawler that can be set to dump all the links
it follows into a dataset that I can analyse with R.
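Something along these lines might do it; just a rough sketch, assuming
Python 3's standard library (START_URL and MAX_PAGES are placeholders).
Each page is held in memory only long enough to pull out its links, and
only the edge list gets written to disk as CSV:

# Minimal link-graph crawler sketch: pages are fetched into memory only,
# links are extracted, and the edge list is written out as CSV for R.
# Assumes Python 3 standard library; START_URL and MAX_PAGES are placeholders.
import csv
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen

START_URL = "http://example.com/"   # hypothetical starting point
MAX_PAGES = 100                     # rough cap on how many URLs to queue

class LinkExtractor(HTMLParser):
    """Collects href targets from anchor tags."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(start, limit):
    seen = {start}
    queue = deque([start])
    edges = []                      # (source, target) pairs; page bodies are never saved
    while queue and len(seen) <= limit:
        url = queue.popleft()
        try:
            html = urlopen(url, timeout=10).read().decode("utf-8", "replace")
        except Exception:
            continue                # skip pages that fail to download or decode
        parser = LinkExtractor()
        parser.feed(html)           # page content lives only in this local variable
        for href in parser.links:
            target = urljoin(url, href)
            edges.append((url, target))
            if target not in seen and len(seen) < limit:
                seen.add(target)
                queue.append(target)
    return edges

if __name__ == "__main__":
    with open("links.csv", "w", newline="") as out:
        writer = csv.writer(out)
        writer.writerow(["source", "target"])
        writer.writerows(crawl(START_URL, MAX_PAGES))

The resulting links.csv could then be read into R with read.csv() and
treated as an edge list for whatever graph analysis I need.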

Cheers,
Bryan Rasmussen

On 6/19/06, Marc 'BlackJack' Rintsch <[EMAIL PROTECTED]> wrote:
> In <[EMAIL PROTECTED]>, bryan rasmussen
> wrote:
>
> > It should hopefully be as high level as Wget, not download the pages
> > but just follow the links, and output graphs.
>
> How do you get at the links without downloading the page!?
>
> Ciao,
>        Marc 'BlackJack' Rintsch
> --
> http://mail.python.org/mailman/listinfo/python-list
>
