Perhaps you could help me:

I'm looking for a simple web search tool or spider to map all the links
from one or more starting URLs down to a certain depth, and print or
log the URLs it encounters. I've been using "wget -r -l 5" so far, but it
cannot parallelize its requests, visits the same URLs several times, and
cannot be told to skip downloading files it doesn't need, which
makes it far too slow. Does anyone have experience with, or know of, a tool
that does have these features (especially parallel requests)?
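In case it helps clarify what I'm after, here is a minimal sketch of that kind of crawler: breadth-first from one or more start URLs to a fixed depth, a "visited" set so no URL is fetched twice, a semaphore so requests run in parallel, and a Content-Type check so non-HTML responses are logged but not downloaded and parsed. It assumes the third-party aiohttp package; names like MAX_DEPTH and CONCURRENCY are just illustrative, not from any existing tool.

    # sketch only: parallel, depth-limited link mapper
    import asyncio
    import re
    from urllib.parse import urljoin, urldefrag

    import aiohttp

    MAX_DEPTH = 5        # illustrative default, like wget -l 5
    CONCURRENCY = 10     # max requests in flight at once
    HREF_RE = re.compile(r'href=["\'](.*?)["\']', re.IGNORECASE)


    async def fetch_links(session, sem, url):
        """Fetch one page and return the absolute URLs of its links."""
        async with sem:
            try:
                async with session.get(url) as resp:
                    # don't parse (or keep) non-HTML responses
                    if "text/html" not in resp.headers.get("Content-Type", ""):
                        return []
                    body = await resp.text(errors="ignore")
            except aiohttp.ClientError:
                return []
        return [urldefrag(urljoin(url, href))[0]
                for href in HREF_RE.findall(body)]


    async def crawl(start_urls, max_depth=MAX_DEPTH):
        sem = asyncio.Semaphore(CONCURRENCY)
        visited = set(start_urls)          # never fetch the same URL twice
        frontier = list(start_urls)
        async with aiohttp.ClientSession() as session:
            for depth in range(max_depth):
                results = await asyncio.gather(
                    *(fetch_links(session, sem, u) for u in frontier))
                frontier = []
                for links in results:
                    for link in links:
                        if link.startswith("http") and link not in visited:
                            visited.add(link)
                            print(link)    # print/log each URL encountered
                            frontier.append(link)


    if __name__ == "__main__":
        asyncio.run(crawl(["http://example.com/"]))

But I'd rather use an existing, tested tool than maintain something like this myself.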

TIA,
Adi Stav 

