On Sat, 6 Sep 2003, David Harel wrote:

> Hi,
>
> I am not sure I am using the correct terminology. I am looking for a
> tool that can download HTML pages and their content, given a URL as a
> starting point. On MS Windows I used Teleport Pro (I have a license).
> What tool is available on Linux?
There are several such tools on Linux. The most prominent one is the
command-line wget, which may have some third-party GUIs (kget?). There
is also curl, though I'm not sure whether it can retrieve more than one
page at a time. Perl has a nice library for retrieving documents from
the web, called libwww-perl (LWP for short), which ships with several
command-line clients. Aside from all that, there's pavuk. I'm not sure
whether it is still maintained, but it nevertheless works nicely as it
is, and it ships with its own GUI.

All these tools (except maybe pavuk) are also available for Windows, BTW.

Regards,

	Shlomi Fish

> --
> Thanks.
>
> David Harel,
>
> ==================================
>
> Home office: +972 4 6921986
> Fax:         +972 4 6921986
> Cellular:    +972 54 534502
> Snail Mail:  Amuka
>              D.N Merom Hagalil
>              13802
>              Israel
> Email: [EMAIL PROTECTED]
>
> =================================================================
> To unsubscribe, send mail to [EMAIL PROTECTED] with
> the word "unsubscribe" in the message body, e.g., run the command
> echo unsubscribe | mail [EMAIL PROTECTED]

----------------------------------------------------------------------
Shlomi Fish        [EMAIL PROTECTED]
Home Page:         http://t2.technion.ac.il/~shlomif/

There's no point in keeping an idea to yourself since there's a 10 to 1
chance that somebody already has it and will share it before you.
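P.S. A minimal sketch of the wget usage described above, using wget's
standard mirroring flags (http://example.com/ stands in for the actual
starting URL, so adjust it to your site):

```shell
# Recursively download a site starting from a URL, fetching the images
# and stylesheets each page needs and rewriting links so the local copy
# is browsable offline; --no-parent keeps wget from climbing above the
# starting directory:
wget --mirror --convert-links --page-requisites --no-parent \
     http://example.com/

# curl, by contrast, fetches a single URL at a time, saving it under
# its remote file name:
curl -O http://example.com/index.html
```

The LWP distribution's equivalent single-page client is the `GET` (or
`lwp-request`) command it installs alongside the library.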