On May 28, Craig Small ([EMAIL PROTECTED]) wrote:
> I think it is to do with robots.txt
> Try
>   wget -r -l 1 http://bugs.debian.org/cgi-bin/[EMAIL PROTECTED]
>
> It nearly does what you want.
On May 28, Bastian Kleineidam ([EMAIL PROTECTED]) wrote:
> On Wed, May 28, 2003 at 06:49:50AM -0400, Neil Roeth wrote:
> > I'd like to download the web page of bugs by maintainer,
> > http://bugs.debian.org/cgi-bin/[EMAIL PROTECTED], and all
> > the bug reports linked to on that page, so that I can refer to them offline.
> > But, wget doesn't work,
> What's the error message? At least it works for me:
> # wget "http://bugs.debian.org/cgi-bin/[EMAIL PROTECTED]"
> --13:53:00--  http://bugs.debian.org/cgi-bin/[EMAIL PROTECTED]
>            => `[EMAIL PROTECTED]'

Thanks for the hints.  I should have been more clear: I have no problem
getting the main page, i.e., [EMAIL PROTECTED]  There are links on that
page to bugs.debian.org/cgi-bin/bugreport.cgi?bug=<num> for each bug, and
I want to get each of those as a local web page, too.  That is the part
that seems to require more than a simple wget command.
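Perhaps combining the two hints will do it.  An untested sketch, with
-e robots=off to work around the robots.txt block Craig mentioned and -k
to rewrite the links in the saved page so they point at the local copies:

    wget -r -l 1 -k -e robots=off \
        "http://bugs.debian.org/cgi-bin/[EMAIL PROTECTED]"

The -r -l 1 part tells wget to follow the links on the page one level
deep, which should pull in each bugreport.cgi?bug=<num> page; the bug
links stay on bugs.debian.org, so no host spanning (-H) should be needed.

--
Neil Roeth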