On Tuesday 25 November 2008 19:21:46 Paul Wise wrote:
> On Tue, Nov 25, 2008 at 7:18 AM, Jonathan Wiltshire
>
> <[EMAIL PROTECTED]> wrote:
> > Do you think it is severe enough to be put forward for lenny?
>
> I'm not sure. The patch looks straightforward enough, so perhaps the
> release team would approve it.
On Tue, Nov 25, 2008 at 7:18 AM, Jonathan Wiltshire
<[EMAIL PROTECTED]> wrote:
> Do you think it is severe enough to be put forward for lenny?
I'm not sure. The patch looks straightforward enough, so perhaps the
release team would approve it.
PS: it is a good idea to document in your patches wh
On Mon, Nov 24, 2008 at 08:07:45PM +0900, Paul Wise wrote:
> http://www.debian.org/doc/developers-reference/pkgs.html#t-p-u [1]
> http://lists.debian.org/debian-devel-announce/2008/09/msg0.html [2]
Ok, I understand the theory according to [1]. But [2] states that bugs
should be at severity cri
On Mon, Nov 24, 2008 at 8:00 PM, Jonathan Wiltshire
<[EMAIL PROTECTED]> wrote:
> Can you advise further or point me to somewhere to find out more? I've
> never used testing-proposed-updates, but it would be a good thing to
> learn about.
http://www.debian.org/doc/developers-reference/pkgs.html#t-p-u
On Mon, Nov 24, 2008 at 06:13:23PM +0900, Paul Wise wrote:
> Sounds fine. Be sure to prepare an upload for testing-proposed-updates
> if you want to add the patch to lenny and the release team are likely
> to approve the patch, since your package is not in sync between lenny
> & sid.
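For context, what routes an upload to testing-proposed-updates is the distribution field on the first line of debian/changelog. A minimal sketch of such an entry follows; the version number, bug reference, and date here are hypothetical placeholders, not taken from the actual upload:

```
gxemul (0.4.6-2+lenny1) testing-proposed-updates; urgency=low

  * Apply upstream CVS fix for segfault on command-line
    parameters. (Version and date below are illustrative only.)

 -- Jonathan Wiltshire <email>  Mon, 24 Nov 2008 20:07:45 +0000
```

The release team still has to approve the upload before it migrates into lenny.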
Can you advise further or point me to somewhere to find out more? I've
never used testing-proposed-updates, but it would be a good thing to
learn about.
On Monday 24 November 2008 10:13:23 Paul Wise wrote:
> On Mon, Nov 24, 2008 at 5:59 PM, Jonathan Wiltshire
>
> <[EMAIL PROTECTED]> wrote:
> > My adopted package gxemul has a bug in Ubuntu[1] (seg fault on command
> > line parameters) which reproduces in the current Debian package. It has
> > been fixed recently in upstream's CVS so I plan to add a patch for the
On Mon, Nov 24, 2008 at 5:59 PM, Jonathan Wiltshire
<[EMAIL PROTECTED]> wrote:
> My adopted package gxemul has a bug in Ubuntu[1] (seg fault on command
> line parameters) which reproduces in the current Debian package. It has
> been fixed recently in upstream's CVS so I plan to add a patch for the
On May 30, Joey Hess ([EMAIL PROTECTED]) wrote:
> Neil Roeth wrote:
> > Thanks for the hints. I should have been more clear - I have no problem
> > getting the main page, i.e., [EMAIL PROTECTED] There are
> > links in that page to bugs.debian.org/cgi-bin/bugreport.cgi?bug= for each
> > bug, and I want to get each of those as a local web page, too. That
Neil Roeth wrote:
> Thanks for the hints. I should have been more clear - I have no problem
> getting the main page, i.e., [EMAIL PROTECTED] There are
> links in that page to bugs.debian.org/cgi-bin/bugreport.cgi?bug= for each
> bug, and I want to get each of those as a local web page, too. That
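The second step Neil describes, pulling the per-bug links out of a saved maintainer page so each report can be fetched individually, can be sketched like this; the HTML sample is a stand-in for real pkgreport.cgi output, and the bug numbers are invented for illustration:

```shell
# Stand-in for a saved copy of the maintainer's bug page.
sample='<ul>
<li><a href="bugs.debian.org/cgi-bin/bugreport.cgi?bug=123456">#123456</a></li>
<li><a href="bugs.debian.org/cgi-bin/bugreport.cgi?bug=234567">#234567</a></li>
</ul>'

# grep -o prints only the matching part, one bugreport.cgi link per line;
# in a basic regular expression the "?" is a literal character.
links=$(printf '%s\n' "$sample" | grep -o 'bugreport\.cgi?bug=[0-9]*')
printf '%s\n' "$links"
```

Each extracted link could then be handed to wget (or fetched in a loop) to save the individual reports locally.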
On May 28, Craig Small ([EMAIL PROTECTED]) wrote:
> I think it is to do with robots.txt
> Try
> wget -r -l 1
> http://bugs.debian.org/cgi-bin/[EMAIL PROTECTED]
>
> It nearly does what you want.
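Craig's hint can be combined with wget's robots override: bugs.debian.org publishes a robots.txt that blocks recursive fetches, and `-e robots=off` tells wget to ignore it. A sketch, using a placeholder maintainer address and printing the command rather than running it so it works without network access:

```shell
# Hypothetical maintainer address; substitute your own.
MAINT="someone@example.org"
URL="http://bugs.debian.org/cgi-bin/pkgreport.cgi?maint=${MAINT}"

# -e robots=off  ignore robots.txt (what blocks the recursive fetch)
# -r -l 1        recurse one level, i.e. follow the per-bug links
# -k             rewrite links to point at the local copies
CMD="wget -e robots=off -r -l 1 -k ${URL}"

# Dry run: show the command instead of executing it.
echo "$CMD"
```

Dropping the `echo` and running the command directly would mirror the maintainer page plus every bug report it links to.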
On May 28, Bastian Kleineidam ([EMAIL PROTECTED]) wrote:
> On Wed, May 28, 2003 at 06:49:50AM -0400, Neil Roeth wrote:
On May 28, Xavier Roche ([EMAIL PROTECTED]) wrote:
> Hi,
>
> On Wed, May 28, 2003 at 06:49:50AM -0400, Neil Roeth wrote:
> > I'd like to download the web page of bugs by maintainer,
> > http://bugs.debian.org/cgi-bin/[EMAIL PROTECTED], and all
> > the bug reports linked to on that page, so that I can refer to them offline.
Neil,
On Wed, May 28, 2003 at 06:49:50AM -0400, Neil Roeth wrote:
> I'd like to download the web page of bugs by maintainer,
> http://bugs.debian.org/cgi-bin/[EMAIL PROTECTED], and all
> the bug reports linked to on that page, so that I can refer to them offline.
On Wed, May 28, 2003 at 06:49:50AM -0400, Neil Roeth wrote:
> I'd like to download the web page of bugs by maintainer,
> http://bugs.debian.org/cgi-bin/[EMAIL PROTECTED], and all
> the bug reports linked to on that page, so that I can refer to them offline.
> But, wget doesn't work, I think because
Hi,
On Wed, May 28, 2003 at 06:49:50AM -0400, Neil Roeth wrote:
> I'd like to download the web page of bugs by maintainer,
> http://bugs.debian.org/cgi-bin/[EMAIL PROTECTED], and all
> the bug reports linked to on that page, so that I can refer to them offline.
> But, wget doesn't work, I think because