Hello,
I've been trying to package DJGPP, which is a libc and a few other
tools for making DOS binaries with an i*86-msdosdjgpp-targeted GCC (see
the recent ITP on -devel). I'd like to see it sponsored into Debian when
it's done.
I've already packaged a few private things, but DJGPP is just no suc
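A DJGPP cross toolchain is driven like a native GCC, just through
target-prefixed tool names. A minimal sketch, assuming the conventional
i586-pc-msdosdjgpp- prefix (the actual triplet depends on how the
packages end up being built):

  # cross-compile a DOS executable with the DJGPP-targeted GCC
  # (i586-pc-msdosdjgpp- is an assumed prefix, not confirmed by the ITP)
  i586-pc-msdosdjgpp-gcc -O2 -o hello.exe hello.c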
Hello!
I'm new to the list. I'm Sven Lauritzen from Hamburg, Germany.
On Fri, 2003-02-07 at 07:45, sean finney wrote:
> i'm packaging sugarplum, basically an email-harvester honeypot. in
> order not to trap legitimate web spiders, i thought it'd be good to
> make the install of a robots.txt[1] in
Hi,
I am having a problem using debconf in my postinst:
#! /bin/sh
set -e
. /usr/share/debconf/confmodule
case "$1" in
configure)
db_get qmail-scanner/admin
ADMIN="$RET"
db_get qmail-scanner/domain
DOMAIN="$RET"
db_get qmail-scanner/notify
NOT
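The snippet cuts off before the actual error, but a common pitfall with
exactly this pattern is that db_get exits non-zero when a question is
missing or was never asked, and under set -e that kills the whole
postinst. A hedged sketch of the usual workaround, reusing the question
names above:

  # db_get can return a non-zero status (e.g. unregistered question);
  # "|| true" keeps set -e from aborting the script on that
  db_get qmail-scanner/admin || true
  ADMIN="$RET"

If the value is mandatory, test "$RET" afterwards instead of relying on
db_get's exit status.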
heya mentors,
(description follows)
all done... wrt the previous thread, i ended up deciding that it would
be a waste to put in the debconf warning message about robots.txt, since
it's already present in the docs as well as warnings in the config files.
so... would someone be willing to sponsor t
On Fri, Feb 07, 2003 at 05:33:20PM +0100, Ola Lundqvist wrote:
> Well, if the docs are already there, things are quite set already.
> Well, low is a good priority for such things, I think. Maybe a little
> higher if you think it is more important.
okay, i think low is acceptable for this. now that i
Hi
On Fri, Feb 07, 2003 at 11:10:03AM -0500, sean finney wrote:
> On Fri, Feb 07, 2003 at 03:25:14PM +0100, Ola Lundqvist wrote:
> > 1) Document that robots.txt should be copied to the proper place
> > in the README.Debian file.
> > 2) Tell the user to do that in a debconf box, or even to ask f
On Fri, Feb 07, 2003 at 03:25:14PM +0100, Ola Lundqvist wrote:
> 1) Document that robots.txt should be copied to the proper place
> in the README.Debian file.
> 2) Tell the user to do that in a debconf box, or even to ask for
> where to install it.
the /usr/share/doc docs are already there,
Hello
I have a simple solution (or two actually).
1) Document that robots.txt should be copied to the proper place
in the README.Debian file.
2) Tell the user to do that in a debconf box, or even to ask for
where to install it.
I think it is a really bad idea to install it in /var/www. Fir
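Option 2 would normally be a debconf template plus a db_input call in
the package's config script; a minimal sketch, with
sugarplum/robots_path as a hypothetical question name:

A template in debian/templates:

  Template: sugarplum/robots_path
  Type: string
  Default: /var/www
  Description: Directory in which to install robots.txt:
   Legitimate web spiders honour robots.txt; pick the document root
   of the site that will serve the honeypot.

and the corresponding prompt in debian/config:

  #!/bin/sh
  set -e
  . /usr/share/debconf/confmodule
  # ask at medium priority; "|| true" so set -e survives a skipped question
  db_input medium sugarplum/robots_path || true
  db_go || true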
In reply to Sven Luther <[EMAIL PROTECTED]>:
> On Thu, Feb 06, 2003 at 10:07:03PM +0100, Jérôme Marant wrote:
> >
> > Hi,
> >
> > Non-free packages don't seem to be autobuilt.
> > Do I have to contact every autobuilder maintainer
> > in order to get a non-free package built on every
> >
On Thu, Feb 06, 2003 at 10:07:03PM +0100, Jérôme Marant wrote:
>
> Hi,
>
> Non-free packages don't seem to be autobuilt.
> Do I have to contact every autobuilder maintainer
> in order to get a non-free package built on every
> architecture?
You could also log in to one of the numerous debia
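Since the non-free section is not autobuilt, the usual fallback is to
build the package by hand on a machine of each architecture and upload
the resulting binaries; a rough sketch (foo and its version are
placeholders):

  # on a box of the target architecture:
  apt-get source foo             # fetch the package's source
  cd foo-1.0
  dpkg-buildpackage -B -uc -us   # arch-dependent binaries only, unsigned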
heya mentors,
i'm packaging sugarplum, basically an email-harvester honeypot. in
order not to trap legitimate web spiders, i thought it'd be good to
make the install of a robots.txt[1] in /var/www happen by default if
possible, only i'm not sure i can/ought to really do that.
if i made it a conf
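A robots.txt for this only needs to disallow the honeypot's URL so that
compliant crawlers skip it; a minimal sketch (the /sugarplum/ path is an
assumption, not necessarily where the package puts its CGI):

  # /var/www/robots.txt -- keep well-behaved spiders out of the honeypot
  User-agent: *
  Disallow: /sugarplum/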