Hello! I'm new on the list. I'm Sven Lauritzen from Hamburg/Germany.
On Fri, 2003-02-07 at 07:45, sean finney wrote:
> i'm packaging sugarplum, an email harvester honeypot basically.  in
> order to not trap legitimate web-spiders, i thought it'd be good to
> make the install of a robots.txt[1] in /var/www happen by default if
> possible, only i'm not sure i can/ought to really do that.

Maybe you can use the robots meta tag instead:

<meta name="robots" content="noindex">

Regards,
Sven (I'm neither a DD nor an applicant).

-- 
Sven Lauritzen
----------------------------------------------------------------
pub 1024D/95C9A892    sub 1024g/D30E490F
Fp 2FA9 FC9B 078C 5BC7 87DC 0B0D 2329 94F6 95C9 A892
----------------------------------------------------------------
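PS: For comparison, the robots.txt route sean describes would mean shipping
a site-wide file in /var/www along these lines (the Disallow path below is
only a guess on my part; I don't know where sugarplum actually puts its
pages):

    User-agent: *
    Disallow: /sugarplum/

The meta tag works per page, so the package would not have to edit a shared
/var/www/robots.txt that the admin may already maintain.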