Hi all,
Does anybody know a way to make a distinction between robots and users? Should I use the user agent, or is that not a safe method? If the visitor is a spider/robot, I want to include some script containing extra URLs for the robot.

Regards,
Wilbert

-------------------------
Pas de Deux
Van Mierisstraat 25
2526 NM Den Haag
tel 070 4450855
fax 070 4450852
http://www.pdd.nl
[EMAIL PROTECTED]
-------------------------
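P.S. To show the kind of thing I have in mind for the user-agent idea: a rough sketch in Python rather than whatever the site actually runs. The bot signatures below are just examples, and the User-Agent header can of course be faked, so this is only a best-effort check.

# Minimal sketch of user-agent sniffing in a WSGI-style handler.
# The signature list is illustrative only; real crawlers vary and
# the header can be spoofed, so this is not a watertight test.

KNOWN_BOT_SIGNATURES = (
    "googlebot",
    "bingbot",
    "slurp",        # Yahoo's crawler
    "baiduspider",
    "crawler",
    "spider",
)

def looks_like_robot(user_agent: str) -> bool:
    """Return True if the User-Agent string matches a known crawler signature."""
    ua = (user_agent or "").lower()
    return any(signature in ua for signature in KNOWN_BOT_SIGNATURES)

def app(environ, start_response):
    # Only emit the extra links when the visitor looks like a crawler.
    ua = environ.get("HTTP_USER_AGENT", "")
    body = "<html><body>Normal page content"
    if looks_like_robot(ua):
        body += '<a href="/extra-url-1">extra</a> <a href="/extra-url-2">extra</a>'
    body += "</body></html>"
    start_response("200 OK", [("Content-Type", "text/html")])
    return [body.encode("utf-8")]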