>>>>> On Sat, 17 Jun 2000 17:45:24 -0500, Elaine -HFB- Ashton <[EMAIL PROTECTED]> said:

 > No, but not bad performance for an U1/140 since most of the traffic
 > involves database queries instead of plain html. I've seen sites that do
 > horrible with less traffic. My point was that it should scale reasonably
 > well without too much trouble and be fairly easy to manage for mirrors.

My favorite answer to that is to

- write HTTP headers optimized for caching (I wrote a chapter about
  how to do that for Stas Bekman's guide to mod_perl); a rough sketch
  follows below this list

- install a squid accelerator in front of the webserver; see the
  config fragment below. The squid can run on the same box and is
  very easy to configure and manage. (It is very easy to misconfigure
  too, so better let me have a look at the configuration some time :)

The order in which you tackle the two measures doesn't matter; each
makes some sense on its own, but combined they are very powerful.
Once you've done both, you will most definitely never again think
about restrictions like the ones discussed below, and you won't care
about most robots and gatherers. You are not invulnerable with this
setup, but only malicious people can hurt you, not stupid robots.

 > *>Yes it probably will. If it really becomes a problem we can put some kind
 > *>of dynamic restriction on it.

 > Not a bad idea though it could either be implemented at the web server
 > level or the database, the latter is probably more reliable and more
 > portable. 

-- 
andreas
