--- Merlin <[EMAIL PROTECTED]> wrote:
> So they are saving cpu and db_traffic by writing html pages out?

Pretty much, yeah. I don't think the pages are completely static HTML either, but
the majority of the logic is done ahead of time, in batches.
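
To make that concrete, here's a rough sketch in PHP of what "writing the pages
out in batches" can look like (the function name and paths are just invented for
the example): a script that cron or a daemon runs, which does the expensive work
once and drops the result on disk as a plain .html file the web server can serve
directly.

    <?php
    // build_front_page.php - run from cron or a daemon, NOT on every request.
    // build_front_page_html() stands in for whatever expensive queries and
    // templating you would otherwise do on each hit.

    function build_front_page_html()
    {
        // ... run your queries, fill your templates ...
        return "<html><body><h1>Front page</h1></body></html>";
    }

    $html = build_front_page_html();

    // Write to a temp file first, then rename, so visitors never see a
    // half-written page.
    $tmp = '/var/www/html/index.html.tmp';
    file_put_contents($tmp, $html);
    rename($tmp, '/var/www/html/index.html');
    ?>

After that, ordinary visitors just get index.html as a static file and neither
PHP nor the database is touched for them at all.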

It was basically a common-sense approach taken a long time ago, when the
following was considered:

1. We generate content for every single user.
2. Many users receive pages where 99% of the content is identical.

> How are they making sure that they are up to date?

It's part of the configuration. If you download their source code and use it on
your own site, you will probably use a different configuration than they use
(unless your site is very busy).

If memory serves correctly, there are Perl daemons constantly running that do
the generating. If you can tolerate intervals of a minute or so between
refreshes, you could just use cron and some scripts (shell, PHP, or whatever),
which would be simpler to implement. A lot of PHP developers also do some sort
of server-side caching within their own applications, for whatever pieces of
data rarely change but would otherwise be regenerated over and over.
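
As a rough sketch of that kind of in-application caching (the cache path and
load_sidebar_data() are invented for the example), you stash the generated
fragment in a file and only rebuild it when the file is older than some
interval:

    <?php
    // cached_sidebar.php - hypothetical fragment cache with a 5-minute TTL.

    $cache_file = '/tmp/sidebar_cache.html';
    $ttl        = 300; // seconds

    if (file_exists($cache_file) && (time() - filemtime($cache_file)) < $ttl) {
        // Fresh enough - serve the cached copy straight from disk.
        readfile($cache_file);
    } else {
        // Stale or missing - rebuild, save, and serve.
        $html = load_sidebar_data();   // your expensive queries/templating
        file_put_contents($cache_file, $html);
        echo $html;
    }

    function load_sidebar_data()
    {
        // ... whatever rarely-changing content you keep regenerating ...
        return "<div>sidebar goes here</div>";
    }
    ?>

Under load, most requests hit the cached branch and never touch the database.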

> Do u know any good tutorials on that?

No, but I'm sure someone has written some if you search.

Hope that helps.

Chris

=====
Become a better Web developer with the HTTP Developer's Handbook
http://httphandbook.org/
