--- Luis Lebron <[EMAIL PROTECTED]> wrote:
> I am currently working on an application for a customer that may have
> a very large amount of users (10,000 or more according to the customer).

I currently design, develop, and maintain a suite of Web applications and
utilities that receive ten million hits a day, and my experience has shown
that the single most effective thing you can do is limit the number of
times you hit the database.

PHP itself, even without an accelerator, is very fast. A single Web server
can handle most people's needs, as long as the application is otherwise
designed well.

As an example of limiting database access, consider whether your
application queries the database many times for the exact same result set.
Is there a way to cache that result locally?
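
For instance, here is a minimal sketch of caching a repeated query in a
static variable, so the query runs only once per request. The function
name, the query, and the use of the old mysql extension are just
assumptions for illustration:

<?php
// Cache the result set in a static variable so repeated
// calls within one request hit the database only once.
function get_categories($db)
{
    static $cache = null;

    if ($cache === null) {
        $cache = array();
        $result = mysql_query('SELECT id, name FROM categories', $db);
        while ($row = mysql_fetch_assoc($result)) {
            $cache[] = $row;
        }
    }

    return $cache;
}
?>

For data that rarely changes, you can take the same idea further and
cache the result set in a local file or in shared memory, so it survives
across requests.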
Or perhaps you are generating statistics and need to record data for
every visitor. What if, instead, you stored such statistical data once for
every 100 visits? Assuming rand() is reasonably uniform, you can multiply
the recorded counts by 100 and still get fairly accurate statistics,
provided the data set is large (say, 1.4 million users from the US). Your
resolution diminishes by a factor of 100, so you just have to decide how
much accuracy you need.
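
A rough sketch of that sampling idea, with a hypothetical table and
column, might look like this:

<?php
// Record only 1 in every 100 visits; when reporting,
// multiply the stored counts by 100 to estimate totals.
function log_visit($db, $country)
{
    if (rand(1, 100) == 1) {
        $country = mysql_real_escape_string($country, $db);
        mysql_query("INSERT INTO visits (country) VALUES ('$country')", $db);
    }
}
?>

This turns 100 writes into 1, and the sampling error shrinks as the
counts grow.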

There are a lot of things you can do, and performance tuning your PHP
logic can be very helpful, but I have found it's nothing compared to
limiting your database access.

Hope that helps.

Chris

=====
My Blog
     http://shiflett.org/
HTTP Developer's Handbook
     http://httphandbook.org/
RAMP Training Courses
     http://www.nyphp.org/ramp
