IMHO it depends on the number of operations you have to perform on your
rows. If manipulating 2,600 rows takes several seconds, then either
you're performing some really complicated data manipulation or your
database could use some optimization. I have an application that
manipulates ~150,000 rows on a Pentium II 350 MHz, and it takes at most
half a second (although I'm doing really simple statistical stuff).
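
If you want to see where the time goes, running EXPLAIN on your slow
query usually tells you whether MySQL is using an index or scanning the
whole table. Here's a minimal sketch using the mysql_* extension; the
table name 'entries', the 'created' column, and the connection details
are just placeholders for whatever your schema actually looks like:

<?php
// Minimal sketch -- 'entries' and 'created' are placeholder names,
// substitute your own table and column.
$link = mysql_connect('localhost', 'user', 'password');
mysql_select_db('mydb', $link);

// EXPLAIN shows whether MySQL can use an index or must scan every row.
$res = mysql_query("EXPLAIN SELECT * FROM entries WHERE created >= '2002-01-01'", $link);
while ($row = mysql_fetch_assoc($res)) {
    // type = ALL with an empty key means a full table scan,
    // which gets painful around 160,000 rows.
    echo $row['type'] . ' / key: ' . $row['key'] . "\n";
}

// An index on the filtered column usually turns the scan into a lookup.
mysql_query("ALTER TABLE entries ADD INDEX idx_created (created)", $link);
?>

That's only a sketch, of course, but the point is that the row count
itself rarely matters as much as whether your queries can use an index.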

Cheers,


Marco
-- 
------------
php|architect - The Magazine for PHP Professionals
The monthly magazine dedicated to the world of PHP programming

Check us out on the web at http://www.phparch.com!
--- Begin Message ---
I'm working on a PHP/MySQL app which looks as though it will be dealing
with an average of

160,000 rows of short text entries =
32 MB of drive space

Anybody have ideas about what limits, if any, I might hit? And how
might I determine those limits now, when we have only 2 weeks of data
(0.6 MB and 2,600 rows) out of what will eventually be 2 years' worth?
I have full access to my Linux server. I notice that when executing one
set of statements for one page, CPU usage is now in the 80 to 90%
range, and it works for maybe 3 or 4 seconds to return the queries. I
am at present doing this testing on a 550 MHz processor, but will be
running the app on a dual 500 MHz machine.

Am I getting totally out of hand in thinking I can do this?

TIA
-- 
John Hinton - Goshen, VA.
http://www.ew3d.com

Those who dance are considered insane 
by those who can't hear the music....


--- End Message ---
-- 
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php
