Yes, compression is key. Faster servers with HUGE internet connections serving COMPRESSED content make sites feel snappy. Look under microtime() in the PHP manual for an example that will benchmark how much time your page actually takes to produce... you'll probably find it incredibly small, like 0.02 seconds or so if it's a decent shared host like mine. The rest is compression (reduces data transfer), page complexity (simpler HTML renders faster, and fewer images make the page load faster), and internet latency/bandwidth (fast, fat pipes are way better than a site hosted off DSL - latency in packet delivery alone can cost half a second in some situations).
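The manual's example boils down to something like this (a rough sketch; the helper name and where you print the result are just illustrative):

<?php
// Helper along the lines of the example on the microtime() manual page.
function getmicrotime() {
    list($usec, $sec) = explode(" ", microtime());
    return ((float)$usec + (float)$sec);
}

$time_start = getmicrotime();

// ... build and output the page here ...

$time_end = getmicrotime();

// Emit the timing as an HTML comment so it doesn't disturb the layout.
printf("<!-- page generated in %.4f seconds -->", $time_end - $time_start);
?>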

So before you look into PHP accelerators, benchmark your code. It's probably not your PHP; it's more likely overly complex HTML, missing compression, and/or a slow (in bandwidth or latency) internet connection.

Even 1/4 second of script parsing feels peppy with a big, low-latency pipe, simple HTML formatting, and on-the-fly compression. Just look at Google.

-Galen

On Feb 10, 2004, at 5:46 AM, Manuel Lemos wrote:

Hello,

On 02/09/2004 11:36 PM, Merlin wrote:
I do have a performance question regarding PHP processing. There are some sites on the net, like Google, Heise and others, which are incredibly fast in response time. Of course companies like Google run totally different systems, but there are smaller sites and companies delivering pages almost as fast.
Now I am wondering how this is possible with a LAMP application. My site is not slow, but it is not fast either. I was reading about PHP accelerators. Can anybody recommend such a tool? Is it complicated to install and configure? Are there any speed gains, and are there tools to measure page delivery times?

Most of what makes Google and similar sites fast does not have much to do with the actual code that generates the pages. For instance, they use distributed data centers and DNS around the world so that the servers you actually access are closer to you and start serving pages sooner.


They also use compressed pages to reduce the actual size of the information that is transferred. You can compress pages in PHP too, but in my experience it is much simpler to use a Web server module that does the job transparently, such as mod_gzip for Apache.
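If you do want to handle it from PHP rather than the server, one simple sketch uses the standard zlib output handler at the top of the script (assuming the zlib extension is available):

<?php
// Buffer all output through ob_gzhandler; it gzips the page only when the
// browser's Accept-Encoding header says it accepts compressed content.
ob_start("ob_gzhandler");
?>
<html>
<body>
<!-- normal page output goes here and gets compressed transparently -->
</body>
</html>

A server module like mod_gzip gives you the same effect without touching any scripts, which is why I find that route simpler.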

Search engines also do not use SQL databases. Usually they just look up static indexes to speed up queries.
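As a purely hypothetical illustration of what a static index lookup means (the file name and format here are made up, not how any real engine does it):

<?php
// Hypothetical: a prebuilt, serialized array maps each keyword to a list of
// document IDs, so answering a query is a file read plus an array access,
// with no SQL server involved at lookup time.
$index = unserialize(file_get_contents("keyword-index.ser"));

$keyword = "php";
$matches = isset($index[$keyword]) ? $index[$keyword] : array();

print_r($matches); // e.g. Array ( [0] => 12 [1] => 57 ... )
?>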

Sites like Google also avoid using many images. Images can make a site look nicer, but they are often irrelevant and slow down page loading.


--


Regards,
Manuel Lemos

Free ready to use OOP components written in PHP
http://www.phpclasses.org/

Metastorage - Data object relational mapping layer generator
http://www.meta-language.net/metastorage.html

--
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php