There's one really important thing missing in PHP, as I see it: the ability to keep variables in memory for as long as the programmer chooses. If this were possible, some truly great optimizations could be done. Some things are very slow to create but very fast to work with. I wrote an XML class a couple of days ago, and while it's extremely quick to search and work with, it's sadly rendered pretty much useless, since building the tree it uses is too slow to repeat on every request.
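The closest workaround I know of today, assuming the structure survives serialize(), is to cache the built tree on disk so the expensive build only happens once (a rough sketch; the function names and the trivial stand-in parser are made up):

```php
<?php
// Hypothetical sketch: cache an expensive-to-build structure between
// requests with serialize()/unserialize(). Names are made up.

function get_tree($xmlFile, $cacheFile)
{
    // Fast path: reuse the cached copy if it is at least as new
    // as the source document.
    if (file_exists($cacheFile) &&
        filemtime($cacheFile) >= filemtime($xmlFile)) {
        return unserialize(file_get_contents($cacheFile));
    }

    // Slow path: build the tree (stand-in for the real XML parser).
    $tree = build_tree(file_get_contents($xmlFile));

    // Persist it for the next request.
    file_put_contents($cacheFile, serialize($tree));
    return $tree;
}

// Stand-in for the real parser: here it just splits the text into lines.
function build_tree($xml)
{
    return explode("\n", $xml);
}
```

Unserializing still costs something, but far less than a full parse -- it's only a stopgap compared to real persistent memory.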
I've heard there's a feature like this in ColdFusion, which every ColdFusion user seems to regard as the holy grail, and I'd have to agree with them. One thing I've heard it used for is loading an entire database into system memory. I don't know exactly how it works, but imagine having the whole database in RAM: when you change data, you update it both in memory and on disk, but when you select (which is what you mostly do), you query only the in-memory mirror.

So how cost-effective could this be? 1 GB of system memory is pretty much the minimum on a decent server today. Assuming the site generates approximately one million bytes of new data every day (storing images and other massive data in the tables would probably not be appropriate), the site could run for about a thousand days before the mirror outgrew that gigabyte. And if you only keep tables that get queried a lot but don't get altered often, you could most likely come up with a great compromise. I can't say for sure how much faster things would be, but I'm guessing at several thousand percent; I might be way off, though.

The only drawback I can see is that there might be multi-threading issues, so if this were implemented, a new keyword would probably have to be introduced to mark data as mutexed, or perhaps the other way around, to avoid too many people scratching their heads.

/Sebastian Karlsson

-- 
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php