Hello Christian,

caching? There is nothing to cache. And even if we did, we would have to
make every string an object, since we would need to invalidate the position
cache on write operations. Also, I agree with the others that the most common
usage is accessing a few chars and probably changing them.
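
Just so we are talking about the same thing: below is a rough userland
sketch of the kind of position cache Chris proposes (quoted further down).
It is purely illustrative, assumes plain UTF-8 storage, and the class and
method names are made up; it is of course not how the engine would do it.

<?php
// Hypothetical sketch: cache the last (character index, byte offset) pair
// so mostly-sequential reads do not re-scan the string from the start.
class CachedOffsetString
{
    private $bytes;             // assumed well-formed UTF-8
    private $cachedCharIndex = 0;
    private $cachedByteOffset = 0;

    public function __construct($utf8)
    {
        $this->bytes = $utf8;
    }

    public function charAt($index)
    {
        $offset = $this->byteOffsetOf($index);
        return substr($this->bytes, $offset, $this->charLenAt($offset));
    }

    public function setCharAt($index, $char)
    {
        $offset = $this->byteOffsetOf($index);
        $this->bytes = substr($this->bytes, 0, $offset) . $char
                     . substr($this->bytes, $offset + $this->charLenAt($offset));
        // A write may change byte lengths, so the cached position has to be
        // thrown away -- that is the invalidation cost mentioned above.
        $this->cachedCharIndex = 0;
        $this->cachedByteOffset = 0;
    }

    // Byte offset of character $index: walk forward from the cached
    // position when possible, restart from 0 when stepping backwards.
    private function byteOffsetOf($index)
    {
        if ($index < $this->cachedCharIndex) {
            $this->cachedCharIndex = 0;
            $this->cachedByteOffset = 0;
        }
        $offset = $this->cachedByteOffset;
        for ($i = $this->cachedCharIndex; $i < $index; $i++) {
            $offset += $this->charLenAt($offset);
        }
        $this->cachedCharIndex = $index;
        $this->cachedByteOffset = $offset;
        return $offset;
    }

    // Byte length of the UTF-8 character starting at this byte offset.
    private function charLenAt($byteOffset)
    {
        $lead = ord($this->bytes[$byteOffset]);
        if ($lead < 0x80) return 1;
        if ($lead < 0xE0) return 2;
        if ($lead < 0xF0) return 3;
        return 4;
    }
}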

And *I* never had code where I used the same position twice, apart from the
all-time favorite: searching for backslashes and forward slashes. But that
is done better with the right search functions anyway.

Also, looking for backslashes and changing them to forward slashes can be
done with iterators. Checking whether the second char is a ':' (a common
use case under Windows) is then best done with [], but that is a one-time
read access.
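
To be concrete (the iterator API is still under discussion, so a plain
search function stands in for it here, which is the better tool for this
job anyway; the path is just an assumed example):

<?php
// Replacing backslashes is a job for a search/replace function,
// not for walking the string by index:
$path = str_replace('\\', '/', 'C:\\php\\unicode.txt');

// Checking for a drive letter is a single read, so [] is fine here:
if (strlen($path) > 1 && $path[1] === ':') {
    echo "absolute Windows path\n";
}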

The only place left where I see position caching having an optimization
effect is sequential scanning. But for that, iterators and functions are
much better anyway.
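
For illustration, compare the two scanning patterns (mbstring and PCRE are
used here only as a stand-in for the unicode string API, assuming UTF-8
input and the mbstring extension):

<?php
$text = "Grüße"; // assumed UTF-8 sample input

// Per-index access: every read has to locate the byte offset for $i again;
// this is the pattern a position cache would try to speed up.
$len = mb_strlen($text, 'UTF-8');
for ($i = 0; $i < $len; $i++) {
    echo mb_substr($text, $i, 1, 'UTF-8'), "\n";
}

// Scanning in one pass instead: no per-index offset lookups at all.
foreach (preg_split('//u', $text, -1, PREG_SPLIT_NO_EMPTY) as $ch) {
    echo $ch, "\n";
}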

So I am convinced that the cache would only bloat the code, make everything
much more complex, and in the end slow PHP down.

best regards
marcus

Friday, February 3, 2006, 2:19:27 AM, you wrote:

> Andrei Zmievski wrote:
>> I am not sure how we can optimize [] to be faster than the iterator 
>> approach. Food for thought?

> You could cache the last position (PHP- and Unicode string index) and 
> start from there. This assumes that most accesses are (more or less) 
> sequential. If you can step backward as well as forward you could use 
> the cached version for both directions but even if you can only go 
> forward it would cover the most common case I guess.

> Very simple idea but maybe it helps,
> - Chris




Best regards,
 Marcus

-- 
PHP Internals - PHP Runtime Development Mailing List
To unsubscribe, visit: http://www.php.net/unsub.php
