On 07/28/2012 12:37 AM, Lester Caine wrote:
> Rasmus Lerdorf wrote:
>>>> I don't think this generator question is any different. We need to
>>>> explain generators in the simplest way possible. The simplest way to
>>>> explain generators is to not even worry about them being generators at
>>>> all. Simply say that functions can now return arrays one element at a
>>>> time using the new yield keyword. That's all.
>>>
>>> It's this 'concept' that I am having trouble seeing in the general
>>> process that is required using PHP to generate web pages. At the end of
>>> the day I have to generate the finished page or sub-page so I need all
>>> the results anyway.
>> Sure, but that doesn't mean it all has to be in memory at the same time.
>> You can read a large file line by line, process each line and output the
>> result before you move on to the next one.
> 
> Exactly ... when uploading the NLPG csv files I process them a line at a
> time and store them in the database. I always have ... which is why I
> don't recognise the initial 'complaint' that justified adding this. These
> files can be 100MB+ so there is no way one would process them by reading
> the whole lot in as the 'example' did. You just call the function that
> processes the particular type of line, which is identified by its first
> two characters ... after reading the line.

Great, so you recognize the use-case.  Now you just need to take the next
step: instead of having the iterating part call out to the processors on
each iteration, it is sometimes convenient to flip that around and make it
processor-centric, letting each processor ask the iterator for the next
element without duplicating that iteration code in every processor.
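
As a rough sketch of what I mean (the function names, file layout and type
codes here are made up purely for illustration):

// The generator yields one line at a time, so the whole file never
// has to be in memory at once.
function read_lines($filename) {
    $fp = fopen($filename, 'r');
    if ($fp === false) {
        return;
    }
    while (($line = fgets($fp)) !== false) {
        yield rtrim($line, "\r\n");
    }
    fclose($fp);
}

// The processor drives the loop itself, dispatching on the first two
// characters of each line (the type codes are placeholders).
function import_file($filename) {
    foreach (read_lines($filename) as $line) {
        switch (substr($line, 0, 2)) {
            case '01':
                // handle one record type, e.g. insert into the db
                break;
            case '02':
                // handle another record type
                break;
        }
    }
}

The iteration logic lives in one place, and every processor just
foreach-es over it.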

-Rasmus


-- 
PHP Internals - PHP Runtime Development Mailing List
To unsubscribe, visit: http://www.php.net/unsub.php