On 11 May 2007, at 17:57, Lionel MARTIN wrote:
Lionel MARTIN wrote:

- Don't load large amounts of data into scalars.
Fine. Now I know why. But sometimes you don't have a choice.

I'd like to know what situations you encounter where you are forced to load
large amounts of data into scalars. I can't think of any.

I don't have any clear situation in mind right now, but we can imagine many. For example, a bulletin board system where you are retrieving posted messages from a DB: each message could weigh several dozen kilobytes (especially if you store HTML formatting in the DB).

Yeah, but presumably you'd store those in a data structure of some sort rather than a bare scalar. Any non-trivial data structure (for example, an object) will mean you're automatically using references rather than storing the data directly in lexical (or global) scalars.
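To make the point concrete, here is a minimal Perl sketch (the `Message` class and its methods are hypothetical, not from any project discussed in this thread): the large body lives behind a reference inside an object, so nothing is copied around, and the buffer can be released explicitly as soon as it has been served.

```perl
#!/usr/bin/perl
use strict;
use warnings;

package Message;

sub new {
    my ($class, $body_ref) = @_;
    # Store a reference to the body, not a copy of it.
    return bless { body => $body_ref }, $class;
}

sub body_ref { $_[0]->{body} }

sub release {
    # Free the buffer as soon as we are done with it.
    undef ${ $_[0]->{body} };
}

package main;

my $html = '<p>' . ('x' x 1000) . '</p>';   # stand-in for a large DB row
my $msg  = Message->new(\$html);            # no copy is made
printf "body length: %d\n", length ${ $msg->body_ref };
$msg->release;                              # $html is now undef; the object shell remains
```

The design choice is the one Andy alludes to: once the data sits behind a reference, passing the object around never duplicates the kilobytes, and one `undef` through the reference frees them everywhere at once.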

-another example that comes to mind is a project (implemented in PHP) I worked on. Part of the process was retrieving cached HTML template pages from the server and doing BB-tag parsing before serving them to the client. Of course, you could tell me to retrieve the data chunk by chunk, but that is not as straightforward, since some BB tags can span several lines; you would need an algorithm to make sure you never cut in the middle of a tag. In that kind of situation, as long as the file to be retrieved doesn't grow too large, I would rather retrieve the whole file at once, do the processing, serve it to the client, and then undef the buffer. Far easier than working chunk by chunk.

Same thing: you'd presumably wrap that data in an object.
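As a concrete illustration of the slurp/process/release approach Lionel describes, here is a minimal Perl sketch (the file name and the `[b]...[/b]` substitution are made-up examples; a real BB-code parser would handle many more tags):

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Read the whole template at once so multi-line BB tags are never split,
# transform it, and return a reference so the caller gets no extra copy.
sub render_template {
    my ($path) = @_;

    my $buf = do {
        local $/;                                  # slurp mode: read whole file
        open my $fh, '<', $path or die "open $path: $!";
        <$fh>;
    };

    # Hypothetical BB-tag substitution; /s lets [b]...[/b] span lines.
    $buf =~ s{\[b\](.*?)\[/b\]}{<b>$1</b>}gs;

    return \$buf;
}

# Typical use: serve the page, then release the buffer explicitly.
# my $page_ref = render_template('cached/page.tpl');
# print ${$page_ref};
# undef ${$page_ref};
```

Slurping with `local $/` sidesteps the split-tag problem entirely, and because the sub hands back a reference, the caller's `undef` releases the only copy of the buffer.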

--
Andy Armstrong, hexten.net
