On Nov 30, 2003, at 11:25 PM, R. Joseph Newton wrote:
drieux wrote:[..]
Pathological cases can arise within any system.
yes, but they keep allowing the primates to code... 8-)
[..]
I haven't really searched on the problem, because it hasn't arisen. Until this morning, when I cooked up that memory-grabber for the sake of this experiment, I had never seen the Perl process take more than about 4 MB of memory at any one time. Just found it, with the more-terse "perldoc -q free". Cool. Sounds like it's not just on NT that programs retain their memory allocations.
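The original 'memory-grabber' code isn't shown in this thread, but a minimal sketch of the idea (my own reconstruction, not the actual experiment) might look like this - and it demonstrates exactly what 'perldoc -q free' describes:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# A sketch of a 'memory-grabber': allocate a large array, then
# release it. Even after undef, most perls keep the freed memory
# pooled for reuse within the process rather than handing it back
# to the OS -- which is why the process size stays up.
my @grabber = (1) x 1_000_000;    # grab a visible chunk of memory
my $count   = scalar @grabber;

undef @grabber;                   # memory returns to perl's pool,
                                  # not (usually) to the OS

print "allocated $count elements, then released them\n";
```

Watching the process in top(1) before and after the undef is the instructive part; the process footprint typically does not shrink.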
Think about what happens when your Perl extension to an existing client API happens to have about 10,000 instances of objects doing, say, a couple million transactions.... you know, to get a statistically relevant delta between the original C client and the Perl and Java interfaces.
You might want to go back and review the URL on the 'unix philosophy' and remember that it is a Weltanschauung that is NOT tied to any one given OS architecture. <http://www.faqs.org/docs/artu/ch01s06.html>
To be honest, I envy your enjoyment of this problem. I still remember the day I had that 'evil Homer Simpson' moment when it Struck Me that, 'duh!', OF COURSE the code has called sbrk() - cf. man sbrk -
<rant>
"The brk and sbrk functions are historical curiosities left over from earlier days before the advent of virtual memory management."
HA! kids these days, now they insult us by calling us 'historical curiosities' in the Man Pages... Wasn't like that when I was growing up, kids had respect for their elders... Why, back in the Good Old Days, RealMen[tm] were not afraid to do their own memory management the old-fashioned way, with Cold Iron they had smelted themselves..... </rant>
sorry for the minor rant there... it's just that I dislike the 'lack of respect' in the Man Page ...
But it also points to the fact that we have both good news and useful information here...
Most of the time Perl liberates the coder from having to do many of the dull, boring, and tedious bits of keeping track of 'memory issues'. The trade-off, as you note, is the overhead involved in having all that structure around the variables in use, and with it the problem of "ref, ref, who incr'd the refcount?" and then having to chase it down, AND KILL IT....
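To make the "who incr'd the refcount" chase concrete, here is a minimal sketch (the Node class and names are mine, not from this thread) of a circular reference that refcounting alone can never free, and the Scalar::Util::weaken() escape hatch:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use Scalar::Util qw(weaken);

our $destroyed = 0;
sub Node::DESTROY { $destroyed++ }    # count reclaimed objects

{
    # two nodes pointing at each other: each holds the other's
    # refcount above zero, so neither DESTROY ever fires
    my $a = bless {}, 'Node';
    my $b = bless {}, 'Node';
    $a->{peer} = $b;
    $b->{peer} = $a;
}
print "after plain cycle:    $destroyed destroyed\n";   # 0 -- leaked

{
    my $a = bless {}, 'Node';
    my $b = bless {}, 'Node';
    $a->{peer} = $b;
    $b->{peer} = $a;
    weaken $b->{peer};    # break the cycle: this ref no longer
                          # holds $a's refcount up
}
print "after weakened cycle: $destroyed destroyed\n";   # 2 -- freed
```

The first block is the leak that starts a thread like this one; the weaken() in the second block is the usual way to KILL IT.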
[..]
Generally, we just say: "Don't do that".
and when we are being polite we say
"I wouldn't do that, IT HURTS!"
all too often because of those 'painful memories' of when WE did that 'oye, now that was DUMB!'
Folks need to remember that 'best practice' comes with the rest of the set 'not so best practice' and 'even less better practice' and 'OH that HURTS!'...
There are many 'unsolvable problems' that are always amusing to watch folks seek solutions for; two of my favorites are 'The Halting Problem' <http://en2.wikipedia.org/wiki/Halting_problem> - cf. also the Church-Turing thesis <http://plato.stanford.edu/entries/church-turing/> - and the 'how do I find the dangling pointer' problem that started this thread...
and about the best that one can offer are the reminders of 'best practice' that generically avoid most of the STOOPIDS.
[..]
[..]It definitely is something to keep in mind, but probably somewhere in the back of your mind. Perl has a lot of overhead for each variable, and each variable is solid as a tank. A very fair trade-off. Unless you are trying, as I was, to grab a big chunk of memory, it usually isn't going to happen.
Back to why 'design leads to scoping' is an Important Idea. <http://perl.plover.com/FAQs/Namespaces.html>
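A minimal illustration of the point (my own example, not from Dominus's page): 'my' confines a variable to the smallest enclosing block, so two routines can each keep their own private state instead of stepping on a shared global:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# each call to counter() returns a closure over its OWN $count --
# the scoping discipline that good design leads you to
sub counter {
    my $count = 0;                 # private to each closure
    return sub { return ++$count };
}

my $tick = counter();
my $tock = counter();
$tick->(); $tick->();
print "tick: ", $tick->(), "\n";   # 3 -- its own counter
print "tock: ", $tock->(), "\n";   # 1 -- independent state
```

Had $count been a package global, the two counters would trample each other - which is exactly the class of bug scoping designs away.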
Most 'scripters' have the luxury that their code comes and goes quickly enough that such things as remembering to close file handles and manage memory are not a relevant factor in the overall process. But that also breeds lazy coding, which in the long run will bite the coder.
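One habit that costs nothing even in a quick script (a sketch of mine, using a scratch file from File::Temp): lexical filehandles are flushed and closed automatically when they go out of scope, so scoping does the bookkeeping the lazy coder forgot:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use File::Temp qw(tempfile);

my ($tfh, $fname) = tempfile();    # scratch file for the demo
close $tfh or die "close: $!";

{
    open my $out, '>', $fname or die "open $fname for write: $!";
    print {$out} "quick trick\n";
}   # $out leaves scope here: perl flushes and closes it for us

open my $in, '<', $fname or die "open $fname for read: $!";
my $line = <$in>;
close $in or die "close: $!";
unlink $fname;

print "read back: $line";          # prints: read back: quick trick
```

Explicitly checking close() is still the polite habit, but scoped lexical handles mean the forgotten close no longer leaks a descriptor for the life of the process.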
But today's quick trick piece of code will wind up being 'maintained' by someone - so growing one's skill level will help in the long run.
ciao drieux
---