On Fri, 21 Mar 2003, Anton Tichawa wrote:
> On Friday 21 March 2003 14:02, you wrote:
> > On Fri, Mar 21, 2003 at 02:04:39PM +0100, Michael Van Canneyt wrote:
> > > On Fri, 21 Mar 2003, Anton Tichawa wrote:
> > > > On Friday 21 March 2003 13:09, you wrote:
> > > > > > On Friday 21 March 2003 12:29, you wrote:
> > > > > >> >> > But, when I use fail in my simple example program, it
> > > > > >> >> > returns NIL okay, but the Heaptrace function tells me I
> > > > > >> >> > have two unfreed memory blocks (36 bytes). I can't see a
> > > > > >> >> > memory leak anywhere else in that program, what could
> > > > > >> >> > cause this? (Heaptrace output is as follows; I am using
> > > > > >> >> > FPC 1.0.6 btw)
> > > > > >> >>
> > > > > >> >> [snip heap dump]
> > > > > >> >>
> > > > > >> >> That is the exception frame that is left on the heap. I
> > > > > >> >> don't have the time to analyze what the cause is that the
> > > > > >> >> exception frame is not removed.
> > > > > >> >
> > > > > >> > It may be a bug in 1.0.6 which has subsequently been fixed.
> > > > > >> > I downloaded and installed the 1.1 snapshots and compiled
> > > > > >> > the same source code, and the memory leak vanishes....
> > > > > >>
> > > > > >> The reason why 1.1 has no leak is that it uses the stack to
> > > > > >> store the exception frames. The real problem is still there,
> > > > > >> because the exception stack is still not updated.
> > > > > >
> > > > > > some months ago i had a discussion with a friend, concerning
> > > > > > global (static, absolute) variables. his point of view was that
> > > > > > they're not necessary when using oop; mine was, sometimes
> > > > > > they're absolutely necessary.
> > > > > >
> > > > > > if we have just one level of exception processing above normal
> > > > > > program execution (i.e. while an exception is being processed,
> > > > > > no other exception will gain control), we can use absolute
> > > > > > variables for the exception frame.
> > > > > >
> > > > > > it's even possible to define a fixed number N of exception
> > > > > > layers and allocate absolute memory for those N exception
> > > > > > levels.
> > > > > >
> > > > > > that memory space would not get lost, as it can be saved by
> > > > > > allocating the 'normal' stack or 'normal' heap more tightly -
> > > > > > the old system has to reserve exception space implicitly on the
> > > > > > stack or on the heap.
> > > > > >
> > > > > > what do you think about that?
> > > > >
> > > > > It does not fix the problem, the frame is then still left on the
> > > > > stack.
> > > > >
> > > > > The allocation on the heap has already been changed to allocation
> > > > > on the stack in 1.1, because that is much faster. Using a
> > > > > predefined storage of N exception levels is adding a limit, and
> > > > > that is something we want to prevent.
> > > >
> > > > but also the power-switch, the data bus width, and the exception
> > > > vectors in ROM are limits now. i think limits cannot be prevented,
> > > > but they can be chosen knowingly, harmoniously and safely. every
> > > > limit should include the overhead to overcome it later, as things
> > > > get better.
> > >
> > > Not in this case. For instance recursive routines will get in
> > > trouble.
> > > There is no way to know how deep the stack can be nested, so you
> > > cannot foresee this. Putting a limit on that is out of the question.
> >
> > Agreed. If you put a limit on that, you will disallow algorithms that
> > use recursive loops. There are many that do ...
>
> provided that no one has posted that before, and that safe email
> processing allows me to post another argument, i'd say that:
>
> if memory is limited, unlimitedly recursive procedures are responsible
> for checking stack space, aren't they?
>
> but if it's possible to define a maximum N, the check needs to be made
> only once, and the procedure will run faster. there's a kind of
> tradeoff between freedom and efficiency, isn't there?

No. If you want to define a maximum N, you need to allocate all of it at
once to solve the problem. This will blow up your executable for any
reasonably sized N, even when it is not necessary.

> also, if i remember right, anything that can be done recursively can
> also be done by applying a non-recursive algorithm.

This is not correct. This is only true for tail recursion.

Please don't try to convince us to do this; we won't, it's an ugly
solution. Instead we'll try to fix the original problem.

Michael.
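
For readers who want to see the recursion point in code: the sketch below
is not from the thread and all identifiers are invented; it only
illustrates the argument above. SumTail is tail recursive, so it can be
rewritten as the loop SumLoop without any frames at all. WalkTree is
general recursion whose nesting depth depends on the shape of the input
tree, which is why no fixed number N of frames can be chosen in advance.

program recursiondemo;

type
  PNode = ^TNode;
  TNode = record
    Value: Integer;
    Left, Right: PNode;
  end;

{ Tail recursion: the recursive call is the last action, so it can be
  replaced by a jump or a loop without needing extra stack frames. }
function SumTail(N, Acc: LongInt): LongInt;
begin
  if N = 0 then
    SumTail := Acc
  else
    SumTail := SumTail(N - 1, Acc + N);
end;

{ The same computation written as a loop: no recursion, no frames. }
function SumLoop(N: LongInt): LongInt;
var
  Acc: LongInt;
begin
  Acc := 0;
  while N > 0 do
  begin
    Acc := Acc + N;
    N := N - 1;
  end;
  SumLoop := Acc;
end;

{ General recursion: the depth equals the height of the tree, which is
  only known at run time, so no compile-time limit N is safe. }
function WalkTree(Node: PNode): LongInt;
begin
  if Node = nil then
    WalkTree := 0
  else
    WalkTree := Node^.Value + WalkTree(Node^.Left) + WalkTree(Node^.Right);
end;

begin
  WriteLn(SumTail(10, 0));  { 55 }
  WriteLn(SumLoop(10));     { 55 }
  WriteLn(WalkTree(nil));   { 0 }
end.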
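
The program that started the thread was never posted, so the following is
only a minimal, hypothetical sketch of the pattern under discussion: an
object constructor that calls Fail, with the heaptrc unit loaded so that
unfreed blocks are reported at program exit. All program and identifier
names are invented. According to the thread, the block heaptrc reports
under 1.0.6 is the implicit exception frame left on the heap, while 1.1
keeps that frame on the stack, so whether the leak shows up depends on the
compiler version.

program failcheck;

{ Keep heaptrc first in the uses clause, or compile with -gh instead. }
uses
  heaptrc;

type
  PThing = ^TThing;
  TThing = object
    constructor Init(Ok: Boolean);
    destructor Done;
  end;

constructor TThing.Init(Ok: Boolean);
begin
  if not Ok then
    Fail;   { abort construction: New(P, Init(...)) then yields nil }
end;

destructor TThing.Done;
begin
end;

var
  P: PThing;
begin
  New(P, Init(False));
  WriteLn(P = nil);         { TRUE: the constructor failed }
  if P <> nil then
    Dispose(P, Done);
end.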