sonjaa wrote:
> Serge Orlov wrote:
> > sonjaa wrote:
> > > Serge Orlov wrote:
> > > > sonjaa wrote:
> > > > > Hi
> > > > >
> > > > > I'm new to programming in Python, and I hope that my inexperience is the cause of this problem.
> > > > >
> > > > > I've created a cellular automaton program in Python with the numpy
> > > > > array extensions. After each cycle/iteration the memory used to examine
> > > > > and change the array as determined by the transition rules is never
> > > > > freed. I've tried using "del" on every variable possible, but that
> > > > > hasn't worked.
> > > >
> > > > Python keeps track of the number of references to every object. If the
> > > > object still has more than one reference by the time you use "del", the
> > > > object is not freed; only the number of references is decremented.
> > > >
> > > > Print the number of references for all the objects you think should be
> > > > freed after each cycle/iteration. If it is not equal to 2, that means you
> > > > are holding extra references to those objects. (The expected count is 2
> > > > rather than 1 because passing the object to the function temporarily adds
> > > > one reference.) You can get the number of references to any object by
> > > > calling sys.getrefcount(obj).
> > >
> > > Thanks for the info. I used this on several variables/objects and
> > > discovered that little counters, i.e. k = k + 1, have many references to
> > > them, up to 10000+.
> > > Is there a way to free them?
> >
> > Although it looks suspicious, even if you manage to free it you will
> > gain only 12 bytes. I think you should concentrate on fatter
> > objects ;)
>
>
> I sent a message to the NumPy forum as per Robert's suggestion.
> Here is an update after implementing the suggestions:
>
> After doing this I see that iterative counters used to collect occurrences
> and nested loop counters (ii & jj), as seen in the code example below,
> are the culprits, with the worst ones over 1M:

That means you have over 1M integers in your program. How did that happen
if you're using numpy arrays? If I allocate a numpy array of one million
bytes, it does not create one million Python integer objects, whereas a
Python list of 1M integers creates 1M integers:

>>> import numpy
>>> a = numpy.zeros((1000000,), numpy.UnsignedInt8)
>>> import sys
>>> sys.getrefcount(0)
632
>>> b=[0]*1000000
>>> sys.getrefcount(0)
1000632
>>>
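
Another way to see the same thing, without relying on refcounts, is to
compare sizes directly. A minimal sketch, assuming your numpy version spells
the dtype numpy.uint8 (the UnsignedInt8 name in the session above is the
older spelling), might look like:

    import sys
    import numpy

    a = numpy.zeros((1000000,), numpy.uint8)  # one byte per cell, no Python ints
    print(a.nbytes)                            # 1000000 bytes for the whole array
    b = [0] * 1000000                          # a million references to int objects
    print(sys.getsizeof(b))                    # several MB of pointers alone,
                                               # depending on the platform

Note also that on recent CPython releases small integers are immortal, so
sys.getrefcount(0) may report a fixed sentinel value rather than a live count.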

But that doesn't explain why your program doesn't free memory. By the
way, are you sure you have enough memory for one iteration of your
program?
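
If the big arrays themselves are being released, the usual suspects are
Python-level containers (result lists, caches, module-level names) that keep
a reference to every generation. A minimal diagnostic sketch, where the names
step and grid are made up stand-ins for whatever your program actually uses,
is to watch the object counts after each iteration:

    import gc
    import sys

    def run(step, grid, iterations):
        for i in range(iterations):
            grid = step(grid)   # compute the next generation
            gc.collect()        # collect any reference cycles
            # If these numbers keep climbing, something is still holding
            # on to old generations or per-cell temporaries.
            print(i, sys.getrefcount(grid), len(gc.get_objects()))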

-- 
http://mail.python.org/mailman/listinfo/python-list
