memory leak problem with arrays

2006-06-14 Thread sonjaa
Hi

I'm new to programming in Python, and I hope that's the source of the problem.

I've created a cellular automata program in python with the numpy array
extensions. After each cycle/iteration the memory used to examine and
change the array as determined by the transition rules is never freed.
I've tried using "del" on every variable possible, but that hasn't
worked. I've read all the forums for helpful hints on what to do, but
nothing has worked so far. I've even tried the "python memory
verification" (beta) program, which did point to numpy.dtype and
numpy.ndarray as increasing objects, before the whole computer crashed.


I can supply the code if needed. I'm desperate because this is part of
my thesis, and if I can't get this fixed, I'll try another programming
language.

thanks in advance
Sonja



Re: memory leak problem with arrays

2006-06-14 Thread sonjaa

Serge Orlov wrote:
> sonjaa wrote:
> > Hi
> >
> > I'm new to programming in Python, and I hope that's the source of the problem.
> >
> > I've created a cellular automata program in python with the numpy array
> > extensions. After each cycle/iteration the memory used to examine and
> > change the array as determined by the transition rules is never freed.
> > I've tried using "del" on every variable possible, but that hasn't
> > worked.
>
> Python keeps track of the number of references to every object. If the
> object has more than one reference by the time you use "del", the object
> is not freed; only the number of references is decremented.
>
> Print the number of references for all the objects you think should be
> freed after each cycle/iteration; if it is not equal to 2, that means you
> are holding extra references to those objects. You can get the number of
> references to any object by calling sys.getrefcount(obj).
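
A minimal sketch of that kind of check might look like this (the array name
"grid" and the loop are made up for illustration, not taken from the actual
program):

    import sys
    import numpy

    grid = numpy.ones((500, 500))     # stand-in for the CA array

    for step in xrange(10):
        # ... update grid according to the transition rules ...
        # sys.getrefcount adds one temporary reference of its own, so a
        # result of 2 means only the local name "grid" still points at
        # the object.
        print step, sys.getrefcount(grid)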

thanks for the info. I used this on several variables/objects and
discovered that little counters, i.e. k = k + 1, have many references to
them, up to 1+.
Is there a way to free them?

regards
Sonja



Re: memory leak problem with arrays

2006-06-14 Thread sonjaa

Serge Orlov wrote:
> sonjaa wrote:
> > Serge Orlov wrote:
> > > sonjaa wrote:
> > > > Hi
> > > >
> > > > I'm new to programming in Python, and I hope that's the source of the problem.
> > > >
> > > > I've created a cellular automata program in python with the numpy array
> > > > extensions. After each cycle/iteration the memory used to examine and
> > > > change the array as determined by the transition rules is never freed.
> > > > I've tried using "del" on every variable possible, but that hasn't
> > > > worked.
> > >
> > > Python keeps track of the number of references to every object. If the
> > > object has more than one reference by the time you use "del", the object
> > > is not freed; only the number of references is decremented.
> > >
> > > Print the number of references for all the objects you think should be
> > > freed after each cycle/iteration; if it is not equal to 2, that means you
> > > are holding extra references to those objects. You can get the number of
> > > references to any object by calling sys.getrefcount(obj).
> >
> > thanks for the info. I used this on several variables/objects and
> > discovered that little counters, i.e. k = k + 1, have many references to
> > them, up to 1+.
> > Is there a way to free them?
>
> Although it's looks suspicious, even if you manage to free it you will
> gain only 12 bytes. I think you should concentrate on more fat
> objects ;)


Sent a message to the NumPy forum, as per Robert's suggestion.
An update after implementing the suggestions:

After doing this I see that the iterative counters used to collect
occurrences, and the nested loop counters (ii & jj) as seen in the code
example below, are the culprits, with the worst ones over 1M:

for ii in xrange(0,40):
    for jj in xrange(0,20):
        try:
            nc = y[a+ii,b+jj]
        except IndexError: nc = 0

        if nc == "1" or nc == "5":
            news = news + 1
            if news == 100:
                break
            else:
                pass
            y[a+ii,b+jj] = 4
        else:
            pass
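
For comparison, a slice-based form of the same neighborhood check avoids
touching the array one element at a time. This is only a sketch: it reuses
the names y, a, b and news from the loop above, drops the early break at
news == 100, and assumes the cell values are compared as numbers rather
than the strings "1"/"5":

    window = y[a:a+40, b:b+20]            # a view into y; slicing clips at
                                          # the array edge, so no IndexError
    hits = (window == 1) | (window == 5)  # boolean mask of matching cells
    news = news + int(hits.sum())         # count all matches in one go
    window[hits] = 4                      # writes through the view into y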


The version of Python I'm using is 2.4.3 and the version of NumPy is 0.9.8.

thanks again for all the help
Sonja



Re: memory leak problem with arrays

2006-06-15 Thread sonjaa

Fredrik Lundh wrote:
> sonjaa wrote:
> > After doing this I see that iterative counters used to collect occurrences
> > and nested loop counters (ii & jj) as seen in the code example below
> > are the culprits with the worst ones over 1M:
> >
> > for ii in xrange(0,40):
> >     for jj in xrange(0,20):
> >         try:
> >             nc = y[a+ii,b+jj]
> >         except IndexError: nc = 0
> >
> >         if nc == "1" or nc == "5":
> >             news = news + 1
> >             if news == 100:
> >                 break
> >             else:
> >                 pass
> >             y[a+ii,b+jj] = 4
> >         else:
> >             pass
> 
> what's "y" in this example ?
> 
> 

"y" is a 500x500 array.



Re: memory leak problem with arrays

2006-06-15 Thread sonjaa

Carl Banks wrote:
> sonjaa wrote:
> > I've created a cellular automata program in python with the numpy array
> > extensions. After each cycle/iteration the memory used to examine and
> > change the array as determined by the transition rules is never freed.
>
> Are you aware that slicing shares memory?  For example, say you defined
> a grid to do the automata calculations on, like this:
>
> grid = numpy.zeros([1000,1000])
>
> And then, after running it, you took a tiny slice as a region of
> interest, for example:
>
> roi = grid[10:20,10:20]
>
> Then deleted grid:
>
> del grid
>
> Then stored roi somewhere, for example:
>
> run_results.append(roi)
>
> If you do this, the memory for the original grid won't get freed.
> Although grid was deleted, roi still contains a reference to the whole
> 1000x1000 array, even though it's only a tiny slice of it.  Your poorly
> worded description--no offense--of what you did suggests that this is a
> possibility in your case.  I recommend you try to create a new array
> out of any slices you make, like this (but ONLY if the slice doesn't
> depend on the memory being shared):
>
> roi = numpy.array(grid[10:20,10:20])
>
> This time, when you del grid, there is no object left referencing the
> array data, so it'll be freed.
>
> This might not be your problem.  Details are important when asking
> questions, and so far you've only given us enough to speculate with.
>
> Carl Banks
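
A minimal sketch of the difference Carl describes (the array names here are
illustrative only, not from the actual program):

    import numpy

    grid = numpy.zeros((1000, 1000))

    roi_view = grid[10:20, 10:20]               # a view: shares grid's buffer
    roi_copy = numpy.array(grid[10:20, 10:20])  # an independent 10x10 copy

    del grid
    # The 1000x1000 buffer is still alive here, because roi_view refers to it.
    del roi_view
    # Now only roi_copy is left, so the big buffer can actually be freed.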

I believe I understand your post. I don't think I was slicing the
array, I was only changing the values of the array.

I will try your suggestion and let you know how it goes

thanks
Sonja



Re: memory leak problem with arrays

2006-06-20 Thread sonjaa
Hi Fredrik

the array was created by reading in values from an ASCII file.

also, I've implemented the suggestions, but nothing has worked to date.
And yes, I have enough memory for one iteration. The app usually runs
out of memory around the 12th iteration.

Also, I can send a working version of the app, and the two associated
ASCII files, if anyone is interested.

-Sonja


Fredrik Lundh wrote:
> sonjaa wrote:
>
> > "y" is a 500x500 array.
> 
> a 500x500 array of what ?  how did you create the array ?
> 
> 



Update on Memory problem with NumPy arrays

2006-06-21 Thread sonjaa
Hi

last week I posted a problem with running out of memory when changing
values in NumPy arrays. Since then I have tried many different approaches
and work-arounds, but to no avail.

I was able to reduce the code (see below) to its smallest size and still
have the problem, albeit at a slower rate. The problem appears to come
from changing values in the array. Does this create another reference to
the array, which can't be released?

Also, are there other Python methods/extensions that can create
multi-dimensional arrays?

thanks again to those who responded to the last post
Sonja

PS: to watch the memory usage I just used the task manager.

the code:
from numpy import *

y = ones((501,501))
z = zeros((501,501))
it = 50

for kk in xrange(it):
    y[1,1] = 4
    y[1,2] = 4
    y[1,0] = 4
    y[2,1] = 6

    print "Iteration #:%s" %(kk)
    for ee in xrange(0,501):
        for ff in xrange(0,501):
            if y[ee,ff] == 4 or y[ee,ff] == 6:
                y[ee,ff] = 2
            else:
                pass



Re: Update on Memory problem with NumPy arrays

2006-06-21 Thread sonjaa
I've been in contact with Travis O, and he said it was fixed in the SVN.
thanks for the suggestions, I'll try them out now.

best
Sonja


Filip Wasilewski wrote:
> sonjaa wrote:
> > Hi
> >
> > last week I posted a problem with running out of memory when changing
> > values in NumPy arrays. Since then I have tried many different
> > approaches and
> > work-arounds but to no avail.
> [...]
>
> Based on the numpy-discussion this seems to be fixed in the SVN now(?).
>
> Anyway, you can use 'where' function to eliminate the loops:
>
> from numpy import *
>
> y = ones((501,501))
> z = zeros((501,501))
> it = 50
>
> for kk in xrange(it):
>     y[1,1] = 4
>     y[1,2] = 4
>     y[1,0] = 4
>     y[2,1] = 6
>
>     print "Iteration #:%s" %(kk)
>     y = where((y == 4) | (y == 6), 2, y)
> 
> 
> best,
> fw
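
For what it's worth, an in-place variant of the same idea (only a sketch,
and it assumes boolean-mask assignment behaves the same on this numpy
version) avoids allocating a fresh 501x501 result array on every iteration:

    mask = (y == 4) | (y == 6)
    y[mask] = 2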
