OK. Finally I have it :)
It was just the amount of memory consumed each time. I tried it with a smaller but still significant number of records, and finally figured out that there is a limit, after which memory consumption is fine. I only had to reach that count of browser requests :-D
Sorry
The problem is not that the controller function consumes memory when I call it. My problem is that something still references something after execution has finished, so the consumed memory never gets released / reused...
On Thursday, May 10, 2012 at 23:27:32 UTC+2, Martin.Mu wrote:
The problem is fetchall(): it loads all the records into memory.
If you want to iterate over a large dataset, or over one that is not so heavy but comes from a table with many fields, perhaps you can do it in blocks.
For example, using raw SQL with the DAL to fetch 1000 records at a time, perhaps something like:
queryraw = "select * from mytable limit 1000 offset %(offset)s"
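The block-fetching idea above can be sketched end to end. This is a minimal sketch using the stdlib sqlite3 module as a stand-in for the DAL; `mytable`, its columns, and the block size are hypothetical, and keyset pagination on the `id` primary key is used instead of OFFSET so each query stays cheap:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("create table mytable (id integer primary key, val text)")
conn.executemany("insert into mytable (val) values (?)",
                 [("row%d" % i,) for i in range(2500)])

def fetch_in_blocks(conn, block=1000):
    """Yield rows block by block so only one block is in memory at a time."""
    last_id = 0
    while True:
        rows = conn.execute(
            "select id, val from mytable where id > ? order by id limit ?",
            (last_id, block)).fetchall()
        if not rows:
            break
        for row in rows:
            yield row
        last_id = rows[-1][0]  # resume after the last id we saw

total = sum(1 for _ in fetch_in_blocks(conn))
print(total)  # 2500
```

Only one block of 1000 rows is ever resident at a time, so peak memory no longer scales with the table size.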
Could it be related?
https://groups.google.com/forum/#!topic/web2py/hmsupVHdDHo/discussion (Memory
leak in standalone DAL (issue #731), can you please help test?)
On Thursday, May 10, 2012 at 22:32:53 UTC+2, szimszon wrote:
Okay. It's clear.
I'm only puzzled about why the memory doesn't get freed or reused after execution is finished. Even if I execute the controller function at 1-2 minute intervals, the memory is still not reused.
So I understand it can eat up the memory, but why is all that memory locked forever and never reused?
OK, you don't need it to work all the time.
Did you manage to update your records?
If not, and since as you said it is a one-time trip, you can just process the whole set of records batch by batch...
Look here:
http://web2py.com/books/default/chapter/29/14#Populating-database-with-dummy-data
for i in range(10):
    db.mytable.insert(name="dummy %s" % i)  # hypothetical sketch; the linked section shows the full pattern
I had to store files and a lot of properties for them. The data was in CSV, but after I processed it we figured out that not all the values in the CSV were correct. It was a bit redundant, though, so I can correct it by going through all the records row by row. So it was a one-time trip.
I just realized after the proc
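The batch-by-batch correction described above can be sketched like this. A minimal sketch using the stdlib csv and sqlite3 modules as stand-ins for the real setup; the table, column names, batch size, and the corrections file are all hypothetical:

```python
import csv
import io
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("create table file_properties (id integer primary key, name text)")
conn.executemany("insert into file_properties values (?, ?)",
                 [(i, "wrong") for i in range(1, 11)])

# Hypothetical corrections file: id,name pairs holding the fixed values.
corrections = io.StringIO("id,name\n" + "".join(
    "%d,fixed\n" % i for i in range(1, 11)))

batch, BATCH_SIZE = [], 4
for row in csv.DictReader(corrections):
    batch.append((row["name"], int(row["id"])))
    if len(batch) >= BATCH_SIZE:
        conn.executemany("update file_properties set name=? where id=?", batch)
        conn.commit()  # committing per batch keeps memory and locks bounded
        batch = []
if batch:  # flush the final partial batch
    conn.executemany("update file_properties set name=? where id=?", batch)
    conn.commit()

(fixed,) = conn.execute(
    "select count(*) from file_properties where name='fixed'").fetchone()
print(fixed)  # 10
```

For a one-time trip like this, the point is that neither the CSV nor the table ever has to be fully materialized in Python.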
Yes, but in this case it is not for the entire record set...
Why would you return a full list of all the records?
I don't understand the purpose of the listar that you return in the view as an HTML table; why do you need to return all the 10+ entries?
Richard
On Thu, May 10, 2012 at 2:56 PM, szimszon wrote:
I didn't read your logic closely enough, but since another table was involved, I thought you just wanted to do a kind of computed field.
Also, what you seem to be doing is a kind of paging function; why can't you achieve this with count()?
Richard
On Thu, May 10, 2012 at 2:30 PM, szimszon wrote:
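The count() suggestion means asking the database for the number of matching rows instead of selecting them all into Python. A minimal sketch with sqlite3 as a stand-in for the DAL (the table name matches the one in the thread, but the data is contrived):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("create table file_properties (id integer primary key, name text)")
conn.executemany("insert into file_properties (name) values (?)",
                 [("f%d" % i,) for i in range(42)])

# Counting on the database side keeps the result set to a single integer,
# instead of materializing every row in Python just to measure it.
(n,) = conn.execute(
    "select count(*) from file_properties where id > 0").fetchone()
print(n)  # 42
```

With the web2py DAL, the equivalent is `db(db.file_properties.id > 0).count()`, which runs the count in SQL the same way.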
In the book this is the recommended way to iterate over SQL results:
http://web2py.com/books/default/chapter/29/6
You can do all the steps in one statement:
>>> for row in db(db.person.name=='Alex').select():
...     print row.name
Alex
Do you know of a doc explaining this? Can you post a link?
On Thursday, May 10, 2012 at 20:42:22 UTC+2, Bruce Wade wrote:
Sorry, you really need to read more about how Python works. If you learn how for loops and memory work, you will understand the problem.
One solution: do the query before the for loop, then loop through the objects. This may help a bit. Research xrange vs range.
On Thu, May 10, 2012 at 11:30 AM, szimszon wrote:
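On the xrange-vs-range point: in Python 2, range() built the whole list in memory while xrange() produced values lazily; Python 3's range behaves like xrange. A small sketch of the difference, written for Python 3:

```python
import sys

# A materialized list costs memory per element; a lazy iterator costs a
# constant amount no matter how many values it will produce. This is the
# same distinction as Python 2's range() vs xrange().
n = 1_000_000
eager = list(range(n))   # all n elements exist at once
lazy = iter(range(n))    # elements are produced one at a time

print(sys.getsizeof(eager) > 1_000_000)  # True: the list holds n pointers
print(sys.getsizeof(lazy) < 1_000)       # True: the iterator is tiny
```

The same reasoning applies to database rows: iterating a lazy source keeps one row alive at a time, while building a list keeps them all.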
Sorry, I don't understand. What do you mean by "achieve with join"?
There is an empty for loop over db.executesql() with no join, and it is eating up the memory. :(
On Thursday, May 10, 2012 at 19:12:30 UTC+2, Richard wrote:
Can't you achieve what you want with a join?
Richard
On Thu, May 10, 2012 at 10:48 AM, szimszon wrote:
Sorry for my dumbness, but if something is wrong with my code, please point me to the right line. My English isn't so good when it comes to "object instance count" and the like. Yeah, I know I should go and do some milkmaid job :) but I'm curious.
I'm just defining some variables:
lista = list()
last_row = None
Using a direct SQL query or the DAL is going to cause exactly the same problem in this situation.
On Thu, May 10, 2012 at 7:28 AM, szimszon wrote:
It's postgres://, in
Version 1.99.7 (2012-04-23 11:26:23) dev of web2py, with
Python 2.7.3 (default, Apr 20 2012, 22:44:07)
[GCC 4.6.3] on linux2
python-psycopg2 2.4.5-1
On Thursday, May 10, 2012 at 15:40:36 UTC+2, rochacbruno wrote:
I reduced the code in the controller to:

def autoadjust():
    lista = list()
    last_row = None
    next_page_number = 0
    for row in db.executesql( "select * from file_properties where id > 0" ):
        pass
    lista = TABLE( *lista )
    return dict( lista = lista )

And I still have the memory leak.
Just for curiosity, what happens if you do it in pure sql?
for row in db.executesql("select * from file_properties where id > 0"):
    # do something
Does it have a lower memory usage?
On Thu, May 10, 2012 at 4:14 AM, Bruce Wade wrote:
In your for loop, every time you iterate, the object instance count is increased. Now if you call that loop again before the Python garbage collector has had time to release memory, your for loop will cause even more instances to be created. Python will not release memory back to the OS for an object until every reference to it is gone.
There is no guarantee that Python will reuse that exact same memory. It also depends on how frequently you use that function.
http://mg.pov.lt/blog/hunting-python-memleaks.html
On Thu, May 10, 2012 at 12:57 AM, szimszon wrote:
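The technique in the linked post — snapshotting live object counts and diffing them to see what is accumulating — can be sketched with the stdlib alone. The `Leaky` class and the `hoard` list below are contrived stand-ins for whatever is really leaking:

```python
import gc
from collections import Counter

def type_counts():
    """Count live objects by type: the core trick behind leak-hunting tools."""
    gc.collect()  # drop anything that is merely waiting for collection
    return Counter(type(o).__name__ for o in gc.get_objects())

class Leaky:
    pass

before = type_counts()
hoard = [Leaky() for _ in range(500)]  # simulate a lingering reference
after = type_counts()

# Diffing the two snapshots points straight at what is accumulating.
growth = after["Leaky"] - before["Leaky"]
print(growth)  # 500
```

Taking one snapshot before the controller runs and one after would show which type's count keeps climbing between requests.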
If I understand correctly, Python releases the memory for reuse by itself but does not release it at the OS level. Fine.
But then if I rerun the function, I'd expect the memory allocated to Python not to grow, because Python reuses it.
I executed the controller function and my mem usage at the OS level 2x
http://effbot.org/pyfaq/why-doesnt-python-release-the-memory-when-i-delete-a-large-object.htm
That gives a little more detail.
On Thu, May 10, 2012 at 12:37 AM, szimszon wrote:
Should the garbage collector not free up the memory?
On Thursday, May 10, 2012 at 9:28:48 UTC+2, Bruce Wade wrote:
That is how python is. If you want something to clear the memory as soon as
you are done with it you need C++ :D
On Thu, May 10, 2012 at 12:27 AM, szimszon wrote:
Yes, I know, but what happens over time is that the mem usage grows linearly, and after a successful execution it is never released, and that is why I'm asking :(
On Thursday, May 10, 2012 at 9:14:14 UTC+2, Bruce Wade wrote:
WOW not a good idea:
for row in db( db.file_properties.id > 0 ).select(
If you have a lot of records that is going to kill your memory.
On Thu, May 10, 2012 at 12:10 AM, szimszon wrote:
> I wonder if somebody could help me.
>
> The following code has eaten up ~1.5 GB of RAM and after it ended successfully
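The warning above, in miniature: materializing every row at once versus iterating lazily. A sketch with sqlite3 as a stand-in for the DAL (the table name matches the one in the thread; the row count is arbitrary):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("create table file_properties (id integer primary key)")
conn.executemany("insert into file_properties values (?)",
                 [(i,) for i in range(1, 1001)])

cur = conn.execute("select id from file_properties where id > 0")

# fetchall() would materialize every row at once:
#   rows = cur.fetchall()   # memory grows with the table size
# Iterating the cursor pulls rows on demand instead:
total = 0
for (rid,) in cur:
    total += rid

print(total)  # 500500, i.e. sum(1..1000)
```

With a DAL .select(), the whole Rows object is built in memory up front, which is exactly why a large table makes it hurt.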