Re: Memory usage steadily going up while pickling objects

2013-06-15 Thread dieter
Giorgos Tzampanakis writes: > ... > So it seems that the pickle module does keep some internal cache or > something like that. This is highly unlikely: the "ZODB" (Zope object database) uses pickle (actually, it is "cPickle", the "C" implementation of the "pickle" module) for serialization. The "

Re: Memory usage steadily going up while pickling objects

2013-06-15 Thread Giorgos Tzampanakis
On 2013-06-15, Peter Otten wrote: > Giorgos Tzampanakis wrote: > >> So it seems that the pickle module does keep some internal cache or >> something like that. > > I don't think there's a global cache. The Pickler/Unpickler has a per- > instance cache (the memo dict) that you can clear with the c

Re: Memory usage steadily going up while pickling objects

2013-06-15 Thread Peter Otten
Giorgos Tzampanakis wrote: > So it seems that the pickle module does keep some internal cache or > something like that. I don't think there's a global cache. The Pickler/Unpickler has a per- instance cache (the memo dict) that you can clear with the clear_memo() method, but that doesn't matter
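Peter's clear_memo() suggestion can be sketched like this in modern Python 3 (a minimal reconstruction, not the original poster's code — the record contents are made up):

```python
import io
import pickle

# The per-instance cache being discussed: a Pickler's memo grows with
# every object it serializes unless it is cleared between dumps.
buf = io.BytesIO()
pickler = pickle.Pickler(buf)

for i in range(1000):
    pickler.dump({"id": i, "payload": "x" * 10})
    pickler.clear_memo()  # drop cached object references between records

# The stream now holds 1000 independent pickles; read them back with a
# matching Unpickler.
reader = pickle.Unpickler(io.BytesIO(buf.getvalue()))
first = reader.load()
```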

Re: Memory usage steadily going up while pickling objects

2013-06-15 Thread Giorgos Tzampanakis
On 2013-06-15, Dave Angel wrote: > On 06/14/2013 07:04 PM, Giorgos Tzampanakis wrote: >> I have a program that saves lots (about 800k) objects into a shelve >> database (I'm using sqlite3dbm for this since all the default python dbm >> packages seem to be unreliable and effectively unusable, but t

Re: Memory usage steadily going up while pickling objects

2013-06-15 Thread Giorgos Tzampanakis
On 2013-06-15, Peter Otten wrote: > Giorgos Tzampanakis wrote: > >> I have a program that saves lots (about 800k) objects into a shelve >> database (I'm using sqlite3dbm for this since all the default python dbm >> packages seem to be unreliable and effectively unusable, but this is >> another dis

Re: Memory usage steadily going up while pickling objects

2013-06-14 Thread Peter Otten
Giorgos Tzampanakis wrote: > I have a program that saves lots (about 800k) objects into a shelve > database (I'm using sqlite3dbm for this since all the default python dbm > packages seem to be unreliable and effectively unusable, but this is > another discussion). > > The process takes about 10-
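The storage pattern under discussion looks roughly like this (a scaled-down sketch using the standard shelve module rather than the sqlite3dbm backend the OP mentions; keys and values are illustrative):

```python
import os
import shelve
import tempfile

# Many objects written to a shelve database, one pickled value per
# string key — scaled down from the OP's ~800k objects to 100.
tmpdir = tempfile.mkdtemp()
path = os.path.join(tmpdir, "objects")

with shelve.open(path) as db:
    for i in range(100):
        db[str(i)] = {"n": i, "square": i * i}

# Reopen and read everything back to confirm the round trip.
with shelve.open(path) as db:
    total = sum(db[k]["n"] for k in db)
```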

Re: Memory usage steadily going up while pickling objects

2013-06-14 Thread Dave Angel
On 06/14/2013 07:04 PM, Giorgos Tzampanakis wrote: I have a program that saves lots (about 800k) objects into a shelve database (I'm using sqlite3dbm for this since all the default python dbm packages seem to be unreliable and effectively unusable, but this is another discussion). The process ta

Re: Memory usage per top 10x usage per heapy

2012-09-27 Thread bryanjugglercryptographer
MrsEntity wrote: > Based on heapy, a db based solution would be serious overkill. I've embraced overkill and my life is better for it. Don't confuse overkill with cost. Overkill is your friend. The facts of the case: You need to save some derived strings for each of 2M input lines. Even half th

Re: Memory usage per top 10x usage per heapy

2012-09-25 Thread Oscar Benjamin
On 26 September 2012 00:35, Tim Chase wrote: > On 09/25/12 17:55, Oscar Benjamin wrote: > > On 25 September 2012 23:10, Tim Chase wrote: > >> If tuples provide a savings but you find them opaque, you might also > >> consider named-tuples for clarity. > > > > Do they have the same memory usage?

Re: Memory usage per top 10x usage per heapy

2012-09-25 Thread Tim Chase
On 09/25/12 17:55, Oscar Benjamin wrote: > On 25 September 2012 23:10, Tim Chase wrote: >> If tuples provide a savings but you find them opaque, you might also >> consider named-tuples for clarity. > > Do they have the same memory usage? > > Since tuples don't have a per-instance __dict__, I'd e

Re: Memory usage per top 10x usage per heapy

2012-09-25 Thread Oscar Benjamin
On 25 September 2012 23:10, Tim Chase wrote: > On 09/25/12 16:17, Oscar Benjamin wrote: > > I don't know whether it would be better or worse but it might be > > worth seeing what happens if you replace the FileContext objects > > with tuples. > > If tuples provide a savings but you find them opaq

Re: Memory usage per top 10x usage per heapy

2012-09-25 Thread Oscar Benjamin
On 25 September 2012 23:09, Ian Kelly wrote: > On Tue, Sep 25, 2012 at 12:17 PM, Oscar Benjamin > wrote: > > Also I think lambda functions might be able to keep the frame alive. Are > > they by any chance being created in a function that is called in a loop? > > I'm pretty sure they don't. Clos

Re: Memory usage per top 10x usage per heapy

2012-09-25 Thread Ian Kelly
On Tue, Sep 25, 2012 at 12:17 PM, Oscar Benjamin wrote: > Also I think lambda functions might be able to keep the frame alive. Are > they by any chance being created in a function that is called in a loop? I'm pretty sure they don't. Closures don't keep a reference to the calling frame, only to
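Ian's point can be checked directly: a closure captures only the cells it actually uses, not the whole calling frame, so unrelated locals remain collectable. A small sketch:

```python
# 'make_adder' is an illustrative function, not from the thread.
def make_adder(n):
    big = [0] * 100_000   # large local the inner function never touches
    def add(x):
        return x + n      # only 'n' is captured
    return add

add5 = make_adder(5)
# Inspect what the closure actually holds: just the 'n' cell.
freevars = add5.__code__.co_freevars
captured = [cell.cell_contents for cell in add5.__closure__]
```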

Re: Memory usage per top 10x usage per heapy

2012-09-25 Thread Tim Chase
On 09/25/12 16:17, Oscar Benjamin wrote: > I don't know whether it would be better or worse but it might be > worth seeing what happens if you replace the FileContext objects > with tuples. If tuples provide a savings but you find them opaque, you might also consider named-tuples for clarity. -tk
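Tim's named-tuple suggestion in sketch form — the `FileContext` name comes from the thread, but these fields are illustrative:

```python
from collections import namedtuple

# A named tuple keeps tuple-style storage (namedtuple classes define
# __slots__ = (), so instances carry no per-instance __dict__) while
# naming the fields for readability.
FileContext = namedtuple("FileContext", ["path", "line"])

fc = FileContext(path="a.txt", line=42)

# Behaves like the plain tuple it subclasses...
as_tuple = tuple(fc)
# ...but reads much more clearly at the point of use.
label = f"{fc.path}:{fc.line}"
```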

Re: Memory usage per top 10x usage per heapy

2012-09-25 Thread Junkshops
On 9/25/2012 2:17 PM, Oscar Benjamin wrote: I don't know whether it would be better or worse but it might be worth seeing what happens if you replace the FileContext objects with tuples. I originally used a string, and it was slightly better since you don't have the object overhead, but I wanted

Re: Memory usage per top 10x usage per heapy

2012-09-25 Thread Oscar Benjamin
On 25 September 2012 21:26, Junkshops wrote: > On 9/25/2012 11:17 AM, Oscar Benjamin wrote: > > On 25 September 2012 19:08, Junkshops wrote: > >> >> In [38]: mpef._ustore._store >> Out[38]: defaultdict(, {'Measurement': >> {'8991c2dc67a49b909918477ee4efd767': >> , >> '7b38b429230f00fe4731e60419

Re: Memory usage per top 10x usage per heapy

2012-09-25 Thread Junkshops
On 9/25/2012 11:50 AM, Dave Angel wrote: I suspect that heapy has some limitation in its reporting, and that's what the discrepancy. That would be my first suspicion as well - except that heapy's results agree so well with what I expect, and I can't think of any reason I'd be using 10x more m

Re: Memory usage per top 10x usage per heapy

2012-09-25 Thread Junkshops
On 9/25/2012 11:17 AM, Oscar Benjamin wrote: On 25 September 2012 19:08, Junkshops > wrote: In [38]: mpef._ustore._store Out[38]: defaultdict(, {'Measurement': {'8991c2dc67a49b909918477ee4efd767': , '7b38b429230f00fe4731e60419e92346': , 'b53531471

Re: Memory usage per top 10x usage per heapy

2012-09-25 Thread Dave Angel
On 09/25/2012 01:39 PM, Junkshops wrote: Procedural point: I know you're trying to conform to the standard that this mailing list uses, but you're off a little, and it's distracting. It's also probably more work for you, and certainly for us. You need an attribution in front of the quoted portio

Re: Memory usage per top 10x usage per heapy

2012-09-25 Thread Oscar Benjamin
On 25 September 2012 19:08, Junkshops wrote: > > Can you give an example of how these data structures look after reading > only the first 5 lines? > > Sure, here you go: > > In [38]: mpef._ustore._store > Out[38]: defaultdict(, {'Measurement': > {'8991c2dc67a49b909918477ee4efd767': > , > '7b38b4

Re: Memory usage per top 10x usage per heapy

2012-09-25 Thread Junkshops
Can you give an example of how these data structures look after reading only the first 5 lines? Sure, here you go: In [38]: mpef._ustore._store Out[38]: defaultdict(, {'Measurement': {'8991c2dc67a49b909918477ee4efd767': , '7b38b429230f00fe4731e60419e92346': , 'b53531471b261c44d52f651add64

Re: Memory usage per top 10x usage per heapy

2012-09-25 Thread Junkshops
I'm a bit surprised you aren't beyond the 2gb limit, just with the structures you describe for the file. You do realize that each object has quite a few bytes of overhead, so it's not surprising to use several times the size of a file, to store the file in an organized way. I did some back of the

Re: Memory usage per top 10x usage per heapy

2012-09-25 Thread Oscar Benjamin
On 25 September 2012 00:58, Junkshops wrote: > Hi Tim, thanks for the response. > > > - check how you're reading the data: are you iterating over >>the lines a row at a time, or are you using >>.read()/.readlines() to pull in the whole file and then >>operate on that? >> > I'm using

Re: Memory usage per top 10x usage per heapy

2012-09-25 Thread Mark Lawrence
On 25/09/2012 11:51, Tim Chase wrote: [snip] If only other unnamed persons on the list were so gracious rather than turning the flame-dial to 11. Oh heck what have I said this time? -tkc -- Cheers. Mark Lawrence. -- http://mail.python.org/mailman/listinfo/python-list

Re: Memory usage per top 10x usage per heapy

2012-09-25 Thread Dave Angel
On 09/25/2012 12:21 AM, Junkshops wrote: >> Just curious; which is it, two million lines, or half a million bytes? > > Sorry, that should've been a 500Mb, 2M line file. > >> which machine is 2gb, the Windows machine, or the VM? > VM. Winders is 4gb. > >> ...but I would point out that just beca

Re: Memory usage per top 10x usage per heapy

2012-09-25 Thread Tim Chase
On 09/24/12 23:41, Dennis Lee Bieber wrote: > On Mon, 24 Sep 2012 14:59:47 -0700 (PDT), MrsEntity > declaimed the following in > gmane.comp.python.general: > >> Hi all, >> >> I'm working on some code that parses a 500kb, 2M line file line by line and >> saves, per line, some derived strings > >

Re: Memory usage per top 10x usage per heapy

2012-09-24 Thread Junkshops
Just curious; which is it, two million lines, or half a million bytes? I have, in fact, this very afternoon, invented a means of writing a carriage return character using only 2 bits of information. I am prepared to sell licenses to this revolutionary technology for the low price of $29.95 plu

Re: Memory usage per top 10x usage per heapy

2012-09-24 Thread Dave Angel
On 09/24/2012 05:59 PM, MrsEntity wrote: > Hi all, > > I'm working on some code that parses a 500kb, 2M line file Just curious; which is it, two million lines, or half a million bytes? > line by line and saves, per line, some derived strings into various data > structures. I thus expect that m

Re: Memory usage per top 10x usage per heapy

2012-09-24 Thread Junkshops
Hi Tim, thanks for the response. - check how you're reading the data: are you iterating over the lines a row at a time, or are you using .read()/.readlines() to pull in the whole file and then operate on that? I'm using enumerate() on an iterable input (which in this case is the fileh
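The reading pattern described — enumerate() over an iterable file handle — yields one line at a time instead of pulling the whole file in with read()/readlines(). A minimal sketch (the sample data and the "derived string" transformation are made up):

```python
import io

# StringIO stands in for the real 500MB file handle.
fileh = io.StringIO("alpha\nbeta\ngamma\n")

derived = {}
for lineno, line in enumerate(fileh, start=1):
    # Hypothetical per-line derived string; the OP's real processing
    # is not shown in the thread.
    derived[lineno] = line.strip().upper()
```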

Re: Memory usage per top 10x usage per heapy

2012-09-24 Thread Tim Chase
On 09/24/12 16:59, MrsEntity wrote: > I'm working on some code that parses a 500kb, 2M line file line > by line and saves, per line, some derived strings into various > data structures. I thus expect that memory use should > monotonically increase. Currently, the program is taking up so > much memo

Re: memory usage multi value hash

2011-04-15 Thread Peter Otten
Terry Reedy wrote: > On 4/14/2011 12:55 PM, Peter Otten wrote: > >> I don't expect that it matters much, but you don't need to sort your data >> if you use a dictionary anyway: > > Which means that one can build the dict line by line, as each is read, > instead of reading the entire file into me

Re: memory usage multi value hash

2011-04-15 Thread Algis Kabaila
On Friday 15 April 2011 02:13:51 christian wrote: > Hello, > > i'm not very experienced in python. Is there a way doing > below more memory efficient and maybe faster. > I import a 2-column file and then concat for every unique > value in the first column ( key) the value from the second > colum

Re: memory usage multi value hash

2011-04-14 Thread Terry Reedy
On 4/14/2011 12:55 PM, Peter Otten wrote: I don't expect that it matters much, but you don't need to sort your data if you use a dictionary anyway: Which means that one can build the dict line by line, as each is read, instead of reading the entire file into memory. So it does matter for int

Re: memory usage multi value hash

2011-04-14 Thread Peter Otten
christian wrote: > Hello, > > i'm not very experienced in python. Is there a way doing below more > memory efficient and maybe faster. > I import a 2-column file and then concat for every unique value in > the first column ( key) the value from the second > columns. > > So The ouptut is someth
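Peter's approach can be sketched as follows: build the key-to-values mapping line by line with a dict, no sorting required (the two-column sample data is invented):

```python
import io
from collections import defaultdict

# Stand-in for the 2-column input file.
lines = io.StringIO("a 1\nb 2\na 3\nc 4\nb 5\n")

# Accumulate values per key as the file is read...
merged = defaultdict(list)
for line in lines:
    key, value = line.split()
    merged[key].append(value)

# ...then concatenate each key's values once at the end.
result = {key: ",".join(values) for key, values in merged.items()}
```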

Re: Memory Usage of Strings

2011-03-16 Thread Amit Dev
Thanks Dan for the detailed reply. I suspect it is related to FreeBSD malloc/free as you suggested. Here is the output of running your script: [16-bsd01 ~/work]$ python strm.py --first USER PID %CPU %MEM VSZ RSS TT STAT STARTED TIME COMMAND amdev 6899 3.0 6.9 111944 107560 p0 S+

Re: Memory Usage of Strings

2011-03-16 Thread Dan Stromberg
On Wed, Mar 16, 2011 at 8:38 AM, Amit Dev wrote: > I'm observing a strange memory usage pattern with strings. Consider > the following session. Idea is to create a list which holds some > strings so that cumulative characters in the list is 100MB. > > >>> l = [] > >>> for i in xrange(10): > .

Re: Memory Usage of Strings

2011-03-16 Thread Terry Reedy
On 3/16/2011 3:51 PM, Santoso Wijaya wrote: Python 2.7.1 (r271:86832, Nov 27 2010, 17:19:03) [MSC v.1500 64 bit (AMD64)] on win32 Type "help", "copyright", "credits" or "license" for more information. >>> import sys >>> L = [] >>> for i in xrange(10): ... L.append(str(i) * (1000 / len(

Re: Memory Usage of Strings

2011-03-16 Thread eryksun ()
On Wednesday, March 16, 2011 2:20:34 PM UTC-4, Amit Dev wrote: > > sum(map(len, l)) => 8200 for 1st case and 9100 for 2nd case. > Roughly 100MB as I mentioned. The two lists used approximately the same memory in my test with Python 2.6.6 on Windows. An implementation detail such as this

Re: Memory Usage of Strings

2011-03-16 Thread Santoso Wijaya
Python 2.7.1 (r271:86832, Nov 27 2010, 17:19:03) [MSC v.1500 64 bit (AMD64)] on win32 Type "help", "copyright", "credits" or "license" for more information. >>> import sys >>> L = [] >>> for i in xrange(10): ... L.append(str(i) * (1000 / len(str(i ... >>> sys.getsizeof(L) 824464 >>> L =

Re: Memory Usage of Strings

2011-03-16 Thread Amit Dev
sum(map(len, l)) => 8200 for 1st case and 9100 for 2nd case. Roughly 100MB as I mentioned. On Wed, Mar 16, 2011 at 11:21 PM, John Gordon wrote: > In Amit Dev > writes: > >> I'm observing a strange memory usage pattern with strings. Consider >> the following session. Idea is to create
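A scaled-down reconstruction of the session being measured — a list whose cumulative character count is fixed, checked with the same sum(map(len, l)) idiom:

```python
import sys

# 100 entries of ~1000 characters each instead of the original 100MB.
l = []
for i in range(100):
    s = str(i)
    l.append(s * (1000 // len(s)))

total_chars = sum(map(len, l))
# Note getsizeof(l) covers only the list's own pointer array, not the
# string objects it references.
list_size = sys.getsizeof(l)
```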

Re: Memory Usage of Strings

2011-03-16 Thread John Gordon
In Amit Dev writes: > I'm observing a strange memory usage pattern with strings. Consider > the following session. Idea is to create a list which holds some > strings so that cumulative characters in the list is 100MB. > >>> l = [] > >>> for i in xrange(10): > ... l.append(str(i) * (1000/

Re: memory usage, temporary and otherwise

2010-03-04 Thread Steve Holden
Duncan Booth wrote: > mk wrote: > >> Hm, apparently Python didn't spot that 'spam'*10 in a's values is really >> the same string, right? > > If you want it to spot that then give it a hint that it should be looking > for identical strings: > > >>> a={} > >>> for i in range(1000): > ...

Re: memory usage, temporary and otherwise

2010-03-04 Thread Terry Reedy
On 3/4/2010 6:56 AM, mk wrote: Bruno Desthuilliers wrote: Huh? I was under impression that some time after 2.0 range was made to work "under the covers" like xrange when used in a loop? Or is it 3.0 that does that? 3.0.

Re: memory usage, temporary and otherwise

2010-03-04 Thread lbolla
On Mar 4, 12:24 pm, Duncan Booth wrote: > >  >>> a={} >  >>> for i in range(1000): > ...     a[i]=intern('spam'*10) > "intern": another name borrowed from Lisp?

Re: memory usage, temporary and otherwise

2010-03-04 Thread Duncan Booth
mk wrote: > Hm, apparently Python didn't spot that 'spam'*10 in a's values is really > the same string, right? If you want it to spot that then give it a hint that it should be looking for identical strings: >>> a={} >>> for i in range(1000): ... a[i]=intern('spam'*10) should reduc
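Duncan's hint in Python 3 terms: intern() moved to sys.intern(). Building the string from a runtime variable below stops CPython's constant folding from deduplicating 'spam'*10 at compile time, so the collapse really is intern's doing:

```python
import sys

word = "spam"
a = {}
for i in range(1000):
    # Each word * 10 builds a fresh string; sys.intern maps every
    # equal string to one shared object.
    a[i] = sys.intern(word * 10)

first = a[0]
all_one_object = all(value is first for value in a.values())
```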

Re: memory usage, temporary and otherwise

2010-03-04 Thread mk
Bruno Desthuilliers wrote: mk wrote: Obviously, don't try this on low-memory machine: a={} for i in range(1000): Note that in Python 2, this will build a list of 1000 int objects. You may want to use xrange instead... Huh? I was under impression that some time after 2.0 range w

Re: memory usage, temporary and otherwise

2010-03-03 Thread Bruno Desthuilliers
Bruno Desthuilliers wrote: > mk wrote: (snip) >> So sys.getsizeof returns some 200MB for this dictionary. But according >> to top RSS of the python process is 300MB. ps auxw says the same thing >> (more or less). >> >> Why the 50% overhead? Oh, and yes - the interpreter itself, the builtins
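Part of the getsizeof-versus-RSS gap is that sys.getsizeof is shallow: for a dict it reports only the hash table of pointers, not the keys and values it references. A small sketch of the difference:

```python
import sys

a = {i: "spam" * 10 for i in range(1000)}

shallow = sys.getsizeof(a)
# Naive deep sum: add the sizes of keys and values. (This double-counts
# shared objects, but shows how much getsizeof alone leaves out.)
deep = shallow + sum(
    sys.getsizeof(k) + sys.getsizeof(v) for k, v in a.items()
)
```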

Re: memory usage, temporary and otherwise

2010-03-03 Thread Bruno Desthuilliers
mk wrote: > > Obviously, don't try this on low-memory machine: > a={} for i in range(1000): Note that in Python 2, this will build a list of 1000 int objects. You may want to use xrange instead... > ... a[i]='spam'*10 > ... import sys sys.getsizeof(a) > 201326

Re: Memory usage problem of twisted server

2010-01-21 Thread Dieter Maurer
Victor Lin writes on Wed, 20 Jan 2010 02:52:25 -0800 (PST): > Hi, > > I encountered an increasing memory usage problem of my twisted server. > I have posted a question on stackoverflow: > http://stackoverflow.com/questions/2100192/how-to-find-the-source-of-increasing-memory-usage-of-a-twisted-ser

Re: Memory usage of an 'empty' python interpreter

2006-08-16 Thread John Machin
Ant wrote: > > > Are you sure ps is reporting in bytes not KB? The bare interpreter in > > > Windows is 3368KB. > > > > Where did you get that from? With Python 2.4.3, on my machine (Win XP > > SP2): > > > > C:\junk>dir \python24\python* > > [snip] > > 29/03/2006 05:35 PM 4,608 python

Re: Memory usage of an 'empty' python interpreter

2006-08-16 Thread Ant
> > Are you sure ps is reporting in bytes not KB? The bare interpreter in > > Windows is 3368KB. > > Where did you get that from? With Python 2.4.3, on my machine (Win XP > SP2): > > C:\junk>dir \python24\python* > [snip] > 29/03/2006 05:35 PM 4,608 python.exe > 29/03/2006 05:35 PM

Re: Memory usage of an 'empty' python interpreter

2006-08-16 Thread John Machin
Ant wrote: > [EMAIL PROTECTED] wrote: > > I was wondering what the approximate amount of memory needed to load a > > Python interpreter (only, no objects, no scripts, no nothing else) in a > > Linux 2.6 environment. According to ps, it appears to be 3312 bytes, > > which seems absurdly low to me.

Re: Memory usage of an 'empty' python interpreter

2006-08-16 Thread [EMAIL PROTECTED]
[EMAIL PROTECTED] wrote: > I was wondering what the approximate amount of memory needed to load a > Python interpreter (only, no objects, no scripts, no nothing else) in a > Linux 2.6 environment. According to ps, it appears to be 3312 bytes, > which seems absurdly low to me. Your spidey sense is

Re: Memory usage of an 'empty' python interpreter

2006-08-16 Thread Ganesan Rajagopal
> neokosmos <[EMAIL PROTECTED]> writes: > I was wondering what the approximate amount of memory needed to load a > Python interpreter (only, no objects, no scripts, no nothing else) in a > Linux 2.6 environment. According to ps, it appears to be 3312 bytes, > which seems absurdly low to me.

Re: Memory usage of an 'empty' python interpreter

2006-08-16 Thread Ant
[EMAIL PROTECTED] wrote: > I was wondering what the approximate amount of memory needed to load a > Python interpreter (only, no objects, no scripts, no nothing else) in a > Linux 2.6 environment. According to ps, it appears to be 3312 bytes, > which seems absurdly low to me. However, when I che

Re: memory usage of a specific function

2006-01-05 Thread Stephen Kellett
In message <[EMAIL PROTECTED]>, Sverker Nilsson <[EMAIL PROTECTED]> writes >> Python Memory Validator. >> >> Run your program to completion. >> Switch to the hotspots tab. >> Search for your function. >> All memory used in that function will be shown in the tree (with the >> effective callstack) un

Re: memory usage of a specific function

2006-01-04 Thread Sverker Nilsson
Stephen Kellett wrote: > In message <[EMAIL PROTECTED]>, > > Sverker Nilsson <[EMAIL PROTECTED]> writes > [Note that actually it was Hermann Maier that wrote the following but as quoted, it may look like it was I that wrote it.] > >> i need to find out the memory usage of a specific function th

Re: memory usage of a specific function

2006-01-04 Thread Stephen Kellett
In message <[EMAIL PROTECTED]>, Sverker Nilsson <[EMAIL PROTECTED]> writes >> i need to find out the memory usage of a specific function that i use in >> my program. this function does some recursive calculations and i want my >> program to display the amount of memory the function used to calcula

Re: memory usage of a specific function

2006-01-04 Thread Sverker Nilsson
Hermann Maier wrote: > hi, > > i need to find out the memory usage of a specific function that i use in > my program. this function does some recursive calculations and i want my > program to display the amount of memory the function used to calculate a > specific value. > > thx I was thinking th
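The 2006 thread predates it, but modern Python can answer Hermann's question directly with the standard tracemalloc module, which reports allocations made while a specific function runs (the recursive function here is illustrative):

```python
import tracemalloc

def recursive_sum(n):
    if n == 0:
        return 0
    return n + recursive_sum(n - 1)

# Measure only what this call allocates.
tracemalloc.start()
result = recursive_sum(100)
current, peak = tracemalloc.get_traced_memory()
tracemalloc.stop()
```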

Re: Memory Usage

2005-01-25 Thread Nick Coghlan
Stuart McGarrity wrote: Do you have a page file? The Mem column should be RAM usage and not total working set. Some of it could be swapped to the page file. A free tool like process explorer can give you better information than the task manager. As Tim pointed out, "View->Select Columns" and activ

Re: Memory Usage

2005-01-24 Thread Stephen Kellett
In message <[EMAIL PROTECTED]>, rbt <[EMAIL PROTECTED]> writes >That's right. I look at that column. Should I measure mem usage in some >other way? Try VM Validator, a free memory visualization tool from Software Verification. http://www.softwareverify.com

Re: Memory Usage

2005-01-24 Thread Tim Peters
[<[EMAIL PROTECTED]>] > Would a Python process consume more memory on a PC with lots of > memory? > > For example, say I have the same Python script running on two WinXP > computers that both have Python 2.4.0. One computer has 256 MB of Ram > while the other has 2 GB of Ram. On the machine with le

Re: Memory Usage

2005-01-24 Thread Stuart McGarrity
Do you have a page file? The Mem column should be RAM usage and not total working set. Some of it could be swapped to the page file. A free tool like process explorer can give you better information than the task manager. "rbt" <[EMAIL PROTECTED]> wrote in message news:[EMAIL PROTECTED] > Would

Re: Memory Usage

2005-01-24 Thread Peter Hansen
rbt wrote: Peter Hansen wrote: I would expect to see such behaviour, given how difficult it is to measure *actual* memory usage. How are you measuring it? Just by looking at the Mem Usage column in the Task Manager? That's right. I look at that column. Should I measure mem usage in some other way?

Re: Memory Usage

2005-01-24 Thread Fredrik Lundh
"rbt" wrote: > For example, say I have the same Python script running on two WinXP computers > that both have > Python 2.4.0. One computer has 256 MB of Ram while the other has 2 GB of Ram. > On the machine with > less Ram, the process takes about 1 MB of Ram. On the machine with more Ram, >

Re: Memory Usage

2005-01-24 Thread rbt
Peter Hansen wrote: rbt wrote: Would a Python process consume more memory on a PC with lots of memory? For example, say I have the same Python script running on two WinXP computers that both have Python 2.4.0. One computer has 256 MB of Ram while the other has 2 GB of Ram. On the machine with les

Re: Memory Usage

2005-01-24 Thread Peter Hansen
rbt wrote: Would a Python process consume more memory on a PC with lots of memory? For example, say I have the same Python script running on two WinXP computers that both have Python 2.4.0. One computer has 256 MB of Ram while the other has 2 GB of Ram. On the machine with less Ram, the process