Re: getting memory usage of variables

2017-05-03 Thread Ben Finney
Erik writes: > The thing about functions or classes is that you can't (at the literal > source level) define them *without* giving them a name: Even a function is commonly defined without giving it a name. >>> strategies = [ ... (lambda x: x + 2), ... (lambda x: x ** 3),

Re: getting memory usage of variables

2017-05-03 Thread Ben Finney
Larry Martell writes: > On Wed, May 3, 2017 at 12:57 AM, Chris Angelico wrote: > > Coming right back to the beginning here: What do you expect the name > > of an object to be? > > The name of the variable in the program, e.g. sql, db_conn, rows, etc. That assumes that the object has exactly one

Re: getting memory usage of variables

2017-05-03 Thread Terry Reedy
On 5/3/2017 6:21 PM, Larry Martell wrote: On Wed, May 3, 2017 at 6:15 PM, Terry Reedy wrote: Python already uses this trick for functions, classes, and modules by giving them a .__name__ attribute. Code objects have a .co_name attribute. These are used for tracing and tracebacks. I left out
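
A quick illustration of the attributes mentioned above (standard CPython attributes; the toy function and class here are my own):

    def greet():
        pass

    class Widget(object):
        pass

    print(greet.__name__)          # 'greet' - fixed by the def statement
    print(greet.__code__.co_name)  # 'greet' - the code object's own name
    print(Widget.__name__)         # 'Widget' - set by the class statement

    alias = greet                  # a second binding...
    print(alias.__name__)          # ...still reports 'greet', not 'alias'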

Re: getting memory usage of variables

2017-05-03 Thread Chris Angelico
On Thu, May 4, 2017 at 10:32 AM, Ned Batchelder wrote: > On Wednesday, May 3, 2017 at 8:09:59 PM UTC-4, Steve D'Aprano wrote: >> On Thu, 4 May 2017 09:30 am, Ned Batchelder wrote: >> >> > Functions, classes, and modules can also be referred to by a number of >> > variables: >> > >> > def foo()

Re: getting memory usage of variables

2017-05-03 Thread Ned Batchelder
On Wednesday, May 3, 2017 at 8:09:59 PM UTC-4, Steve D'Aprano wrote: > On Thu, 4 May 2017 09:30 am, Ned Batchelder wrote: > > > Functions, classes, and modules can also be referred to by a number of > > variables: > > > > def foo(): pass > > bar = baz = foo > > > > But functions (by virt

Re: getting memory usage of variables

2017-05-03 Thread Steve D'Aprano
On Thu, 4 May 2017 09:30 am, Ned Batchelder wrote: > Functions, classes, and modules can also be referred to by a number of > variables: > > def foo(): pass > bar = baz = foo > > But functions (by virtue of the name in the def statement) have an > inherent name, Indeed; but we also hav

Re: getting memory usage of variables

2017-05-03 Thread Ned Batchelder
On Wednesday, May 3, 2017 at 6:22:28 PM UTC-4, larry@gmail.com wrote: > On Wed, May 3, 2017 at 6:15 PM, Terry Reedy wrote: > > On 5/3/2017 8:40 AM, Larry Martell wrote: > >> > >> On Wed, May 3, 2017 at 8:29 AM, Chris Angelico wrote: > >>> > >>> On Wed, May 3, 2017 at 10:12 PM, Larry Martell

Re: getting memory usage of variables

2017-05-03 Thread Erik
On 03/05/17 23:21, Larry Martell wrote: But not for a variable like a list or dict? What name should "[1, 2, 3]", or "{1: 'a', 2: 'b'}" be given? The thing about functions or classes is that you can't (at the literal source level) define them *without* giving them a name: def func(): pass c
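
Since a list or dict carries no name of its own, the closest substitute is to search a namespace for bindings that currently refer to the object. A minimal sketch (the helper and its use of globals() are illustrative inventions, not something from the thread):

    def names_bound_to(obj, namespace=None):
        # Return every name in `namespace` currently bound to exactly this object.
        if namespace is None:
            namespace = globals()
        return [name for name, value in namespace.items() if value is obj]

    data = [1, 2, 3]
    alias = data
    print(names_bound_to(data))   # ['data', 'alias'] in some order - or one name, or many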

Re: getting memory usage of variables

2017-05-03 Thread Larry Martell
On Wed, May 3, 2017 at 6:15 PM, Terry Reedy wrote: > On 5/3/2017 8:40 AM, Larry Martell wrote: >> >> On Wed, May 3, 2017 at 8:29 AM, Chris Angelico wrote: >>> >>> On Wed, May 3, 2017 at 10:12 PM, Larry Martell >>> wrote: On Wed, May 3, 2017 at 12:57 AM, Chris Angelico wrote:

Re: getting memory usage of variables

2017-05-03 Thread Terry Reedy
On 5/3/2017 8:40 AM, Larry Martell wrote: On Wed, May 3, 2017 at 8:29 AM, Chris Angelico wrote: On Wed, May 3, 2017 at 10:12 PM, Larry Martell wrote: On Wed, May 3, 2017 at 12:57 AM, Chris Angelico wrote: On Wed, May 3, 2017 at 5:53 AM, Larry Martell wrote: And I can see it getting larger

Re: getting memory usage of variables

2017-05-03 Thread Ned Batchelder
02/05/17 23:28, Larry Martell wrote: > >> >>>> > >> >>>> Anyone have any thoughts on how I can monitor the variables' memory > >> >>>> usage as the script runs? > >> >>> > >> >>> > >> >

Re: getting memory usage of variables

2017-05-03 Thread Larry Martell
On Wed, May 3, 2017 at 8:29 AM, Chris Angelico wrote: > On Wed, May 3, 2017 at 10:12 PM, Larry Martell > wrote: >> On Wed, May 3, 2017 at 12:57 AM, Chris Angelico wrote: >>> On Wed, May 3, 2017 at 5:53 AM, Larry Martell >>> wrote: And I can see it getting larger and larger. But I want to

Re: getting memory usage of variables

2017-05-03 Thread Larry Martell
e have any thoughts on how I can monitor the variables' memory >> >>>> usage as the script runs? >> >>> >> >>> >> >>> This is application-specific, but sometimes it helps to look at the >> >>> objects' types, or

Re: getting memory usage of variables

2017-05-03 Thread Chris Angelico
On Wed, May 3, 2017 at 10:12 PM, Larry Martell wrote: > On Wed, May 3, 2017 at 12:57 AM, Chris Angelico wrote: >> On Wed, May 3, 2017 at 5:53 AM, Larry Martell >> wrote: >>> And I can see it getting larger and larger. But I want to see what it >>> is that is causing this. My thought was to put

Re: getting memory usage of variables

2017-05-03 Thread Ned Batchelder
On Tuesday, May 2, 2017 at 11:49:37 PM UTC-4, larry@gmail.com wrote: > On Tue, May 2, 2017 at 7:01 PM, Erik wrote: > > On 02/05/17 23:28, Larry Martell wrote: > >>>> > >>>> Anyone have any thoughts on how I can monitor the variables'

Re: getting memory usage of variables

2017-05-03 Thread Larry Martell
On Wed, May 3, 2017 at 12:57 AM, Chris Angelico wrote: > On Wed, May 3, 2017 at 5:53 AM, Larry Martell wrote: >> And I can see it getting larger and larger. But I want to see what it >> is that is causing this. My thought was to put all the objects in a >> dict with their sizes and compare them a

Re: getting memory usage of variables

2017-05-02 Thread INADA Naoki
>>>> Anyone have any thoughts on how I can monitor the variables' memory > >>>> usage as the script runs? > >>> > >>> > >>> This is application-specific, but sometimes it helps to look at the > >>> objects' type

Re: getting memory usage of variables

2017-05-02 Thread Chris Angelico
On Wed, May 3, 2017 at 5:53 AM, Larry Martell wrote: > And I can see it getting larger and larger. But I want to see what it > is that is causing this. My thought was to put all the objects in a > dict with their sizes and compare them as the program runs and report > on the one that are growing.
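
A sketch of the snapshot-and-compare approach described above, keyed by id() since gc.get_objects() yields only the objects themselves; all names here are hypothetical:

    import gc, sys

    def snapshot():
        # Map id -> (type name, shallow size) for every object the collector tracks.
        return dict((id(o), (type(o).__name__, sys.getsizeof(o, 0)))
                    for o in gc.get_objects())

    before = snapshot()
    blob = ["x" * 100 for i in range(50000)]    # stand-in for real growth
    after = snapshot()

    new_or_grown = [info for oid, info in after.items()
                    if oid not in before or before[oid][1] < info[1]]
    print("%d new or grown objects" % len(new_or_grown))

The limitation the thread keeps circling is visible here: an id() maps back to a name only if you go searching the namespaces yourself.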

Re: getting memory usage of variables

2017-05-02 Thread Larry Martell
On Tue, May 2, 2017 at 7:01 PM, Erik wrote: > On 02/05/17 23:28, Larry Martell wrote: >>>> >>>> Anyone have any thoughts on how I can monitor the variables' memory >>>> usage as the script runs? >>> >>> >>> This is application

Re: getting memory usage of variables

2017-05-02 Thread Erik
On 02/05/17 23:28, Larry Martell wrote: Anyone have any thoughts on how I can monitor the variables' memory usage as the script runs? This is application-specific, but sometimes it helps to look at the objects' types, or even their values. The types are dict and list, so they ar
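
One way to follow that advice is a periodic census of live objects by type, watching which counts climb between checks; a minimal sketch:

    import gc
    from collections import Counter

    def type_census():
        # Count every object the garbage collector tracks, grouped by type name.
        return Counter(type(o).__name__ for o in gc.get_objects())

    baseline = type_census()
    leak = [{"n": i} for i in range(10000)]     # simulate growth
    for name, delta in (type_census() - baseline).most_common(5):
        print("%s +%d" % (name, delta))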

Re: getting memory usage of variables

2017-05-02 Thread Larry Martell
that is causing this. My thought was to put all the objects in a >> dict with their sizes and compare them as the program runs and report >> on the one that are growing. But I can't get the name of the object >> from gc.get_objects only the id. >> >> Anyone h

Re: getting memory usage of variables

2017-05-02 Thread breamoreboy
their sizes and compare them as the program runs and report > on the one that are growing. But I can't get the name of the object > from gc.get_objects only the id. > > Anyone have any thoughts on how I can monitor the variables' memory > usage as the script runs? H

Re: getting memory usage of variables

2017-05-02 Thread Dan Stromberg
ompare them as the program runs and report > on the one that are growing. But I can't get the name of the object > from gc.get_objects only the id. > > Anyone have any thoughts on how I can monitor the variables' memory > usage as the script runs? This is application-specif

getting memory usage of variables

2017-05-02 Thread Larry Martell
the object from gc.get_objects only the id. Anyone have any thoughts on how I can monitor the variables' memory usage as the script runs? -- https://mail.python.org/mailman/listinfo/python-list

Re: Memory usage steadily going up while pickling objects

2013-06-15 Thread dieter
Giorgos Tzampanakis writes: > ... > So it seems that the pickle module does keep some internal cache or > something like that. This is highly unlikely: the "ZODB" (Zope object database) uses pickle (actually, it is "cPickle", the "C" implementation of the "pickle" module) for serialization. The "

Re: Memory usage steadily going up while pickling objects

2013-06-15 Thread Giorgos Tzampanakis
On 2013-06-15, Peter Otten wrote: > Giorgos Tzampanakis wrote: > >> So it seems that the pickle module does keep some internal cache or >> something like that. > > I don't think there's a global cache. The Pickler/Unpickler has a per- > instance cache (the memo dict) that you can clear with the c

Re: Memory usage steadily going up while pickling objects

2013-06-15 Thread Peter Otten
Giorgos Tzampanakis wrote: > So it seems that the pickle module does keep some internal cache or > something like that. I don't think there's a global cache. The Pickler/Unpickler has a per- instance cache (the memo dict) that you can clear with the clear_memo() method, but that doesn't matter
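
The memo is what lets shared references pickle only once, but it also pins every object the Pickler has seen. A sketch of periodic clearing (the file name and interval are arbitrary):

    import pickle

    with open("records.pkl", "wb") as out:
        pickler = pickle.Pickler(out, 2)        # protocol 2
        for i in range(1000000):
            pickler.dump({"id": i, "payload": "x" * 64})
            if i % 10000 == 0:
                pickler.clear_memo()            # release references to already-pickled objects

The tradeoff: objects dumped after a clear are written again in full, so reference sharing only holds between clears.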

Re: Memory usage steadily going up while pickling objects

2013-06-15 Thread Giorgos Tzampanakis
effectively unusable, but this is >> another discussion). >> >> The process takes about 10-15 minutes. During that time I see memory usage >> steadily rising, sometimes resulting in a MemoryError. Now, there is a >> chance that my code is keeping unneeded references to

Re: Memory usage steadily going up while pickling objects

2013-06-15 Thread Giorgos Tzampanakis
sable, but this is >> another discussion). >> >> The process takes about 10-15 minutes. During that time I see memory usage >> steadily rising, sometimes resulting in a MemoryError. Now, there is a >> chance that my code is keeping unneeded references to the stored obje

Re: Memory usage steadily going up while pickling objects

2013-06-14 Thread Peter Otten
process takes about 10-15 minutes. During that time I see memory usage > steadily rising, sometimes resulting in a MemoryError. Now, there is a > chance that my code is keeping unneeded references to the stored objects, > but I have debugged it thoroughly and haven't found any. >

Re: Memory usage steadily going up while pickling objects

2013-06-14 Thread Dave Angel
ocess takes about 10-15 minutes. During that time I see memory usage steadily rising, sometimes resulting in a MemoryError. Now, there is a chance that my code is keeping unneeded references to the stored objects, but I have debugged it thoroughly and haven't found any. So I'm beginning

Memory usage steadily going up while pickling objects

2013-06-14 Thread Giorgos Tzampanakis
I see memory usage steadily rising, sometimes resulting in a MemoryError. Now, there is a chance that my code is keeping unneeded references to the stored objects, but I have debugged it thoroughly and haven't found any. So I'm beginning to suspect that the pickle module might be keeping an in

Re: Memory usage per top 10x usage per heapy

2012-09-27 Thread bryanjugglercryptographer
MrsEntity wrote: > Based on heapy, a db based solution would be serious overkill. I've embraced overkill and my life is better for it. Don't confuse overkill with cost. Overkill is your friend. The facts of the case: You need to save some derived strings for each of 2M input lines. Even half th
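
In that spirit, the standard library's sqlite3 module gives an on-disk store in a handful of lines; a sketch under invented file names and a made-up derivation:

    import sqlite3

    conn = sqlite3.connect("derived.db")        # lives on disk, not in RAM
    conn.execute("CREATE TABLE IF NOT EXISTS lines"
                 " (lineno INTEGER PRIMARY KEY, derived TEXT)")
    with open("input.txt") as src:
        rows = ((n, line.strip().upper()) for n, line in enumerate(src))
        conn.executemany("INSERT OR REPLACE INTO lines VALUES (?, ?)", rows)
    conn.commit()
    conn.close()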

Re: Memory usage per top 10x usage per heapy

2012-09-25 Thread Oscar Benjamin
On 26 September 2012 00:35, Tim Chase wrote: > On 09/25/12 17:55, Oscar Benjamin wrote: > > On 25 September 2012 23:10, Tim Chase wrote: > >> If tuples provide a savings but you find them opaque, you might also > >> consider named-tuples for clarity. > > >

Re: Memory usage per top 10x usage per heapy

2012-09-25 Thread Tim Chase
On 09/25/12 17:55, Oscar Benjamin wrote: > On 25 September 2012 23:10, Tim Chase wrote: >> If tuples provide a savings but you find them opaque, you might also >> consider named-tuples for clarity. > > Do they have the same memory usage? > > Since tuples don't h

Re: Memory usage per top 10x usage per heapy

2012-09-25 Thread Oscar Benjamin
vide a savings but you find them opaque, you might also > consider named-tuples for clarity. > Do they have the same memory usage? Since tuples don't have a per-instance __dict__, I'd expect them to be a lot lighter. I'm not sure if I'm interpreting the results below prop
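
The question is easy to settle empirically: a namedtuple instance is a real tuple with no per-instance __dict__, so getsizeof reports the same figure for both (a quick check; exact numbers vary by Python version and platform):

    import sys
    from collections import namedtuple

    Point = namedtuple("Point", "x y z")

    class PointObj(object):
        def __init__(self, x, y, z):
            self.x, self.y, self.z = x, y, z

    t, nt, po = (1, 2, 3), Point(1, 2, 3), PointObj(1, 2, 3)
    print(sys.getsizeof(t))       # plain tuple
    print(sys.getsizeof(nt))      # identical: same C layout as the tuple
    print(sys.getsizeof(po) + sys.getsizeof(po.__dict__))   # instance plus its dict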

Re: Memory usage per top 10x usage per heapy

2012-09-25 Thread Oscar Benjamin
On 25 September 2012 23:09, Ian Kelly wrote: > On Tue, Sep 25, 2012 at 12:17 PM, Oscar Benjamin > wrote: > > Also I think lambda functions might be able to keep the frame alive. Are > > they by any chance being created in a function that is called in a loop? > > I'm pretty sure they don't. Clos

Re: Memory usage per top 10x usage per heapy

2012-09-25 Thread Ian Kelly
On Tue, Sep 25, 2012 at 12:17 PM, Oscar Benjamin wrote: > Also I think lambda functions might be able to keep the frame alive. Are > they by any chance being created in a function that is called in a loop? I'm pretty sure they don't. Closures don't keep a reference to the calling frame, only to
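
The claim is easy to verify by inspecting a closure's cells directly: only the captured variable survives, not the rest of the frame. A small sketch:

    def make_adder(n):
        big = list(range(1000000))   # local, but not referenced by the inner function
        def add(x):
            return x + n             # only `n` is captured in a cell
        return add

    f = make_adder(5)
    print([c.cell_contents for c in f.__closure__])   # [5]; `big` was freed on return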

Re: Memory usage per top 10x usage per heapy

2012-09-25 Thread Tim Chase
On 09/25/12 16:17, Oscar Benjamin wrote: > I don't know whether it would be better or worse but it might be > worth seeing what happens if you replace the FileContext objects > with tuples. If tuples provide a savings but you find them opaque, you might also consider named-tuples for clarity. -tk

Re: Memory usage per top 10x usage per heapy

2012-09-25 Thread Junkshops
On 9/25/2012 2:17 PM, Oscar Benjamin wrote: I don't know whether it would be better or worse but it might be worth seeing what happens if you replace the FileContext objects with tuples. I originally used a string, and it was slightly better since you don't have the object overhead, but I wanted

Re: Memory usage per top 10x usage per heapy

2012-09-25 Thread Oscar Benjamin
On 25 September 2012 21:26, Junkshops wrote: > On 9/25/2012 11:17 AM, Oscar Benjamin wrote: > > On 25 September 2012 19:08, Junkshops wrote: > >> >> In [38]: mpef._ustore._store >> Out[38]: defaultdict(, {'Measurement': >> {'8991c2dc67a49b909918477ee4efd767': >> , >> '7b38b429230f00fe4731e60419

Re: Memory usage per top 10x usage per heapy

2012-09-25 Thread Junkshops
On 9/25/2012 11:50 AM, Dave Angel wrote: I suspect that heapy has some limitation in its reporting, and that's what the discrepancy is. That would be my first suspicion as well - except that heapy's results agree so well with what I expect, and I can't think of any reason I'd be using 10x more m

Re: Memory usage per top 10x usage per heapy

2012-09-25 Thread Junkshops
On 9/25/2012 11:17 AM, Oscar Benjamin wrote: On 25 September 2012 19:08, Junkshops wrote: In [38]: mpef._ustore._store Out[38]: defaultdict(, {'Measurement': {'8991c2dc67a49b909918477ee4efd767': , '7b38b429230f00fe4731e60419e92346': , 'b53531471

Re: Memory usage per top 10x usage per heapy

2012-09-25 Thread Dave Angel
On 09/25/2012 01:39 PM, Junkshops wrote: Procedural point: I know you're trying to conform to the standard that this mailing list uses, but you're off a little, and it's distracting. It's also probably more work for you, and certainly for us. You need an attribution in front of the quoted portio

Re: Memory usage per top 10x usage per heapy

2012-09-25 Thread Oscar Benjamin
On 25 September 2012 19:08, Junkshops wrote: > > Can you give an example of how these data structures look after reading > only the first 5 lines? > > Sure, here you go: > > In [38]: mpef._ustore._store > Out[38]: defaultdict(, {'Measurement': > {'8991c2dc67a49b909918477ee4efd767': > , > '7b38b4

Re: Memory usage per top 10x usage per heapy

2012-09-25 Thread Junkshops
Can you give an example of how these data structures look after reading only the first 5 lines? Sure, here you go: In [38]: mpef._ustore._store Out[38]: defaultdict(, {'Measurement': {'8991c2dc67a49b909918477ee4efd767': , '7b38b429230f00fe4731e60419e92346': , 'b53531471b261c44d52f651add64

Re: Memory usage per top 10x usage per heapy

2012-09-25 Thread Junkshops
I'm a bit surprised you aren't beyond the 2gb limit, just with the structures you describe for the file. You do realize that each object has quite a few bytes of overhead, so it's not surprising to use several times the size of a file, to store the file in an organized way. I did some back of the
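
sys.getsizeof makes that back-of-the-envelope arithmetic concrete (CPython; the figures differ across versions and 32- vs 64-bit builds):

    import sys

    print(sys.getsizeof(""))        # an empty string already costs tens of bytes
    print(sys.getsizeof("abc"))     # three characters cost far more than 3 bytes
    print(sys.getsizeof({}))        # an empty dict
    print(sys.getsizeof((1, 2)))    # a 2-tuple, excluding the ints it points to
    # Container sizes are shallow: the referenced objects are billed separately.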

Re: gracious responses (was: Memory usage per top 10x usage per heapy)

2012-09-25 Thread alex23
On Sep 25, 9:39 pm, Tim Chase wrote: > Mostly instigated by one person with a > particularly quick trigger, vitriolic tongue, and a disregard for > pythonic code. I'm sorry. I'll get me coat. -- http://mail.python.org/mailman/listinfo/python-list

Re: gracious responses (was: Memory usage per top 10x usage per heapy)

2012-09-25 Thread Tim Chase
On 09/25/12 06:10, Mark Lawrence wrote: > On 25/09/2012 11:51, Tim Chase wrote: >> If only other unnamed persons on the list were so gracious rather >> than turning the flame-dial to 11. >> > > Oh heck what have I said this time? You'd *like* to take credit? ;-) Nah, not you or any of the regul

Re: Memory usage per top 10x usage per heapy

2012-09-25 Thread Oscar Benjamin
On 25 September 2012 00:58, Junkshops wrote: > Hi Tim, thanks for the response. > > > - check how you're reading the data: are you iterating over >> the lines a row at a time, or are you using >> .read()/.readlines() to pull in the whole file and then >> operate on that? >> > I'm using

Re: Memory usage per top 10x usage per heapy

2012-09-25 Thread Mark Lawrence
On 25/09/2012 11:51, Tim Chase wrote: [snip] If only other unnamed persons on the list were so gracious rather than turning the flame-dial to 11. Oh heck what have I said this time? -tkc -- Cheers. Mark Lawrence. -- http://mail.python.org/mailman/listinfo/python-list

Re: Memory usage per top 10x usage per heapy

2012-09-25 Thread Dave Angel
On 09/25/2012 12:21 AM, Junkshops wrote: >> Just curious; which is it, two million lines, or half a million bytes? > > Sorry, that should've been a 500Mb, 2M line file. > >> which machine is 2gb, the Windows machine, or the VM? > VM. Winders is 4gb. > >> ...but I would point out that just beca

Re: Memory usage per top 10x usage per heapy

2012-09-25 Thread Tim Chase
On 09/24/12 23:41, Dennis Lee Bieber wrote: > On Mon, 24 Sep 2012 14:59:47 -0700 (PDT), MrsEntity > declaimed the following in > gmane.comp.python.general: > >> Hi all, >> >> I'm working on some code that parses a 500kb, 2M line file line by line and >> saves, per line, some derived strings > >

Re: Memory usage per top 10x usage per heapy

2012-09-24 Thread Junkshops
ing on, what additional data would you like? I didn't want swamp everyone with the code and heapy/memory_profiler output but I can do so if it's valuable. 2) How can I diagnose (and hopefully fix) what's causing the massive memory usage when it appears, from heapy, that the code

Re: Memory usage per top 10x usage per heapy

2012-09-24 Thread Dave Angel
> > 1) For those of you kind enough to help me figure out what's going on, what > additional data would you like? I didn't want swamp everyone with the code > and heapy/memory_profiler output but I can do so if it's valuable. > 2) How can I diagnose (and hopefull

Re: Memory usage per top 10x usage per heapy

2012-09-24 Thread Junkshops
arily persist the large quantity of data out to disk in order to keep memory usage lower? That's the thing though - according to heapy, the memory usage *is* low and is more or less what I expect. What I don't understand is why top is reporting such vastly different memory usage. If a m

Re: Memory usage per top 10x usage per heapy

2012-09-24 Thread Tim Chase
can I diagnose (and hopefully fix) what's causing the > massive memory usage when it appears, from heapy, that the code > is performing reasonably? I seem to recall that Python holds on to memory that the VM releases, but that it *should* reuse it later. So you'd get the symptom of the

Memory usage per top 10x usage per heapy

2012-09-24 Thread MrsEntity
estions are: 1) For those of you kind enough to help me figure out what's going on, what additional data would you like? I didn't want swamp everyone with the code and heapy/memory_profiler output but I can do so if it's valuable. 2) How can I diagnose (and hopefully fix) what's c

Re: monotonically increasing memory usage

2011-07-29 Thread Ulrich Eckhardt
Pedro Larroy wrote: > Just crossposting this from stackoverflow: > > http://stackoverflow.com/... > > Any hints? At first I was just too lazy to visit stackoverflow and skipped this posting. Then I thought: Why didn't you include the content, so people can actually answer this question here? T

Re: monotonically increasing memory usage

2011-07-28 Thread Nobody
On Thu, 28 Jul 2011 11:52:25 +0200, Pedro Larroy wrote: > pickling > > Just crossposting this from stackoverflow: > > http://stackoverflow.com/questions/6857006/ > > Any hints? AFAIK, it's because the Pickler object keeps a reference to each object so that pointer-sharing works; if you write t

monotonically increasing memory usage

2011-07-28 Thread Pedro Larroy
Hi pickling Just crossposting this from stackoverflow: http://stackoverflow.com/questions/6857006/python-monotonically-increasing-memory-usage-leak Any hints? Pedro. -- Pedro Larroy Tovar   |    http://pedro.larroy.com/ -- http://mail.python.org/mailman/listinfo/python-list

Re: memory usage multi value hash

2011-04-15 Thread Peter Otten
Terry Reedy wrote: > On 4/14/2011 12:55 PM, Peter Otten wrote: > >> I don't expect that it matters much, but you don't need to sort your data >> if you use a dictionary anyway: > > Which means that one can build the dict line by line, as each is read, > instead of reading the entire file into me

Re: memory usage multi value hash

2011-04-15 Thread Algis Kabaila
is taken as follows: A, 1 B, 3 C, 9 A, 2 B, 4 C, 10 A, 3 C, 11 C, 12 C, 90 C, 34 C, 322 C, 21 The "two in one" program is: #!/usr/bin/env python '''generate.py - Example of reading long two column csv list and sorting. Thread "memory usage multi value hash" '''

Re: memory usage multi value hash

2011-04-14 Thread Terry Reedy
On 4/14/2011 12:55 PM, Peter Otten wrote: I don't expect that it matters much, but you don't need to sort your data if you use a dictionary anyway: Which means that one can build the dict line by line, as each is read, instead of reading the entire file into memory. So it does matter for int

Re: memory usage multi value hash

2011-04-14 Thread Peter Otten
christian wrote: > Hello, > > I'm not very experienced in Python. Is there a way of doing the below more > memory-efficiently, and maybe faster? > I import a 2-column file and then concat, for every unique value in > the first column (key), the values from the second > column. > > So the output is someth

memory usage multi value hash

2011-04-14 Thread christian
Hello, I'm not very experienced in Python. Is there a way of doing the below more memory-efficiently, and maybe faster? I import a 2-column file and then concat, for every unique value in the first column (key), the values from the second column. So the output is something like this: A,1,2,3 B,3,4 C,9,10,
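
A line-at-a-time version with collections.defaultdict needs neither sorting nor the whole file in memory, which is the approach the replies in this thread converge on; a sketch with an invented file name:

    import csv
    from collections import defaultdict

    groups = defaultdict(list)
    with open("data.csv") as src:               # rows like "A, 1"
        for key, value in csv.reader(src):
            groups[key].append(value.strip())

    for key in sorted(groups):                  # sorting only the keys, for output
        print("%s,%s" % (key, ",".join(groups[key])))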

Re: Memory Usage of Strings

2011-03-16 Thread Amit Dev
'|^USER\\>' amdev 6906 0.0 0.1 3508 1424 p0 R+ 9:57PM 0:00.00 egrep \\<6903\\>|^USER\\> (sh) Regards, Amit On Thu, Mar 17, 2011 at 3:21 AM, Dan Stromberg wrote: > > On Wed, Mar 16, 2011 at 8:38 AM, Amit Dev wrote: >> >> I'm observing a strange m

Re: Memory Usage of Strings

2011-03-16 Thread Dan Stromberg
On Wed, Mar 16, 2011 at 8:38 AM, Amit Dev wrote: > I'm observing a strange memory usage pattern with strings. Consider > the following session. Idea is to create a list which holds some > strings so that cumulative characters in the list is 100MB. > > >>> l = []

Re: Memory Usage of Strings

2011-03-16 Thread Terry Reedy
On 3/16/2011 3:51 PM, Santoso Wijaya wrote: Python 2.7.1 (r271:86832, Nov 27 2010, 17:19:03) [MSC v.1500 64 bit (AMD64)] on win32 Type "help", "copyright", "credits" or "license" for more information. >>> import sys >>> L = [] >>> for i in xrange(100000): ... L.append(str(i) * (1000 / len(

Re: Memory Usage of Strings

2011-03-16 Thread eryksun ()
On Wednesday, March 16, 2011 2:20:34 PM UTC-4, Amit Dev wrote: > > sum(map(len, l)) => 8200 for 1st case and 9100 for 2nd case. > Roughly 100MB as I mentioned. The two lists used approximately the same memory in my test with Python 2.6.6 on Windows. An implementation detail such as this

Re: Memory Usage of Strings

2011-03-16 Thread Santoso Wijaya
ap(len, l)) => 8200 for 1st case and 9100 for 2nd case. > Roughly 100MB as I mentioned. > > On Wed, Mar 16, 2011 at 11:21 PM, John Gordon wrote: > > In Amit Dev <amit...@gmail.com> writes: > > > >> I'm observing a strange memory usage pa

Re: Memory Usage of Strings

2011-03-16 Thread Amit Dev
sum(map(len, l)) => 8200 for 1st case and 9100 for 2nd case. Roughly 100MB as I mentioned. On Wed, Mar 16, 2011 at 11:21 PM, John Gordon wrote: > In Amit Dev > writes: > >> I'm observing a strange memory usage pattern with strings. Consider >> the followi

Re: Memory Usage of Strings

2011-03-16 Thread John Gordon
In Amit Dev writes: > I'm observing a strange memory usage pattern with strings. Consider > the following session. Idea is to create a list which holds some > strings so that cumulative characters in the list is 100MB. > >>> l = [] > >>> for i in xrange(1

Memory Usage of Strings

2011-03-16 Thread Amit Dev
I'm observing a strange memory usage pattern with strings. Consider the following session. Idea is to create a list which holds some strings so that cumulative characters in the list is 100MB. >>> l = [] >>> for i in xrange(100000): ... l.append(str(i) * (1000/len(str(
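
Taken together, the test is about 100,000 entries of roughly 1,000 characters each; the following is my reconstruction of the truncated session (Python 2, as in the original):

    l = []
    for i in xrange(100000):
        # each entry is padded to ~1000 characters, so the list holds ~100MB of text
        l.append(str(i) * (1000 / len(str(i))))

    print sum(map(len, l))   # on the order of 100 million characters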

Re: 64 bit memory usage

2010-12-11 Thread Steve Holden
On 12/10/2010 2:03 PM, Rob Randall wrote: > I managed to get my python app past 3GB on a smaller 64 bit machine. > On a test to check memory usage with gc disabled only an extra 6MB was > used. > The figures were 1693MB to 1687MB. > > This is great. > > Thanks again fo

Re: 64 bit memory usage

2010-12-10 Thread Rob Randall
I managed to get my python app past 3GB on a smaller 64 bit machine. On a test to check memory usage with gc disabled only an extra 6MB was used. The figures were 1693MB to 1687MB. This is great. Thanks again for the help. On 10 December 2010 13:54, Rob Randall wrote: > You guys are ri

Re: 64 bit memory usage

2010-12-10 Thread Rob Randall
You guys are right. If I disable the gc it will use all the virtual RAM in my test. The application I have been running these tests for is a port of a program written in a LISP-based tool running on Unix. It does a mass of stress calculations. The port has been written using a python-based toolki

Re: 64 bit memory usage

2010-12-09 Thread John Nagle
On 12/8/2010 10:42 PM, Dennis Lee Bieber wrote: On Wed, 8 Dec 2010 14:44:30 +, Rob Randall declaimed the following in gmane.comp.python.general: I am trying to understand how much memory is available to a 64 bit python process running under Windows XP 64 bit. When I run tests just creating

Re: 64 bit memory usage

2010-12-09 Thread Antoine Pitrou
On Thu, 9 Dec 2010 17:18:58 + Rob Randall wrote: > Basically the process runs at around 1% and it never seems to grow in size > again. > When running the C++ with python app the process slows when a new 'page' is > required but then goes back to 'full' speed. It does this until basically > all

Re: 64 bit memory usage

2010-12-09 Thread Rob Randall
I will give it a try with the garbage collector disabled. On 9 December 2010 17:29, Benjamin Kaplan wrote: > On Thursday, December 9, 2010, Rob Randall wrote: > > But the C++ program using up memory does not slow up. > > It has gone to 40GB without much trouble. > > > > Your C++ program probabl

Re: 64 bit memory usage

2010-12-09 Thread Benjamin Kaplan
On Thursday, December 9, 2010, Rob Randall wrote: > But the C++ program using up memory does not slow up. > It has gone to 40GB without much trouble. > Your C++ program probably doesn't have a garbage collector traversing the entire allocated memory looking for reference cycles. > Does anyone ha
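
Hence the usual workaround when bulk-building large, cycle-free structures: suspend the cycle detector for the duration, since plain reference counting keeps running. A sketch (Python 2 spelling, matching the thread's vintage):

    import gc

    gc.disable()                      # stop the periodic cycle-detection passes
    try:
        table = {}
        for i in xrange(5000000):     # bulk build; no reference cycles created here
            table["key%d" % i] = float(i)
    finally:
        gc.enable()                   # restore normal collection afterwards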

Re: 64 bit memory usage

2010-12-09 Thread John Nagle
On 12/8/2010 11:40 PM, Ian Kelly wrote: Since a process need not have all its pages in physical memory simultaneously, there is no reason to suppose that a single process could not consume the entirety of the available virtual memory (minus what is used by the operating system) on a 64-bit system

Re: 64 bit memory usage

2010-12-09 Thread Rob Randall
Basically the process runs at around 1% and it never seems to grow in size again. When running the C++ with python app the process slows when a new 'page' is required but then goes back to 'full' speed. It does this until basically all the virtual memory is used. I have had memory exceptions when

Re: 64 bit memory usage

2010-12-09 Thread Rob Randall
But the C++ program using up memory does not slow up. It has gone to 40GB without much trouble. Does anyone have a 64 bit python application that uses more than 2GB? On 9 December 2010 16:54, Antoine Pitrou wrote: > On Wed, 8 Dec 2010 14:44:30 + > Rob Randall wrote: > > I am trying to under

Re: 64 bit memory usage

2010-12-09 Thread Antoine Pitrou
On Wed, 8 Dec 2010 14:44:30 + Rob Randall wrote: > I am trying to understand how much memory is available to a 64 bit python > process running under Windows XP 64 bit. > > When I run tests just creating a series of large dictionaries containing > string keys and float values I do not seem to

Re: 64 bit memory usage

2010-12-09 Thread Nobody
Rob Randall wrote: > I am trying to understand how much memory is available to a 64 bit python > process running under Windows XP 64 bit. > > When I run tests just creating a series of large dictionaries containing > string keys and float values I do not seem to be able to grow the process > bey

Re: Re: 64 bit memory usage

2010-12-09 Thread Heather Brown
On 01/-10/-28163 02:59 PM, Dennis Lee Bieber wrote: On Wed, 8 Dec 2010 14:44:30 +, Rob Randall declaimed the following in gmane.comp.python.general: I am trying to understand how much memory is available to a 64 bit python process running under Windows XP 64 bit. When I run tests just crea

Re: 64 bit memory usage

2010-12-08 Thread Ian Kelly
On 12/8/2010 11:42 PM, Dennis Lee Bieber wrote: The page file can be larger than physical memory because it contains memory "images" for multiple processes. However, all those "images" have to map into the physically addressable memory -- so a process is likely limited to physical memory,

64 bit memory usage

2010-12-08 Thread Rob Randall
. For example, on a box with 2GB RAM and 3 GB pagefile the process stalls at around 2GB. On another machine with 16GB RAM and 24GB pagefile the process stalls at 16GB. In other tests where a C++ program loads and runs the python DLL, if C++ based operations are performed the memory usage will grow

Re: Is there any way to minimize str()/unicode() objects memory usage [Python 2.6.4] ?

2010-08-07 Thread Nobody
I don't mean compression. Just optimized for > memory usage, rather than performance. > > What I'm really looking for is a dict() that maps short unicode > strings into tuples with integers. Use UTF-8 encoded strings (str/bytes) as keys rather than unicode objects. -- http:
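
On Python 2 that substitution looks like the sketch below; byte strings carry less per-character and per-object overhead than unicode objects (exact sizes vary by build):

    # Python 2 sketch: encode keys once, store and look up the byte form.
    import sys

    key = u'\u043f\u0440\u0438\u0432\u0435\u0442'   # a short unicode key
    packed = key.encode('utf-8')                    # the compact byte form

    print sys.getsizeof(key), sys.getsizeof(packed)

    d = {}
    d[packed] = (1, 2, 3, 4, 5, 6, 7)               # encode on the way in
    hit = d[key.encode('utf-8')]                    # and again on lookup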

Re: Is there any way to minimize str()/unicode() objects memory usage [Python 2.6.4] ?

2010-08-07 Thread dmtr
I guess with the actual dataset I'll be able to improve the memory usage a bit, with BioPython::trie. That would probably be enough optimization to continue working with some comfort. On this test code BioPython::trie gives a bit of improvement in terms of memory. Not much though... >>

Re: Is there any way to minimize str()/unicode() objects memory usage [Python 2.6.4] ?

2010-08-07 Thread dmtr
> Looking at your benchmark, random.choice(letters) has probably less overhead > than letters[random.randint(...)]. You might even try to inline it as Right... random.choice()... I'm a bit new to python, always something to learn. But anyway in that benchmark (from http://bugs.python.org/issue952

Re: Is there any way to minimize str()/unicode() objects memory usage [Python 2.6.4] ?

2010-08-07 Thread Peter Otten
ried it on the real dataset. On the synthetic test it (and > sys.setcheckinterval(10)) gave ~2% speedup and no change in memory > usage. Not significant. I'll try it on the real dataset though. > > >> while building large datastructures used to speed up things >> significantly. Th

Re: Is there any way to minimize str()/unicode() objects memory usage [Python 2.6.4] ?

2010-08-07 Thread dmtr
Correction. I've copy-pasted it wrong! array.array('i', (i, i+1, i+2, i+3, i+4, i+5, i+6)) was the best. >>> for i in xrange(0, 1000000): d[unicode(i)] = (i, i+1, i+2, i+3, i+4, i+5, >>> i+6) 1000000 keys, ['VmPeak:\t 224704 kB', 'VmSize:\t 224704 kB'], 4.079240 seconds, 245143.698209 keys pe

Re: Is there any way to minimize str()/unicode() objects memory usage [Python 2.6.4] ?

2010-08-07 Thread dmtr
tcheckinterval(10)) gave ~2% speedup and no change in memory usage. Not significant. I'll try it on the real dataset though. > while building large datastructures used to speed up things significantly. > That's what I would try first with your real data. > > Encoding your

Re: Is there any way to minimize str()/unicode() objects memory usage [Python 2.6.4] ?

2010-08-06 Thread Peter Otten
dmtr wrote: >> > Well... 63 bytes per item for very short unicode strings... Is there >> > any way to do better than that? Perhaps some compact unicode objects? >> >> There is a certain price you pay for having full-feature Python objects. > > Are there any *compact* Python objects? Optimized fo

Re: Is there any way to minimize str()/unicode() objects memory usage ?[Python 2.6.4] ?

2010-08-06 Thread garabik-news-2005-05
ur data are write-once, then cdb has excellent performance (but a different API). The file will be usually cached in RAM, so no need to worry about I/O bottlenecks... and if small enough, you can always put it into a ramdisk. If your strings are long enough, you can improve memory usage with a us

Re: Is there any way to minimize str()/unicode() objects memory usage [Python 2.6.4] ?

2010-08-06 Thread dmtr
On Aug 6, 10:56 pm, Michael Torrie wrote: > On 08/06/2010 07:56 PM, dmtr wrote: > > > Ultimately a dict that can store ~20,000,000 entries: (u'short > > string' : (int, int, int, int, int, int, int)). > > I think you really need a real database engine.  With the proper > indexes, MySQL could be ve

Re: Is there any way to minimize str()/unicode() objects memory usage [Python 2.6.4] ?

2010-08-06 Thread Michael Torrie
On 08/06/2010 07:56 PM, dmtr wrote: > Ultimately a dict that can store ~20,000,000 entries: (u'short > string' : (int, int, int, int, int, int, int)). I think you really need a real database engine. With the proper indexes, MySQL could be very fast storing and retrieving this information for you.

Re: Is there any way to minimize str()/unicode() objects memory usage [Python 2.6.4] ?

2010-08-06 Thread Carl Banks
On Aug 6, 6:56 pm, dmtr wrote: > > > Well...  63 bytes per item for very short unicode strings... Is there > > > any way to do better than that? Perhaps some compact unicode objects? > > > There is a certain price you pay for having full-feature Python objects. > > Are there any *compact* Python o
