It's not only division (both on Sage 4.5* and on 4.6, with minor
differences in the memory usage figures):

sage: get_memory_usage()
811.38671875
sage: A=vector(range(0,7000))*(1/2)
sage: get_memory_usage()
3059.31640625
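(For anyone reproducing these numbers outside a Sage session: get_memory_usage() reports the process footprint in MB. A rough plain-Python approximation, assuming a Unix system where ru_maxrss is in kilobytes as on Linux, is something like:)

```python
import resource

def get_memory_usage_mb():
    """Approximate the process's peak memory footprint in MB, similar in
    spirit to Sage's get_memory_usage().  Unix-only; note ru_maxrss is
    reported in kilobytes on Linux (bytes on macOS)."""
    return resource.getrusage(resource.RUSAGE_SELF).ru_maxrss / 1024.0

before = get_memory_usage_mb()
A = [x / 2 for x in range(7000)]  # plain-Python stand-in for the vector
after = get_memory_usage_mb()
print("grew by about %.2f MB" % (after - before))
```

(Peak RSS never decreases, so this measures growth, not current usage, but it is enough to spot blowups like the ones above.)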
On Nov 13, 12:30 pm, Robert Bradshaw <rober...@math.washington.edu> wrote:
> On Fri, Nov 12, 2010 at 7:10 PM, Jason Grout
> <jason-s...@creativetrax.com> wrote:
>> On 11/12/10 8:48 PM, Jason Grout wrote:
>>> On 11/12/10 6:22 PM, Maxim wrote:
>>>> I get very high memory usage when I do something like:
>>>> sage: get_memory_usage()
>>>> -> 809.9453125
>>>> sage: A=vector(range(0,10000))/1
>>>> sage: get_memory_usage()
>>>> -> 5393.2734375
>>>>
>>>> Which is a whopping 4.5GB+ of memory to hold a 10000-float vector...
>>>>
>>>> I would have thought something more along the lines of
>>>> 24 bytes/float * 10000 floats + some overhead for the vector
>>>> object ~= 240KB.
>>
>> The problem seems to stem from the lines
>>
>>   cdef Element x = X.an_element()
>>   cdef Element y = Y.an_element()
>>
>> inside of the detect_element_action function in coerce_actions.pyx.
>> Notice:
>>
>>   sage: v=vector(range(10000))
>>   sage: get_memory_usage()
>>   206.84765625
>>   sage: w=v.parent().an_element()
>>   sage: get_memory_usage()
>>   3321.14453125
>>
>> I'm not sure why the coercion system *has* to construct an element,
>> especially if such an element could potentially be expensive to
>> compute and store.
>
> It only has to construct an element if it can't figure out what to do
> after consulting the Parents themselves.
>
>> And then there's the matter you talk about: why is an element so big?
>
> The example above is quite strange. No idea why it should be so big.
> (Note that these are arbitrary precision integers, so the relative
> overhead for small ones is still quite large.)
>
> As for the original example,
>
>   sage: type(vector(RR, range(100)))
>   <type 'sage.modules.free_module_element.FreeModuleElement_generic_dense'>
>
> which means that each element is a full Python RealNumber, quite a bit
> more than a double. Still doesn't explain the memory usage. For almost
> any kind of linear algebra, you're better off using RDF, or even numpy
> directly.
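Robert's RDF/numpy point can be illustrated in plain Python (a hedged sketch, not Sage code): a list of boxed Python floats pays per-object overhead on every element, while a packed array of C doubles pays 8 bytes each plus a small header, which is the same distinction as a FreeModuleElement_generic_dense over RR versus an RDF or numpy vector.

```python
import sys
from array import array

n = 10000

# A list of boxed Python floats: each element is a full PyObject.
boxed = [float(i) for i in range(n)]
boxed_bytes = sys.getsizeof(boxed) + sum(sys.getsizeof(x) for x in boxed)

# A packed array of C doubles: 8 bytes per element plus a small header.
packed = array('d', range(n))
packed_bytes = sys.getsizeof(packed)

print("boxed:  %d bytes (%.1f per element)" % (boxed_bytes, boxed_bytes / n))
print("packed: %d bytes (%.1f per element)" % (packed_bytes, packed_bytes / n))
```

(This still doesn't account for gigabytes on a 10000-element vector, which is why the an_element() behavior above looks like the real culprit; it only shows why RR-backed vectors are intrinsically heavier than RDF ones.)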
> There's something really fishy going on, as
>
>   sage: get_memory_usage()
>   222.671875
>   sage: A=vector(range(0,1000))
>   sage: get_memory_usage()
>   222.671875
>   sage: A=vector(range(0,1000))/1
>   sage: get_memory_usage()  # barely any more...
>   253.67578125
>   sage: A=vector(range(0,10000))/1
>   # eats up *tons* of memory, swapping like crazy.
>
> - Robert

--
To post to this group, send an email to sage-devel@googlegroups.com
To unsubscribe from this group, send an email to sage-devel+unsubscr...@googlegroups.com
For more options, visit this group at http://groups.google.com/group/sage-devel
URL: http://www.sagemath.org
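One mitigation consistent with the discussion would be to cache the sample element so that action detection pays the construction cost at most once per parent. The sketch below is hypothetical plain Python, not the actual coerce_actions.pyx code; the Parent class and its methods here are stand-ins for illustration only.

```python
class Parent:
    """Stand-in for a Sage parent whose an_element() is assumed expensive."""

    def __init__(self, name):
        self.name = name
        self.constructions = 0      # counts how often we actually build one
        self._cached_element = None

    def _build_element(self):
        self.constructions += 1     # pretend this allocates a huge object
        return ("element of", self.name)

    def an_element(self):
        # Cache the sample element so that repeated action detection
        # does not rebuild (and re-store) it every time.
        if self._cached_element is None:
            self._cached_element = self._build_element()
        return self._cached_element

P = Parent("ZZ^10000")
x = P.an_element()
y = P.an_element()
print(P.constructions)  # built once despite two requests
```

Caching only helps with repeated construction, of course; it would not explain why a single sample element of a free module should be gigabytes in the first place.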