On Tue, 18 Sep 2007 10:58:42 -0300, Christoph Scheit
<[EMAIL PROTECTED]> wrote:

> I have to deal with several million data entries; as an example I'm
> currently trying 360 grid points and 10000 time steps, i.e. 3,600,000
> entries (each row consists of 4 ints and one float).
> Of course, the more keys, the bigger the dictionary, but is there a way
> to evaluate the actual size of the dictionary?

Yes, but you probably should not worry about it; it's just a few bytes of
overhead per entry.
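
If you do want a number: newer Python versions (2.6 and later) provide
sys.getsizeof(); on 2.5 the simplest approach is to watch the process size
before and after filling the dictionary. A minimal sketch using getsizeof
(note it only measures the dict's own hash table, not the keys and values
stored in it):

import sys  # sys.getsizeof is available from Python 2.6 on

d = {}
for i in xrange(360 * 10000):
    # 4 ints and a float per entry, as in your example
    d[i] = (i, i, i, i, float(i))

# Counts only the dict's internal table, not the stored objects.
print sys.getsizeof(d)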
Why don't you use an actual database? SQLite is fast, lightweight, and
comes with Python 2.5 as the sqlite3 module.
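
A rough sketch of how your data could go into a SQLite table (the table
and column names here are made up for illustration, as is the dummy data):

import sqlite3

conn = sqlite3.connect('grid.db')   # or ':memory:' to keep it in RAM
conn.execute("""CREATE TABLE IF NOT EXISTS samples
                (i INTEGER, j INTEGER, k INTEGER, step INTEGER, value REAL)""")

# executemany with a generator is much faster than inserting in a Python loop
rows = ((i, 0, 0, t, 1.5) for i in range(360) for t in range(10))
conn.executemany("INSERT INTO samples VALUES (?, ?, ?, ?, ?)", rows)
conn.commit()

# let the database do the filtering instead of scanning a dict yourself
for row in conn.execute("SELECT * FROM samples WHERE step = ?", (3,)):
    print row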

>> >  # add row i and increment number of rows
>> >  self.rows.append(DBRow(self, self.nRows))
>> >  self.nRows += 1

This looks suspicious: each DBRow keeps a reference back to its container,
so your structure contains reference cycles. Python cannot always reclaim
memory from such cycles, and you may end up using much more memory than
needed.
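
One common way to avoid the back-reference cycle is to hold the parent
through a weak reference. A rough sketch (DBRow/DBTable below are guesses
at your classes, not your actual code):

import weakref

class DBRow(object):
    def __init__(self, table, index):
        # keep only a weak reference to the owning table, so the row
        # does not keep the table (and its whole rows list) alive
        self._table = weakref.ref(table)
        self.index = index

    @property
    def table(self):
        return self._table()   # may return None once the table is gone

class DBTable(object):
    def __init__(self):
        self.rows = []
        self.nRows = 0

    def add_row(self):
        # add row i and increment number of rows
        self.rows.append(DBRow(self, self.nRows))
        self.nRows += 1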

-- 
Gabriel Genellina
