On Sep 8, 8:06 pm, Bruno Desthuilliers <[EMAIL PROTECTED]> wrote:
> Dr Mephesto a écrit :
> > Hi!
> >
> > I would like to create a pretty big list of lists; a list 3,000,000
> > long, each entry containing 5 empty lists. My application will append
> > data to each of the 5 sublists, so they will be of varying lengths (so
> > no arrays!).
> >
> > Does anyone know the most efficient way to do this?
>
> Hem... Did you consider the fact that RAM is not an unlimited resource?
>
> Let's do some simple math (please someone correct me if I'm going off
> the road): if a Python (empty) list object required 256 bits (if I refer
> to some old post by GvR, it's probably more - 384 bytes at least. Some
> Python guru around?), you'd need (1 + (3000000 * 5)) * 256 bits just to
> build this list of lists. Which would make something around 3 Gb. Not
> counting all other needed memory...
>
> FWIW, run the following code:
>
> # eatallramthenswap.py
> d = {}
> for i in xrange(3000000):
>     d[i] = ([], [], [], [], [])
>
> And monitor what happens with top...
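[Editorial aside: the back-of-the-envelope figure above can be sanity-checked with `sys.getsizeof`, which reports the shallow size of one object in bytes (it does not follow references, and the exact number varies by Python version and platform). A minimal sketch, written for a modern Python 3:]

```python
import sys

# Shallow size, in bytes, of one empty list on this interpreter/platform.
per_list = sys.getsizeof([])

# One outer list plus 5 sublists for each of the 3,000,000 entries.
n_lists = 1 + 3_000_000 * 5

# Lower bound on memory for the empty structure: shallow sizes only,
# ignoring the outer list's pointer array and allocator overhead.
total_bytes = per_list * n_lists
print(per_list, "bytes per empty list;", total_bytes // (1024 ** 2),
      "MiB lower bound for", n_lists, "lists")
```

Whatever the per-list constant turns out to be on a given box, the point of the quoted math stands: the cost is multiplied by fifteen million objects.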
Unused RAM is wasted RAM :) I tried using MySQL, and it was too slow. And I have 4 GB anyway...
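[Editorial aside: for reference, the usual idiom for building this structure is a nested list comprehension, shown here scaled down to 1,000 entries so it is cheap to run (bump `N` to 3,000,000 to reproduce the memory test above; on Python 2 use `xrange`). The inner comprehension matters: `[[]] * 5` would make all five slots share one list object.]

```python
N = 1000  # scaled down from 3,000,000 for a cheap demo

# Each entry gets 5 fresh, independent empty lists.
data = [[[] for _ in range(5)] for _ in range(N)]

# Appending to one sublist leaves its siblings and the other entries alone.
data[0][0].append("x")
assert data[0][1] == []   # sibling sublist untouched
assert data[1][0] == []   # other entry untouched
```

Tuples of lists, as in the quoted `([], [], [], [], [])` snippet, work just as well when the 5-slot layout is fixed; the comprehension form is simply easier to parameterize.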
-- http://mail.python.org/mailman/listinfo/python-list