<[EMAIL PROTECTED]> wrote:
>so i just tried it, even though i thought it would not run to the end
>=> i was wrong: it is around 1,400,000 entries per dict...
>
>but maybe if the keys of the dicts are not duplicated in memory it can
>be done (as all the dicts will have the same keys, with different
>(count) values)?
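That should work in CPython: a dict stores references to its key
objects rather than copies, so several dicts built over one set of key
strings share those strings, and each dict only pays for its own hash
table. A minimal sketch of the sharing (the names here are made up for
illustration):

    import sys

    # One shared vocabulary: the key strings are created exactly once.
    vocabulary = ["alpha", "beta", "gamma"]

    # Each counting dict stores references to those same string objects.
    counts_a = dict.fromkeys(vocabulary, 0)
    counts_b = dict.fromkeys(vocabulary, 0)

    # Equal keys across the two dicts are the *same* object, not copies:
    for key_a, key_b in zip(sorted(counts_a), sorted(counts_b)):
        assert key_a is key_b

    # getsizeof reports only the dict's own table, not the shared
    # strings -- that table is what each extra dict costs you.
    print(sys.getsizeof(counts_a))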
I've had programs handling dicts with 1.7 million items and it isn't a
great problem, providing you're careful not to duplicate data. Creating
a copy of keys() in a separate list, for example, will inflate memory
usage noticeably (sketched below).

>memory is 4GB of RAM, [ ... ]

Unless you're on a 64-bit OS, that's irrelevant. You'll hit the 2GB
per-process limit before you start putting a strain on real memory.

-- 
\S -- [EMAIL PROTECTED] -- http://www.chaos.org.uk/~sion/
  ___  |  "Frankly I have no feelings towards penguins one way or the other"
  \X/  |    -- Arthur C. Clarke
   her nu becomeþ se bera eadward ofdun hlæddre heafdes bæce bump bump bump
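To put a rough number on that keys() copy: in the Python 2 of this
thread, d.keys() eagerly builds a list of every key (d.iterkeys(), or
plain iteration over the dict, avoids it); in current Python 3 the copy
only happens if you wrap the view in list(). A sketch at the scale
being discussed, with made-up data -- the figure is specific to 64-bit
CPython:

    import sys

    # Made-up data at roughly the scale under discussion.
    counts = dict(("key%07d" % i, 0) for i in range(1400000))

    # Wasteful: materialising the keys builds a second 1.4-million-slot
    # pointer array alongside the dict's own table.
    key_copy = list(counts)
    print(sys.getsizeof(key_copy))   # on the order of 11 MB extra

    # Frugal: iterate the dict directly; no copy of the keys is made.
    total = 0
    for key in counts:
        total += counts[key]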