I wrote a script that processes textual data, extracts phrases, and stores the phrases in a dictionary. It hits a MemoryError when the dictionary holds about 11.18M keys and is about 1.5 GB in size. I tried multiple times, and the error occurs every time at exactly the same place (with the same number of keys in the dict). I then split the dictionary into two with a simple rule:
    if phrase[0] <= 'm':
        d = dict1
    else:
        d = dict2
    # use d ...

This worked fine: the combined size of the two dictionaries well exceeded 2 GB, yet no MemoryError occurred. I have 1 GB of physical memory and 3 GB of pagefile. Is there a limit on the size or number of entries that a single dictionary can hold? Searching the web, I can't find a clue as to why this happens.

-- http://mail.python.org/mailman/listinfo/python-list
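For context, the two-dictionary split can be sketched like this. The names (`add_phrase`, `dict1`, `dict2`) are illustrative, since the surrounding code isn't shown, and it's assumed here that the dictionaries count phrase occurrences:

```python
# Route each phrase to one of two dictionaries based on its
# first character, halving the size of any single dict.
dict1 = {}  # phrases whose first character is <= 'm'
dict2 = {}  # all other phrases

def add_phrase(phrase):
    # phrase[:1] avoids an IndexError on an empty string
    d = dict1 if phrase[:1] <= 'm' else dict2
    d[phrase] = d.get(phrase, 0) + 1

for p in ("alpha", "mango", "zebra"):
    add_phrase(p)
```

After the loop, "alpha" and "mango" land in dict1 and "zebra" in dict2; lookups just apply the same first-character test to pick the right dictionary.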