amdescombes wrote:
> Hi,
>
> I am using Python 2.5.1.
> I have an application that reads a file and generates a key in a
> dictionary for each line it reads. I have managed to read a 1GB file
> and generate more than 8 million keys on a Windows XP machine with
> only 1GB of memory, and all works as expected. When I use the same
> program on a Windows 2003 Server with 2GB of RAM, I start getting
> MemoryError exceptions!
> I have tried setting IMAGE_FILE_LARGE_ADDRESS_AWARE on both
> Python.exe and Python25.dll, and setting the /3GB flag in the
> boot.ini file, to no avail. I still get the MemoryError exceptions.
>
> Has anybody encountered this problem before?
>
> Thanks in advance for any ideas/suggestions.
>
> Best Regards,
>
> André M. Descombes
How are you reading the large file? IMO, large files are better read in
chunks:

    target_file = open(f, 'rb')
    while True:
        data = target_file.read(8192000)
        if not data:
            break
        # process the 8MB chunk in 'data' here
    target_file.close()

The above reads 8MB at a time until the file has been completely read.
Change the 8MB to whatever you like.
--
http://mail.python.org/mailman/listinfo/python-list
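
Since the original poster is generating one dictionary key per line, a
line-by-line iteration may fit even better than fixed-size chunks: Python
file objects iterate lazily, holding only one line in memory at a time
(plus the keys themselves). Below is a minimal sketch of that approach;
`make_key` is a hypothetical placeholder for however the application
actually derives a key from each line (here, the first
whitespace-separated field):

    # Sketch: stream the file line by line and collect keys.
    # 'make_key' is a hypothetical stand-in for the OP's key logic.

    def make_key(line):
        # Assumption: the key is the first whitespace-separated field.
        parts = line.split(None, 1)
        return parts[0] if parts else None

    def collect_keys(path):
        keys = {}
        with open(path, 'rb') as f:
            for line in f:          # lazy iteration, one line at a time
                key = make_key(line)
                if key is not None:
                    keys[key] = None
    return keys

(The `with` statement shown needs `from __future__ import with_statement`
on Python 2.5; a plain `try`/`finally` around `f.close()` works too.)
Note that if the dictionary itself has millions of keys, the keys, not
the read strategy, may be what exhausts the process address space.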