Hi, I will try your code... Meanwhile, I have to say that, as you pointed out earlier and as stated in the documentation, numpy is designed to handle large arrays, and that is the reason I chose it. If there is a better option, please let me know.
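For reference, here is a minimal sketch of how I could measure what the
cell strings themselves cost in memory, independent of numpy (the fake
rows at the bottom are only a stand-in for my real sheet data):

import sys

def string_cost(rows):
    """Total CPython size, in bytes, of all cell strings in rows
    (a list of lists of str), counting each distinct object once."""
    seen = set()
    total = 0
    for row in rows:
        for cell in row:
            if id(cell) not in seen:
                seen.add(id(cell))
                total += sys.getsizeof(cell)
    return total

# Tiny self-contained demo with fake data standing in for the real sheet.
rows = [["cell %d,%d" % (r, c) for c in range(50)] for r in range(1000)]
total = string_cost(rows)
print("total bytes:   ", total)
print("bytes per cell:", total / (1000 * 50))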
Regards,
Mahmood

--------------------------------------------
On Wed, 5/10/17, Peter Otten <__pete...@web.de> wrote:

Subject: Re: Out of memory while reading excel file
To: python-list@python.org
Date: Wednesday, May 10, 2017, 6:30 PM

Mahmood Naderan via Python-list wrote:

> Well actually cells are treated as strings and not integer or float
> numbers.

May I ask why you are using numpy when you are dealing with strings?

If you provide a few details about what you are trying to achieve,
someone may be able to suggest a workable approach.

Back-of-the-envelope considerations: 4 GB / 5E6 cells amounts to

>>> 2**32 / (100000 * 50)
858.9934592

about 850 bytes per cell, with an overhead of

>>> sys.getsizeof("")
49

that would be 800 ascii chars, down to 200 chars in the worst case.
If your strings are much smaller, the problem lies elsewhere.
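As a rough check, here is a compact, runnable version of the estimate
above (the 4 GB budget and the 100000 x 50 cell count are the figures
from this thread; the empty-string overhead is CPython-specific):

import sys

budget = 2**32                # ~4 GB memory budget
cells = 100000 * 50           # rows x columns from the thread

per_cell = budget / cells     # bytes available per cell (~859)
overhead = sys.getsizeof("")  # fixed CPython str overhead (49 on 3.x)
payload = per_cell - overhead # bytes left for the characters themselves

print("bytes per cell:    %.1f" % per_cell)
print("str overhead:      %d" % overhead)
print("ascii chars/cell:  %.0f" % payload)        # ~800, 1 byte per char
print("worst-case chars:  %.0f" % (payload / 4))  # ~200, 4 bytes per char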