On Tue, 22 Nov 2016 10:27 am, Fillmore wrote:
>
> Hi there, Python newbie here.
>
> I am working with large files. For this reason I figured that I would
> capture the large input into a list and serialize it with pickle for
> later (faster) usage.
> Everything has worked beautifully until today
On Tue, 22 Nov 2016 11:40 am, Peter Otten wrote:
> Fillmore wrote:
>
>> Hi there, Python newbie here.
>>
>> I am working with large files. For this reason I figured that I would
>> capture the large input into a list and serialize it with pickle for
>> later (faster) usage.
>
> But is it really
On Mon, Nov 21, 2016 at 3:43 PM, John Gordon wrote:
> In Fillmore
> writes:
>
>
>> Question for experts: is there a way to refactor this so that data may
>> be filled/written/released as the scripts go and avoid the problem?
>> code below.
>
> That depends on how the data will be read. Here is
Fillmore wrote:
> Hi there, Python newbie here.
>
> I am working with large files. For this reason I figured that I would
> capture the large input into a list and serialize it with pickle for
> later (faster) usage.
But is it really faster? If the pickle is, let's say, twice as large as the
In Fillmore
writes:
> Question for experts: is there a way to refactor this so that data may
> be filled/written/released as the scripts go and avoid the problem?
> code below.
That depends on how the data will be read. Here is one way to do it:
fileObject = open(filename, "w")
for
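One way John's incremental approach can be completed — a sketch only; the filename, record format, and helper name are illustrative, not from the original post:

```python
# Write each record as it is produced instead of accumulating a
# large list first; only the current record stays in memory.
# All names here are illustrative placeholders.
def write_incrementally(filename, records):
    with open(filename, "w") as file_object:
        for record in records:
            file_object.write(record + "\n")

write_incrementally("data.txt", ("row-%d" % i for i in range(3)))
```

The reading side can then consume the file line by line (`for line in open(...)`), so neither end ever holds the full data set.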
Mok-Kong Shen wrote:
(It means that I have
to pickle out the list to a file and read in the content of
the file in order to have it as a bytearray etc. etc.)
No, you don't -- pickle.dumps() returns the pickled
data as a bytes object instead of writing it to a file.
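The in-memory round trip Greg describes, as a minimal sketch (the sample data is made up):

```python
import pickle

# pickle.dumps() serializes to a bytes object in memory; no file needed.
data = {"tree": [1, [2, [3, None]]], "weights": (0.5, 0.25)}
blob = pickle.dumps(data)
restored = pickle.loads(blob)  # reconstruct the original structure

assert isinstance(blob, bytes)
assert restored == data
```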
--
Greg
--
https://mail.pyth
Gregory Ewing wrote:
> Mok-Kong Shen wrote:
>> I have yet a question out of curiosity: Why is my 2nd list structure,
>> that apparently is too complex for handling by eval and json, seemingly
>> not a problem for pickle?
>
> Pickle is intended for arbitrary data structures, so it
> is designed to
On 15.04.2014 01:51, Gregory Ewing wrote:
Mok-Kong Shen wrote:
I have yet a question out of curiosity: Why is my 2nd list structure,
that apparently is too complex for handling by eval and json, seemingly
not a problem for pickle?
Pickle is intended for arbitrary data structures, so it
is de
Mok-Kong Shen wrote:
I have yet a question out of curiosity: Why is my 2nd list structure,
that apparently is too complex for handling by eval and json, seemingly
not a problem for pickle?
Pickle is intended for arbitrary data structures, so it
is designed to be able to handle deeply-nested and
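A small illustration of that difference (the example data is invented here): dict keys that are tuples round-trip through pickle but are rejected by json.

```python
import json
import pickle

# A structure json cannot encode directly: tuple keys in a dict.
table = {(0, 1): "left", (1, 0): "right"}

try:
    json.dumps(table)
    json_ok = True
except TypeError:
    json_ok = False  # json rejects tuple dict keys

restored = pickle.loads(pickle.dumps(table))
assert restored == table  # pickle round-trips it unchanged
```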
On 14.04.2014 15:59, Peter Otten wrote:
You could use json, but you may run into the same problem with that, too
(only later):
>>> import json
>>> items = []
>>> for i in range(1000):
...     s = json.dumps(items)
...     items = [items]
...
Traceback (most recent call last):
  File "<stdin>", line 2, in <module>
Mok-Kong Shen wrote:
> On 14.04.2014 09:46, Peter Otten wrote:
>
>> You ran into a limitation of the compiler. For us to suggest a workaround
>> you'd have to explain why you want to convert the list returned from
>> buildhuffmantree() into python source code and back.
>
> That list gives the
On 14.04.2014 09:46, Peter Otten wrote:
You ran into a limitation of the compiler. For us to suggest a workaround
you'd have to explain why you want to convert the list returned from
buildhuffmantree() into python source code and back.
That list gives the Huffman encoding tree for compressin
Mok-Kong Shen wrote:
> The code attached below produces in one of the two IMHO similar cases
> (excepting the sizes of the lists involved) MemoryError. Could experts
> kindly tell why that's so and whether there is any work-around feasible.
Here's a simpler way to reproduce the error:
>>> import
Mok-Kong Shen writes:
> The code attached below produces in one of the two IMHO similar cases
> (excepting the sizes of the lists involved) MemoryError. Could experts
> kindly tell why that's so and whether there is any work-around feasible.
"MemoryError" means: the Python process wants more mem
Is the whole program pure python? Sometimes a reference to undefined memory
or a lack of error checking elsewhere in a program, can cause innocuous code
to fail later: http://stromberg.dnsalias.org/~dstromberg/checking-early.html
If the program uses ctypes, or an unusual Python/C API module, I'd
On Feb 14, 2010, at 10:16 PM, Dave Angel wrote:
> There are three different limits at play here. Since you're still not saying
> how you're "measuring" usage, we've all been guessing just which one you're
> hitting. There's physical RAM, virtual address space, and swappable space
> (swapfile
From: Chris Kaynor [ckay...@zindagigames.com]
Sent: Friday, February 12, 2010 7:44 PM
To: Echavarria Gregory, Maria Angelica
Cc: python-list@python.org
Subject: Re: MemoryError, can I use more?
On Feb 14, 2010, at 7:20 PM, Echavarria Gregory, Maria Angelica wrote:
>
> Dear Chris,
>
> One of the machines I tested my app in is 64 bit and happened the same. The
> RAM consumed by the OS and other processes is already included in the 2.2 I'm
> telling... my app enters to work when the RA
On 14.02.10 12:28, Laszlo Nagy wrote:
On 2010.02.13. 17:40, Diez B. Roggisch wrote:
On 13.02.10 17:18, Anssi Saari wrote:
Nobody writes:
A single process can't use much more than 2GiB of RAM without a
64-bit CPU
and OS.
That's not really true. Even Windows XP has the /3GB boot o
"Diez B. Roggisch" writes:
> No, PAE can be used to access much more memory than 4GB - albeit
> through paging. AFAIK up to 2^36 Bytes.
Anssi Saari wrote:
>That too. I admit, after checking, that you can't go above 3 GiB per
>process even in server Windows. But for Linux there exists (or
>exist
On 2010.02.13. 17:40, Diez B. Roggisch wrote:
On 13.02.10 17:18, Anssi Saari wrote:
Nobody writes:
A single process can't use much more than 2GiB of RAM without a
64-bit CPU
and OS.
That's not really true. Even Windows XP has the /3GB boot option to
allow 3 GiB per process. On PC
On 13.02.10 17:18, Anssi Saari wrote:
Nobody writes:
A single process can't use much more than 2GiB of RAM without a 64-bit CPU
and OS.
That's not really true. Even Windows XP has the /3GB boot option to
allow 3 GiB per process. On PCs, free operating systems and server
Windows can use PAE
Nobody writes:
> A single process can't use much more than 2GiB of RAM without a 64-bit CPU
> and OS.
That's not really true. Even Windows XP has the /3GB boot option to
allow 3 GiB per process. On PCs, free operating systems and server
Windows can use PAE to give access to full 4 GB per process
"Diez B. Roggisch" writes:
> On 13.02.10 17:18, Anssi Saari wrote:
>> Nobody writes:
>>
>>> A single process can't use much more than 2GiB of RAM without a 64-bit CPU
>>> and OS.
>>
>> That's not really true. Even Windows XP has the /3GB boot option to
>> allow 3 GiB per process. On PCs, free
On Fri, 12 Feb 2010 19:21:22 -0500, Echavarria Gregory, Maria Angelica
wrote:
> I am developing a program using Python 2.5.4 in windows 32 OS. The amount
> of data it works with is huge. I have managed to keep memory footprint
> low, but have found that, independent of the physical RAM of the mach
Echavarria Gregory, Maria Angelica wrote:
> Dear group:
>
> I am developing a program using Python 2.5.4 in windows 32 OS. The amount of
> data it works with is huge. I have managed to keep memory footprint low, but
> have found that, independent of the physical RAM of the machine, python
> alw
On Fri, Feb 12, 2010 at 7:21 PM, Echavarria Gregory, Maria Angelica
wrote:
> Dear group:
>
> I am developing a program using Python 2.5.4 in windows 32 OS. The amount of
> data it works with is huge. I have managed to keep memory footprint low, but
> have found that, independent of the physical
Echavarria Gregory, Maria Angelica wrote:
Dear group:
I am developing a program using Python 2.5.4 in windows 32 OS. The amount of
data it works with is huge. I have managed to keep memory footprint low, but
have found that, independent of the physical RAM of the machine, python always
gives
On Feb 12, 2010, at 7:21 PM, Echavarria Gregory, Maria Angelica wrote:
> Dear group:
>
> I am developing a program using Python 2.5.4 in windows 32 OS. The amount of
> data it works with is huge. I have managed to keep memory footprint low, but
> have found that, independent of the physical RA
A 32 bit app can only use 4 GB of memory itself (regardless of the amount of
system ram), the OS claims some of this for the system, dlls occupy some of
it, etc. As such, the app can only really use a smaller subset (generally
between 2 to 3 GB, depending upon the app and the OS).
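A quick way to check which kind of interpreter you are running (not from the thread, just a common diagnostic):

```python
import struct
import sys

# Pointer width reveals whether this Python build is 32- or 64-bit;
# sys.maxsize gives a consistent second opinion.
bits = struct.calcsize("P") * 8
is_64bit = sys.maxsize > 2**32
print(bits, is_64bit)
```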
Chris
On Fri,
Luis P. Mendes wrote:
> Sun, 21 Jun 2009 13:04:59 +, Lie Ryan wrote:
>> Have you tried running without psyco? Psyco increases memory usage quite
>> significantly.
>>
>> If it runs well without psyco, you can try looking at your code and
>> selectively psyco parts that need the speed boost th
Sun, 21 Jun 2009 13:04:59 +, Lie Ryan wrote:
> Luis P. Mendes wrote:
>> Hi,
>>
>> I have a program that uses a lot of resources: memory and cpu but it
>> never returned this error before with other loads:
>>
>> """
>> MemoryError
>> c/vcompiler.h:745: Fatal Python error: psyco cannot reco
Luis P. Mendes wrote:
> Hi,
>
> I have a program that uses a lot of resources: memory and cpu but it
> never returned this error before with other loads:
>
> """
> MemoryError
> c/vcompiler.h:745: Fatal Python error: psyco cannot recover from the
> error above
> Aborted
> """
> The last time I
On Wed, 31 Dec 2008 06:34:48 -0200, Steven D'Aprano
wrote:
Each time you are appending to the list, you append a tuple:
((i, j), sim)
where sim is a float and i and j are ints. How much memory does each of
those take?
>>> sys.getsizeof(((0, 1), 1.1))
32
(On Windows, 32 bits, I get 36)
[BON]:
> above sim is floating type.
> s.append is called a total of 60,494,500 times,
> but this code raises MemoryError.
>
> My computer has 4G RAM.
> I think it's enough, but it isn't...
Try creating it in a cleaner way, here as an array of doubles:
>>> from array import array
>>> a = array("d
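That array suggestion, finished as a sketch (the values are invented): an `array("d")` stores raw 8-byte doubles instead of one boxed float object per element.

```python
from array import array

# Unboxed C doubles: 8 bytes per element, no per-value Python object.
values = array("d", (0.1 * i for i in range(1000)))

print(len(values), values.itemsize)  # 1000 elements, 8 bytes each
```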
On Tue, 30 Dec 2008 22:02:49 -0800, [BON] wrote:
> ==
> s = []
> for i in range(11000-1):
>     for j in range(i+1, 11000):
>         s.append(((i, j), sim))
> ==
> above sim is floating type.
> s.append is called a total of 60,494,500 times, but t
[BON] wrote:
==
s = []
for i in range(11000-1):
    for j in range(i+1, 11000):
        s.append(((i, j), sim))
==
above sim is floating type.
s.append is called a total of 60,494,500 times,
but this code raises MemoryError.
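One hedged refactoring of that loop: yield the pairs lazily instead of appending 60 million tuples, shown here with a much smaller n.

```python
def pairs(n, sim):
    # Generate ((i, j), sim) one at a time; the consumer decides
    # what, if anything, is kept in memory. n is shrunk for illustration.
    for i in range(n - 1):
        for j in range(i + 1, n):
            yield ((i, j), sim)

total = sum(1 for _ in pairs(100, 0.5))
print(total)  # 100 * 99 / 2 == 4950 pairs
```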
My computer has
On Wed, Dec 31, 2008 at 4:17 PM, James Mills
wrote:
> I have no idea how many bytes of memory
> storing each element of a list consumes
> let alone each float object, but I assure you
> it's not going to be anywhere near that of
> 60494500 4-bytes spaces (do floats in C
> normally consume 4 bytes)
(Sorry for top posting):
You are mad! Why on God's earth would you want
to create a list containing 60 MILLION elements ?
What is the use case ? What are you solving ?
You may have 4G of ram, but I very seriously
doubt you have 4G of ram available to Python.
I have no idea how many bytes of mem
On 12 Sep., 16:39, Istvan Albert <[EMAIL PROTECTED]> wrote:
> This line reads an entire message into memory as a string. Is it
> possible that you have a huge email in there (hundreds of MB) with
> some attachment encoded as text?
No, the largest single message with the mbox is about 100KB large.
On Wed, 12 Sep 2007 11:39:46 -0300, Istvan Albert
<[EMAIL PROTECTED]> wrote:
> On Sep 12, 5:27 am, Christoph Krammer <[EMAIL PROTECTED]>
> wrote:
>
>> string = self._file.read(stop - self._file.tell())
>> MemoryError
>
> This line reads an entire message into memory as a string. Is it
> p
Christoph Krammer <[EMAIL PROTECTED]> writes:
> I have to convert a huge mbox file (~1.5G) to MySQL.
Have you tried commenting out the MySQL portion of the code? Does the
code then manage to finish processing the mailbox?
On Sep 12, 5:27 am, Christoph Krammer <[EMAIL PROTECTED]>
wrote:
> string = self._file.read(stop - self._file.tell())
> MemoryError
This line reads an entire message into memory as a string. Is it
possible that you have a huge email in there (hundreds of MB) with
some attachment encoded as te
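For iterating a large mbox without holding it all at once, the stdlib `mailbox` module walks messages one by one — a sketch using a tiny throwaway mbox (path and contents invented):

```python
import mailbox
import os
import tempfile

# Build a minimal throwaway mbox, then iterate it message by message;
# a real 1.5G file would be opened the same way.
path = os.path.join(tempfile.mkdtemp(), "demo.mbox")
box = mailbox.mbox(path)
box.add(mailbox.mboxMessage("From: a@example.com\nSubject: hi\n\nbody\n"))
box.flush()

subjects = [message["subject"] for message in mailbox.mbox(path)]
print(subjects)
```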
On 12 Sep., 12:20, David <[EMAIL PROTECTED]> wrote:
> It may be that Python's garbage collection isn't keeping up with your app.
>
> You could try periodically forcing it to run. eg:
>
> import gc
> gc.collect()
I tried this, but the problem is not solved. When invoking the garbage
collection afte
>
> My system has 512M RAM and 768M swap, which seems to run out at an
> early stage of this. Is there a way to clean up memory for messages
> already processed?
It may be that Python's garbage collection isn't keeping up with your app.
You could try periodically forcing it to run. eg:
import gc
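David's suggestion spelled out as a sketch — force a collection every N iterations; the interval and loop body are placeholders:

```python
import gc

COLLECT_EVERY = 1000  # arbitrary interval, tune for the workload

for index in range(5000):  # stand-in for the per-message loop
    # ... process one message here ...
    if index % COLLECT_EVERY == 0:
        collected = gc.collect()  # returns the number of objects freed
```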
Bugra Cakir wrote:
> Hi,
>
> Within a Python program how can we avoid getting "MemoryError" problem ?
>
Well that depends why you are getting it in the first place.
If you can post the traceback your program prints out then you might get
some specific advice for the current case. The usual advi
Bugra Cakir wrote:
> Within a Python program how can we avoid getting "MemoryError" problem ?
since you still haven't told us what your program is doing, and where in
the program you're getting the error, it's a bit hard to tell.
Bugra Cakir wrote:
> I have found a "MemoryError" exception in my program. How can i output
> Python interpreter log or how can i find the root cause of this
> "MemoryError" exception ?
this means that you've run out of memory; the ordinary traceback
print-out should tell you where.
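To capture where the error occurred, the usual pattern is to let the traceback module format it — sketched here with a simulated MemoryError, since genuinely exhausting memory is impractical in an example:

```python
import traceback

def allocate():
    # Simulated; a real MemoryError would come from the allocator.
    raise MemoryError("simulated allocation failure")

try:
    allocate()
except MemoryError:
    trace = traceback.format_exc()  # full traceback as a string

print("allocate" in trace)
```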
http
Stephen G wrote:
> I am hesitant to make any changes to the python libraries as
> I need to distribute these scripts with a standard Python install.
well, you could at least check if the suggestions in that thread makes
the problem go away...
(if so, shipping a patched version with your progra
Fredrik,
Thanks for the response. I did see that, but having been dated 2005 I thought
that it might have been patched. I am also sometimes getting the same problem
with the urllib.py module. This may have to do with the interaction
between Python and the mobile
optimization client that I
"Stephen G" <[EMAIL PROTECTED]> wrote:
> I get many exceptions when using the IMAP downloads. This happens
> randomly; sometimes the file downloads OK, and other times no.
> File "C:\Python25\lib\socket.py", line 308, in read
> data = self._sock.recv(recv_size)
>
> Is this a know bug or is t
Bernhard Reimar Hoefle wrote:
> I have the following python script:
> ###
> from numarray import *
>
> while 1:
>     a = arange(1,3)
>     b = a*100/100
>     del a
>     del b
> ###
>
> This script crashes after
On Tue, 18 Jan 2005 16:16:32 +0100, Thomas Heller wrote:
> Sylvain Thenault <[EMAIL PROTECTED]> writes:
>
>> Hi there !
>> I've noticed the following problem with python >= 2.3 (actually 2.3.4
>> and 2.4):
>>
>> [EMAIL PROTECTED]:test$ python
>> Python 2.3.4 (#2, Sep 24 2004, 08:39:09) [GCC 3.3.4
Sylvain Thenault <[EMAIL PROTECTED]> writes:
> Hi there !
> I've noticed the following problem with python >= 2.3 (actually 2.3.4 and
> 2.4):
>
> [EMAIL PROTECTED]:test$ python
> Python 2.3.4 (#2, Sep 24 2004, 08:39:09)
> [GCC 3.3.4 (Debian 1:3.3.4-12)] on linux2
> Type "help", "copyright", "credi
54 matches