Dear Dmitry, Bryan and Philip,
Thanks for the suggestions. I poked around the dictionary descriptions
and fiddled some more but couldn't find any obvious error. I agree it
does seem odd that a 50 kb dict should fail. Eventually, I tried
Dmitry's suggestion of moving over to Python 2.6. This took a w
On Thu, Jun 3, 2010 at 3:43 PM, Emin.shopper Martinian.shopper <emin.shop...@gmail.com> wrote:
> Dear Experts,
>
> I am getting a MemoryError when creating a dict in a long running
> process and suspect this is due to memory fragmentation. Any
> suggestions would be welcome. Full details of the problem are below.
Philip Semanchuk wrote:
> At PyCon 2010, Brandon Craig Rhodes gave a talk on how dictionaries
> work under the hood:
> http://python.mirocommunity.org/video/1591/pycon-2010-the-mighty-dict...
>
> I found that very informative.
That's a fine presentation of hash tables in general and Python's
ch
On Jun 4, 2010, at 12:06 PM, Bryan wrote:
Emin.shopper wrote:
> dmtr wrote:
> > I'm still unconvinced that it is a memory fragmentation problem. It's
> > very rare.
>
> You could be right. I'm not an expert on Python memory management. But
> if it isn't memory fragmentation, then why is it that I can
Emin.shopper wrote:
> dmtr wrote:
> > I'm still unconvinced that it is a memory fragmentation problem. It's
> > very rare.
>
> You could be right. I'm not an expert on Python memory management. But
> if it isn't memory fragmentation, then why is it that I can create
> lists which use up 600 more MB
On Thu, Jun 3, 2010 at 10:00 PM, dmtr wrote:
> I'm still unconvinced that it is a memory fragmentation problem. It's
> very rare.
You could be right. I'm not an expert on Python memory management. But
if it isn't memory fragmentation, then why is it that I can create
lists which use up 600 more MB
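The asymmetry Emin observes is at least consistent with the fragmentation theory: a dict needs one contiguous hash table for all its entries, while the same pairs spread across many small lists can be satisfied from scattered free blocks. A rough way to compare the container footprints (my sketch, not from the thread; the sizes are arbitrary and the byte counts are CPython-version-specific):

```python
import sys

# Same 100000 key/value pairs stored two ways (illustrative sizes).
n = 100000
as_dict = {i: i for i in range(n)}       # one large contiguous table
as_lists = [[i, i] for i in range(n)]    # many small allocations

dict_bytes = sys.getsizeof(as_dict)
list_bytes = sys.getsizeof(as_lists) + sum(sys.getsizeof(p) for p in as_lists)
print(dict_bytes, list_bytes)
```

Note that `sys.getsizeof` counts only the containers themselves, not the referenced int objects, so this compares allocation shapes rather than total memory.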
I'm still unconvinced that it is a memory fragmentation problem. It's
very rare.
Can you give a more concrete example that one can actually try to
execute? Like:
python -c "list([list([0]*xxx)+list([1]*xxx)+list([2]*xxx)+list([3]*xxx) for xxx in range(10)])" &
-- Dmitry
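As posted, dmtr's one-liner only builds tiny lists (`xxx` ranges over 0..9). A version with an explicit size knob is easier to use as a reproducer; this is my adaptation, and the sizes are arbitrary:

```python
# Adaptation of dmtr's one-liner into a function with a size knob.
# Crank n up to actually stress the allocator.
def stress(n):
    return [[0] * n + [1] * n + [2] * n + [3] * n for _ in range(10)]

blocks = stress(1000)
print(len(blocks), len(blocks[0]))  # 10 blocks of 4*n elements each
```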
On Thu, Jun 3, 2010 at 7:41 PM, dmtr wrote:
> On Jun 3, 3:43 pm, "Emin.shopper Martinian.shopper"
> wrote:
>> Dear Experts,
>>
>
> Are you sure you have enough memory available?
> Dict memory usage can jump x2 during re-balancing.
>
I'm pretty sure. When I did
p setattr(self,'q',dict([(xxx,xxx
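dmtr's point about usage jumping x2 during re-balancing can be observed directly with `sys.getsizeof` on a modern CPython (a sketch I'm adding, not part of the original exchange): the table grows in discrete jumps, and while a rehash runs, the old and new tables are briefly live at the same time.

```python
import sys

# Watch a dict's allocation grow in jumps as keys are inserted.
d = {}
last = sys.getsizeof(d)
jumps = []
for i in range(10000):
    d[i] = i
    size = sys.getsizeof(d)
    if size != last:              # a resize just happened
        jumps.append((i + 1, size))
        last = size

for count, size in jumps:
    print(count, "keys ->", size, "bytes")
```

The exact thresholds and byte counts vary by CPython version, but the stepwise growth pattern is what matters here.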
> I have a long-running process which eventually dies with a
> MemoryError exception. When it dies, it is using roughly 900 MB on a 4
> GB Windows XP machine running Python 2.5.4. If I do "import pdb;
BTW, have you tried the same code with Python 2.6.5?
-- Dmitry
On Jun 3, 3:43 pm, "Emin.shopper Martinian.shopper" wrote:
> Dear Experts,
>
> I am getting a MemoryError when creating a dict in a long running
> process and suspect this is due to memory fragmentation. Any
> suggestions would be welcome. Full details of the problem are below.
>
> I have a long-running process which eventually dies with a
Dear Experts,
I am getting a MemoryError when creating a dict in a long running
process and suspect this is due to memory fragmentation. Any
suggestions would be welcome. Full details of the problem are below.
I have a long-running process which eventually dies with a
MemoryError exception. When it dies, it is using roughly 900 MB on a 4
GB Windows XP machine running Python 2.5.4.
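A minimal harness for the situation described above (the function name and sizes are mine, not from the post): grow the dict under try/except and drop into the debugger at the failure point, which is what the "import pdb" step does interactively.

```python
import pdb

def build_big_dict(n):
    """Illustrative harness, not the original code: grow a dict and
    stop in the debugger if allocation fails."""
    table = {}
    try:
        for i in range(n):
            table[i] = i * 2
    except MemoryError:
        pdb.set_trace()  # inspect len(table) etc. at the failure point
    return table

table = build_big_dict(1000)
print(len(table))
```

With a small `n` this just returns normally; the debugger branch only fires when the allocation actually fails.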