Andreas Eisele <[EMAIL PROTECTED]> added the comment:
Even if they mean that creation
of a huge number N of objects
requires O(N*N) effort?
Andreas Eisele <[EMAIL PROTECTED]> added the comment:
Great, that really solves my problem.
Thank you so much, Amaury!
As you say, the problem is unrelated to dicts,
and I observe it also when adding the tuples to
a set or keeping them in lists.
Perhaps your GC thresholds would be
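Amaury's suggested fix is not quoted in this excerpt; the sketch below shows the kind of GC-related workaround the thread appears to be discussing (switching off, or re-tuning, the cyclic garbage collector while the container is being filled). The pairs argument and the threshold value are placeholders, not values from the thread.

    import gc

    def build_set(pairs):
        # pairs: any iterable of (string, string) tuples -- placeholder name.
        gc.disable()              # or raise the thresholds: gc.set_threshold(100000)
        try:
            return set(pairs)     # the same slowdown shows up for sets and lists
        finally:
            gc.enable()           # restore normal collection afterwards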
Andreas Eisele <[EMAIL PROTECTED]> added the comment:
Sorry for not giving a good example in the first place.
The problem seems to appear only in the presence of
sufficiently many distinct tuples. Then I see performance
that looks rather like O(n*n).
Here is an example that shows the p
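The example itself is cut off above; a hedged reconstruction of the kind of demonstration described, timing the creation of growing numbers of distinct tuples (the sizes are illustrative), could look like this:

    import time

    # If the total time roughly quadruples whenever n doubles, the behaviour
    # is closer to O(n*n) than to O(n); keeping every tuple alive is essential.
    for n in (250000, 500000, 1000000, 2000000):
        t0 = time.time()
        data = [(str(i), str(i)) for i in range(n)]   # n distinct tuples, all kept alive
        print("n = %8d: %.2f s" % (n, time.time() - t0))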
New submission from Andreas Eisele <[EMAIL PROTECTED]>:
I need to count pairs of strings, and I use
a defaultdict in a construct like
count[a,b] += 1
I am able to count 50K items per second on a very fast machine,
which is way too slow for my application.
If I count complete string
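The message is truncated above; written out, the construct being described is roughly the following minimal sketch. The pairs() generator and its sizes are illustrative placeholders, not the reporter's data.

    import time
    from collections import defaultdict

    def pairs():
        # Placeholder data source: yields (string, string) pairs.
        for i in range(10 ** 6):
            yield str(i % 1000), str(i)

    count = defaultdict(int)
    t0 = time.time()
    for a, b in pairs():
        count[a, b] += 1              # the construct quoted in the report
    elapsed = time.time() - t0
    print("%.0f items per second" % (10 ** 6 / elapsed))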
Andreas Eisele added the comment:
> Then 7G is "enough" for the test to run.
Yes, indeed, thanks for pointing this out.
It runs and detects an ERROR, and after applying your patch it succeeds.
What else needs to be done to make sure your patch finds its way
Andreas Eisele added the comment:
> How do you run the test? Do you specify a maximum available size?
I naively assumed that running "make test" from the toplevel would be
clever about finding plausible parameters. However, it runs the bigmem
tests in a minimalistic way, skipping es
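The rest of the message is cut off; for context, here is a sketch of how the bigmem machinery is normally given a memory budget. The command line and the 8Gb limit are assumptions based on the 2.6-era regrtest and test_support, not quotes from the thread.

    # Without a memory limit, "make test" runs @bigmemtest cases at a token
    # size only.  A limit can be passed to regrtest, e.g. (assumed spelling):
    #
    #     ./python Lib/test/regrtest.py -M 8Gb test_codecs
    #
    # or set programmatically before running the tests:
    from test import test_support

    test_support.set_memlimit('8Gb')   # placeholder; use the machine's real RAM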
Andreas Eisele added the comment:
I tried
@bigmemtest(minsize=_2G*2+2, memuse=3)
but no change; the test is run only once, with a small
size (5147). Apparently something does not work as
expected here. I'm trying this with 2.6 (Revision 59231).
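For comparison, this is roughly how such a test case is declared with the 2.6-era test_support helpers; the class name, method name and assertion below are invented for illustration and are not the actual test from the patch.

    import unittest
    from test import test_support
    from test.test_support import bigmemtest

    _2G = 2 * 1024 * 1024 * 1024      # same meaning as the _2G constant above

    class Utf8DecodeBigmemTest(unittest.TestCase):   # hypothetical test class

        # minsize: smallest input that exercises the >2G case;
        # memuse: rough number of bytes needed per unit of size.
        @bigmemtest(minsize=_2G * 2 + 2, memuse=3)
        def test_decode_over_2G(self, size):
            s = ' ' * size
            self.assertEqual(len(s.decode('utf-8')), size)

    if __name__ == '__main__':
        test_support.run_unittest(Utf8DecodeBigmemTest)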
Andreas Eisele added the comment:
Thanks a lot for the patch, which indeed seems to solve the issue.
Alas, the extended test code still does not catch the problem, at
least in my installation. Someone with a better understanding of
how these tests work and with access to a 64-bit machine should
Andreas Eisele added the comment:
An instance of the other problem:
Python 2.5.1 (r251:54863, Aug 30 2007, 16:15:51)
[GCC 4.1.0 (SUSE Linux)] on linux2
Type "help", "copyright", "credits" or "license" for more information.
>>> s="
Andreas Eisele added the comment:
For instance:
Python 2.5.1 (r251:54863, Aug 30 2007, 16:15:51)
[GCC 4.1.0 (SUSE Linux)] on linux2
Type "help", "copyright", "credits" or "license" for more information.
>>> s=" "*int(5E9)
6
New submission from Andreas Eisele:
s.decode("utf-8")
sometimes silently truncates the result if s has more than 2E9 bytes,
and sometimes raises a fairly incomprehensible exception:
Traceback (most recent call last):
  File "<stdin>", line 2, in <module>
File "/usr/lib64/python2.5/enc