[DJTB]
> I'm trying to manually parse a dataset stored in a file. The data should be
> converted into Python objects.
>
> Here is an example of a single line of a (small) dataset:
>
> 3 13 17 19 -626177023 -1688330994 -834622062 -409108332 297174549 955187488
> 589884464 -1547848504 857311165 585
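Not the original poster's code, just a minimal sketch of turning one such line into Python ints (split on whitespace, convert each token):

line = "3 13 17 19 -626177023 -1688330994 -834622062 -409108332 297174549 955187488 589884464 -1547848504 857311165 585"
values = [int(tok) for tok in line.split()]
assert values[0] == 3 and values[-1] == 585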
[Jeremy Hylton]
> ...
> The ObjectInterning instance is another source of problem, because it's
> a dictionary that has an entry for every object you touch.
Some vital context was missing in this post. Originally, on c.l.py, DJTB
wasn't using ZODB at all. In effect, he had about 5000 lists each
[Jeremy Hylton]
> ...
> It looks like your application has a single persistent instance -- the
> root ExtendedTupleTable -- so there's no way for ZODB to manage the
> memory. That object and everything reachable from it must be in memory
> at all times.
Indeed, I tried running this program under
[Greg Ewing]
> Can someone give me a hint for No. 10? My MindBlaster
> card must be acting up -- I can't seem to tune into
> the author's brain waves on this one.
There are hints on the site; for level 10,
http://www.pythonchallenge.com/forums/viewtopic.php?t=20
> I came up with what I thoug
[Rune Strand]
> I'm experiencing strange errors both with pickle and cPickle in the
> below code:
>
>
> import cPickle as pickle
> #import pickle
> from string import ascii_uppercase
> from string import ascii_lowercase
>
> def createData():
>     d1 = list("Something's rotten")
>     d2 = tuple('in
[Tim Peters]
>> What is "XWwz"? Assuming it's a bizarre typo for "open", change the
>> 'w' there to 'wb'. Pickles are binary data, and files holding pickles
>> must be opened in binary mode, especially since:
>>
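A minimal sketch of the advice (the file name is made up): open pickle files in binary mode on both ends.

import cPickle as pickle

data = {"spam": [1, 2, 3]}

f = open("data.pkl", "wb")       # 'wb', not 'w': pickles are binary data
pickle.dump(data, f, 2)          # protocol 2 is a binary protocol
f.close()

f = open("data.pkl", "rb")       # and 'rb' when reading the pickle back
restored = pickle.load(f)
f.close()
assert restored == data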
[Gary Robinson]
> I know the Global Interpreter Lock ensures that only one python thread
> has access to the interpreter at a time, which prevents a lot of
> situations where one thread might step on another's toes.
Not really. The CPython implementation's C code relies on the GIL in
many ways t
[Gary Robinson]
> In the application we're writing (http://www.goombah.com) it would be
> helpful for us to give one thread a higher priority than the others. We
> tried the recipe here:
> http://groups-beta.google.com/group/comp.lang.python/msg/6f0e118227a5f5de
> and it didn't seem to work for us.
[attribution lost]
...
>>> Yup: the workaround seems to be as simple as replacing all occurrences
>>> of -0.0 with -(0.0). I'm embarrassed that I didn't figure this out
>>> sooner.
>>>
>>> >>> x, y = -(0.0), 0.0
>>> >>> x, y
>>> (-0.0, 0.0)
[Alex Martelli]
>> Glad it works for you, but it's the
[Tim Peters]
...
>> Huh. I don't read it that way. If it said "numbers can be ..." I
>> might, but reading that way seems to require effort to overlook the
>> "decimal" in "decimal numbers can be ...".
[Nick Maclaren]
> I wouldn'
[Nick Maclaren]
>> ...
>> Yes, but that wasn't their point. It was that in (say) iterative
>> algorithms, the error builds up by a factor of the base at every
>> step. If it wasn't for the fact that errors build up, almost all
>> programs could ignore numerical analysis and still get reliable
>> a
[Stuart D. Gathman]
> I am trying to create a doctest test case for the following:
>
> def quote_value(s):
>     """Quote the value for a key-value pair in Received-SPF header
>     field if needed. No quoting needed for a dot-atom value.
>
>     >>> quote_value(r'abc\def')
>     '"abcdef"
[Robert Kern]
> ...
> ph3 = math.atan( ac3.imag / ac3.real )
> ...
Don't do that: atan2 is the correct way to compute the angle, because
the signs of both inputs are needed to determine the correct quadrant.
So do:
ph3 = math.atan2(ac3.imag, ac3.real)
instead.
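For example, with both parts negative the ratio alone can't tell the third quadrant from the first:

import math

z = complex(-1.0, -1.0)
print math.atan(z.imag / z.real)     # 0.785...: the ratio is 1.0, quadrant lost
print math.atan2(z.imag, z.real)     # -2.356...: -3*pi/4, the correct angle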
[Roy Smith]
> I certainly agree about using atan2() instead of atan(), but I'm surprised
> there's not an easier way to get the phase of a complex, just like abs()
> gives you the modulus. I can see why you wouldn't want to pollute the
> global namespace with another built-in just for this purpose
[Michal Kwiatkowski]
> I was just wondering...
>
> Python 2.3.5 (#2, Mar 6 2006, 10:12:24)
> [GCC 4.0.3 20060304 (prerelease) (Debian 4.0.2-10)] on linux2
> Type "help", "copyright", "credits" or "license" for more information.
> >>> import timeit
> >>> a = timeit.Timer('2**1')
> >>> b = t
[EMAIL PROTECTED]
> I think there might be something wrong with the implementation of
> modulus.
>
> Negative float values close to 0.0 break the identity "0 <= abs(a % b) <
> abs(b)".
While that's a mathematical identity, floating point has finite
precision. Many mathematical identities can fai
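For instance (the -1e-050 % 2.0 example quoted later in this digest): Python's % takes the sign of the divisor, and the mathematically exact result 2.0 - 1e-50 rounds up to 2.0.

a, b = -1e-050, 2.0
print a % b                        # 2.0
print 0 <= abs(a % b) < abs(b)     # False: abs(a % b) == abs(b)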
[Alex Martelli]
>> ...
>> You can compute the requested answer exactly with no random number
>> generation whatsoever: compute the probability of each result from
>> 0 to 1000, then sum the probabilities of entries that are exactly 390
>> apart.
[Elliot Temple]
> That was the plan, but how do I ge
[Elliot Temple]
> I think I got it. I noticed my code is essentially the same as Tim
> Peters' (plus the part of the problem he skipped). I read his code 20
> minutes before recreating mine from Alex's hints. Thanks!
>
> def main():
>     ways = ways_to_roll()
>     total_ways = float(101**10)
>
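A sketch (not Elliot's actual ways_to_roll) of how the table his snippet assumes can be built by repeated convolution: ways[t] is the number of ways 10 independent picks from 0..100 can sum to t.

def ways_to_roll(npicks=10, nfaces=101):
    ways = [1]                              # zero picks: one way to total 0
    for _ in range(npicks):
        new = [0] * (len(ways) + nfaces - 1)
        for total, count in enumerate(ways):
            for face in range(nfaces):
                new[total + face] += count
        ways = new
    return ways

# sanity check: the counts over all totals add up to 101**10
assert sum(ways_to_roll()) == 101 ** 10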
[Jeffrey Barish]
> Several methods in Queue.Queue have warnings in their doc strings that they
> are not reliable (e.g., qsize). I note that the code in all these methods
> is bracketed with lock acquire/release. These locks are intended to
> protect the enclosed code from collisions with other t
[EMAIL PROTECTED]
> I see a C/python program that we're using spending a lot of time in
> this function, far more than we think it should. What is it?
PyEval_EvalFrame is the heart of the CPython interpreter: it's a very
large function that _implements_ the interpreter, marching through the
byte
[kyo guan]
> Python version 2.4.3
>
> >>> l=range(50*1024*100)
>
> after this code, you can see that Python is using about 80MB.
>
> then I do this
>
> >>> del l
>
> after this, Python is still using more than 60MB. Why doesn't Python free
> my memory?
It's that you've created 5 million i
[Russell Warren]
> Does anyone have an easier/faster/better way of popping from the middle
> of a deque than this?
>
> class mydeque(deque):
>     def popmiddle(self, pos):
>         self.rotate(-pos)
>         ret = self.popleft()
>         self.rotate(pos)
>         return ret
As Tim Chase said, the easiest way
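For reference, a runnable version of the snippet above (the only changes are the import and comments); rotate/popleft/rotate is still O(len(deque)) per call, since a deque can't reach its middle in constant time.

from collections import deque

class mydeque(deque):
    def popmiddle(self, pos):
        self.rotate(-pos)        # bring element `pos` to the left end
        ret = self.popleft()     # remove it
        self.rotate(pos)         # restore the original ordering
        return ret

d = mydeque(range(10))
print d.popmiddle(3)             # 3
print list(d)                    # [0, 1, 2, 4, 5, 6, 7, 8, 9]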
[Russell Warren]
> ...
> As to indexing into a deque being O(index)... I didn't realize that.
> It is certainly something to keep in mind, though... looping through
> the contents of a deque would obviously be a bad idea with this being
> the case! I wonder if the generator for the deque helps red
[Andrew Koenig, on the counter intuitive -1e-050 % 2.0 == 2.0 example]
>> I disagree. For any two floating-point numbers a and b, with b != 0, it
>> is always possible to represent the exact value of a mod b as a
>> floating-point number--at least on every floating-point system I have ever
>> enco
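math.fmod illustrates the quoted claim: it keeps the dividend's sign, and the true remainder is exactly representable, whereas % follows the divisor's sign and has to round.

import math

print math.fmod(-1e-050, 2.0)    # -1e-50, the exact remainder
print -1e-050 % 2.0              # 2.0: the exact value 2.0 - 1e-50 rounds up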
[Vedran Furač]
> I think that these results must be the same:
>
> In [3]: math.atan2(-0.0,-1)
> Out[3]: -3.1415926535897931
Whether -0.0 and 0.0 are different floats internally depends on your
hardware floating-point; on most machines today, they are different
floats, but _compare_ equal to each ot
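A small illustration on IEEE-754 hardware; the -(0.0) spelling is used to sidestep the literal-folding issue mentioned earlier in this digest.

import math

negzero = -(0.0)
print negzero == 0.0                # True: the two zeros compare equal
print math.atan2(negzero, -1.0)     # -3.14159...: atan2 sees the sign bit
print math.atan2(0.0, -1.0)         #  3.14159...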
[EMAIL PROTECTED]
> Below are 2 files that isolate the problem. Note, both programs hang
> (stop responding)
What does "stop responding" mean?
> with hyper-threading turned on (a BIOS setting), but
> work as expected with hyper-threading turned off.
>
> Note, the Windows task manager shows 2 CPU
[EMAIL PROTECTED]
> Because of multithreading semantics, this is not reliable. This
> sentence is found in the Python documentation for "7.8.1 Queue
> Objects".
>
> This scares me! Why would Queue.qsize(), Queue.empty( ), and a
> Queue.full() not be reliable?
Because they may not be telling the
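The usual pattern is not to test first but to attempt the operation and handle the exception, since any answer from qsize()/empty()/full() can be stale by the time you act on it. A sketch:

import Queue

q = Queue.Queue()

# Racy: another thread may drain the queue between the test and the get().
#     if not q.empty():
#         item = q.get()

# Safer: just try it.
try:
    item = q.get(block=False)
except Queue.Empty:
    item = None                  # nothing was available at that instant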
>> What do you mean "stop responding"?
[EMAIL PROTECTED]
> Both threads print their thread numbers (either 1 or 2) approximately
> every 10 seconds. However, after a while (minutes to hours) both
> programs (see above) hang!
Where "hang" means they stop printing.
> Pressing ctrl-c (after the pr
[David C.Ullrich]
> Would there be issues (registry settings, environment
> variables, whatever) if a person tried to install
> versions 1.x and 2.x simultaneously on one Windows
> system? Windows 98, if it matters.
>
> (I can handle the file associations with no problem.)
There are generally no i
[Tim Peters]
>> I didn't run it for hours ;-)
[EMAIL PROTECTED]
> Please try.
OK, I let the first test program run for over 24 hours by now. It
never hung. Overnight, the box did go into sleep mode, but the test
woke itself up after sleep mode ended, and the threads reported they
[Michele Petrazzo]
> I'm doing some tests on my debian testing and I see a very strange
> memory problem with py 2.5a2 (just downloaded) and compiled with gcc
> 4.1.0, but not with the gcc 3.3.5:
>
> My test are:
>
> #--test.py
> import sys
> if sys.version.startswith("2.3"):
>     from sets import S
[EMAIL PROTECTED]
> Below are 2 files. The first is a Python program that isolates the
> problem within less than 1 hour (often just a few minutes).
It does not on my box. I ran that program, from a DOS shell, using
the released Windows Python 2.4.3. After an hour, it was still
printing. I lef
[Serge Orlov]
> BTW python 2.5 now returns free memory to OS, but if a program keeps
> allocating more memory with each new iteration in python 2.4, it will
> not help.
No version of CPython ever returns memory to "the OS". All memory is
obtained via the platform C's malloc() or realloc(), and any
[EMAIL PROTECTED]
> Hi, I've written a top-down recursive descent parser for SPICE circuit
> descriptions. For debugging purposes, I wanted each production
> rule/function to know what its own name was, so at the beginning of
> each rule/function, I make a call to inspect.stack()[0][3] (I think...)
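A sketch of the technique the poster describes (the function name here is made up): inspect.stack()[0] is the record for the current frame, and element 3 of that record is the function's name; sys._getframe() gets the same thing more cheaply.

import inspect, sys

def some_production_rule():
    name = inspect.stack()[0][3]                     # 'some_production_rule'
    assert name == sys._getframe().f_code.co_name    # cheaper equivalent
    return name

print some_production_rule()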
[Richard Meraz]
> We need to capture more than 99 named groups using python regular
> expressions.
> ...
> its clear why the language designers have decided on this limitation. For
> our system, however, it is essential that we be able to capture an arbitrary
> number of groups.
>
> Could anyone o
[raghu, on Heiko Wundram's test program:
import sys
x = {}
i = 0
def test():
    global x, i
    x[i] = "test"
    i += 1
    del x[i-1] # Properly clean up x.
for j in xrange(1):
    print "Before", j, ":", sys.gettotalrefcount()
    test()
    print "After", j, ":", sys.gettotalrefcount()
]
> Hm
[Boris Borcic]
> Assuming that the items of my_stream share no content (they are
> dumps of db cursor fetches), is there a simple way to do the
> equivalent of
>
> def pickles(my_stream) :
>     from cPickle import load,dumps
>     while 1 :
>         yield dumps(load(my_stream))
>
> without the
[elventear]
> I am the in the need to do some numerical calculations that involve
> real numbers that are larger than what the native float can handle.
>
> I've tried to use Decimal, but I've found one main obstacle that I
> don't know how to sort. I need to do exponentiation with real
> exponents,
[John Salerno, on the difference between `open` and `file`]
> Interesting. What is the difference between them now?
In 2.5 `file` is unchanged but `open` becomes a function:
>>> file
<type 'file'>
>>> open
<built-in function open>
[Tim Peters]
>> In 2.5 `file` is unchanged but `open` becomes a function:
>>
>> >>> file
>> <type 'file'>
>> >>> open
>> <built-in function open>
[Paul Rubin]
> So which one are we supposed to use?
Use for what? If you're trying to check an object's type, use the
[Raymond L. Buvel, on
http://calcrpnpy.sourceforge.net/clnumManual.html
]
> The clnum module handles this calculation very quickly:
>
> >>> from clnum import mpf
> >>> mpf("1e10000") ** mpf("3.01")
> mpf('9.99932861e30099',26)
That's probably good enough for the OP's needs
[Raymond L. Buvel, on
http://calcrpnpy.sourceforge.net/clnumManual.html
]
>>> The clnum module handles this calculation very quickly:
>>>
>>> >>> from clnum import mpf
>>> >>> mpf("1e10000") ** mpf("3.01")
>>> m
[EMAIL PROTECTED]
> ##Holy Mother of Pearl!
> ##
> ##>>> for i in range(10):
> ##        for j in range(10):
> ##            print '%4d' % (gmpy.mpz(i)*gmpy.mpz(j)),
> ##        print
> ##
> ##
> ##    0    0    0    0    0    0    0    0    0    0
> ##    0    1    2
[John Machin, quoting reindent.py docs]
>> remove empty lines at the end of files. Also ensure the last line ends
>> with a newline.
[John Salerno]
> don't those two things conflict with one another?
No. This is the repr of a file with (3) empty lines at the end:
"a file\n\n \n \t\n"
[Paul Rubin]
> ...
> When I try to do it in a separate thread:
>
> import time, itertools
> def remote_iterate(iterator, cachesize=5):
>     # run iterator in a separate thread and yield its values
>     q = Queue.Queue(cachesize)
>     def f():
>         print 'thread start
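A minimal runnable sketch of that pattern (not the quoted code; the sentinel object is an assumption): a worker thread feeds a bounded Queue and the generator yields from it.

import threading, Queue

def remote_iterate(iterator, cachesize=5):
    q = Queue.Queue(cachesize)
    sentinel = object()                  # marks exhaustion of the source

    def worker():
        for value in iterator:
            q.put(value)                 # blocks while the cache is full
        q.put(sentinel)

    t = threading.Thread(target=worker)
    t.setDaemon(True)
    t.start()
    while True:
        value = q.get()
        if value is sentinel:
            break
        yield value

print list(remote_iterate(iter(range(10))))    # [0, 1, ..., 9]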
Copyright (c) 1999-2008 Tim Peters
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify,
[EMAIL PROTECTED]
> Hi everybody,
> I have a problem with Python/C API and memory management.
>
> I'm using
> Python 2.3.5 (#1, Jan 4 2006, 16:44:27)
> [GCC 4.0.2 20050901 (prerelease) (SUSE Linux)] on linux2
>
> In my C-module I have a loop like this:
> ***
[robert]
> ...
> PS: how does ZODB work with this kind of problem? I thought is uses cPickle?
It does. Each thread in a ZODB application typically uses its own
connection to a database. As a result, each thread gets its own
consistent view of database objects, which can (and routinely does)
vary
[Paul Rubin]
> It looks to me like you can't have two threads in the same generator:
You can't even have one thread in a generator-iterator get away with
activating the generator-iterator while it's already active. That's
an example in PEP 255:
"""
Restriction: A generator cannot be resumed w
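The single-thread version of that restriction, essentially PEP 255's example:

def g():
    i = me.next()        # tries to resume the generator while it's running
    yield i

me = g()
me.next()                # raises ValueError: generator already executing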
[Duncan Booth]
> No, Python doesn't run the garbage collector when it is exiting.
Actually, it does. What it doesn't do is call the garbage collector
twice when it exits, although it used to ;-)
> What it does is to delete all the globals from each module in turn. So:
Yup. The code is in funct
[EMAIL PROTECTED]
> Icon is a language that shares some similarities with Python:
> http://www.cs.arizona.edu/icon/
>
> During the execution of a Icon script there are ways to visualize the
> memory:
> http://www.cs.arizona.edu/icon/progvis/memmon/memmon.htm
>
> Related pages:
> http://www.cs.arizon
> For more details about the plan for Python 2.5, see:
>
> http://www.python.org/doc/peps/pep-0356/
Looks like links to PEPs are completely hosed at the moment. For
example, the link above displays an empty directory, and
http://www.python.org/doc/peps
displays a directory full of empty
[EMAIL PROTECTED]
> For writing testcode, it looks like there's three ways that it's
> typically done:
>
> (1). using the doctest module,
>
> (2). using the unittest module (i.e. "pyunit"), or else
>
> (3). just putting an "if __name__ == '__main__':" at the bottom of your
> mod
[Douglas Alan]
>> I've noticed that there is little to no spam in comp.lang.python
>> and am wondering how this is accomplished.
[Skip Montanaro]
> Most mailing lists which originate on mail.python.org have SpamBayes
> filtering in front of them.
Worth noting that the SpamBayes project started sp
[lord trousers]
>>> Is there a way I can get hold of these kinds of statistics for
>>> debugging?
[Martin v. Löwis]
>> This is best done when Python is built in debug mode.
>> sys.gettotalrefcount then gives you the number of INCREF
>> calls for which no DECREF has been made; you said that
>> this
[Steve R. Hastings]
> So, Python 2.5 will have new any() and all() functions.
> http://www.python.org/dev/peps/pep-0356/
>
>
> any(seq) returns True if any value in seq evaluates true, False otherwise.
>
> all(seq) returns True if all values in seq evaluate true, False otherwise.
>
> I have a quest
[Steven D'Aprano]
> ...
> While the implemented behaviour might be more practical than the
> alternatives, it is still worryingly paradoxical. If "All sheep are woolly",
> then obviously it must also be true that "Any sheep is woolly". More
> formally, if all(X), then any(X) -- except for the case of
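The documented pure-Python equivalents make the empty-sequence case explicit: all() of nothing is vacuously True while any() of nothing is False, so all(X) implies any(X) only when X is non-empty.

def any(iterable):
    for element in iterable:
        if element:
            return True
    return False

def all(iterable):
    for element in iterable:
        if not element:
            return False
    return True

print any([]), all([])    # False True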
[jUrner]
>> def calc_time_res():
>>     now = time.time
>>     start = now()
>>     x = start
>>     while start == x:
>>         x = now()
>>     print x, start # <--
>>     print x - start
>>
>> print calc_time_res()
>> 1.50203704834e-05
>>
>> Something is going wrong here.
>> If you look at the
[John Salerno]
> Is 'Python 3000' just a code name for version 3.0, or will it really be
> called that when it's released?
The smart money is on changing the name to Ecstasy, to leverage
marketing publicity from the hallucinogenic club drug of the same
name. "class" will be renamed to "rave", and
[Paul Du Bois]
> Using win32 python 2.4.1, I have a minimal test program:
>
> def generate():
>     raise TypeError('blah')
>     yield ""
>
> print "\n".join((generate()))
>
> Executing the program gives:
>
> Traceback (most recent call last):
> File "", line 5, in ?
> TypeError: sequence expect
[Olivier Langlois]
> ...
> I have kept thinking about the original problem and I now believe that
> the only solution if he wants to store 3.6GB of data in a Python script
> is to recompile Python in 64 bits. I do not know if this is something
> that someone has already done successfully...
I did
[EMAIL PROTECTED]
>> what's the standard way for a "for" loop with float increments?
[Dan Sommers]
> Use a while loop instead:
>
> f = initial_value
> while f <= final_value:
>     process(f)
>     f = f + increment
>
> Note that there is no general guarantee that f will actually b
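One common way around the accumulation problem (a sketch, reusing the names from the quoted snippet) is to derive each value from an integer counter instead of repeatedly adding the increment:

def float_range_apply(initial_value, final_value, increment, process):
    n_steps = int(round((final_value - initial_value) / increment))
    for i in xrange(n_steps + 1):
        process(initial_value + i * increment)   # error doesn't accumulate

float_range_apply(0.0, 1.0, 0.1, lambda f: None)   # visits 0.0, 0.1, ..., 1.0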
[Marco Sulla]
> Excuse me, Tim Peters, what do you think about my (probably heretical)
> proposal of simply raising an exception instead of returning a NaN, as
> Python already does for division by zero?
Sorry, I'm missing context. I don't see any other message(s) from you
in th