[Bryan Olson]
> ...
> For sorting, we had the procedure 'sort', then added the pure
> function 'sorted'. We had a 'reverse' procedure, and wisely
> added the 'reversed' function.
>
> Hmmm... what could we possibly do about 'shuffle'?
'permuted' is the obvious answer, but that would leave us ope
[EMAIL PROTECTED]
> I have a super-simple need to just walk the files in a single directory.
>
> I thought this would do it, but "permanentFilelist" ends up containing
> all folders in all subdirectories.
All folders everywhere, or all file (not directory) names in the top
two levels? It looks li
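For just one directory level, os.listdir() plus an isfile() check should be all you need; a minimal (untested) sketch, with the directory name made up:

    import os
    path = 'C:/some/dir'    # whatever single directory you're interested in
    filenames = [name for name in os.listdir(path)
                 if os.path.isfile(os.path.join(path, name))]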
[Alex Martelli]
...
>> In mathematics, 1 is not "the same" as 1.0 -- there exists a natural
>> morphism of integers into reals that _maps_ 1 to 1.0, but they're still
>> NOT "the same" thing. And similarly for the real-vs-complex case.
[Xavier Morel]
> I disagree here, 1 and 1.0 are the same math
[Paul Rubin]
...
>> David J.C. MacKay
>> Information Theory, Inference, and Learning Algorithms
>>
>> Full text online:
>> http://www.inference.phy.cam.ac.uk/mackay/itila/
...
>> The printed version is somewhat expensive, but according to the
>> following analysis it's a better barg
[3c273]
> I'm just curious as to why the default rounding in the decimal module is
> ROUND_HALF_EVEN instead of ROUND_HALF_UP.
Because it's the best (numerically "fairest") rounding method for most
people most of the time.
> All of the decimal arithmetic I do is rounded half up and I can't think
[LordLaraby]
> If 'bankers rounding' is HALF_ROUND_EVEN, what is HALF_ROUND_UP?
Not banker's rounding ;-). Same answer if you had said ROUND_HALF_UP
instead (which I assume you intended) -- most of these don't have
cute names.
> I confess to never having heard the terms.
ROUND_HALF_UP etc are
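For instance, the difference only shows up on exact halves (untested):

    import decimal
    d = decimal.Decimal('2.5')
    print d.quantize(decimal.Decimal('1'), rounding=decimal.ROUND_HALF_EVEN)  # -> 2
    print d.quantize(decimal.Decimal('1'), rounding=decimal.ROUND_HALF_UP)    # -> 3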
[Paul Rubin]
>> I wouldn't have figured out that a "car park" was a parking lot. I
>> might have thought it was a park where you go to look at scenery from
>> inside your car. Sort of a cross between a normal park and a drive-in
>> movie.
[Grant Edwards]
> ;)
>
> That's a joke, right?
Probably
[Kay Schluehr]
>> This is interesting. If we define
>>
>> def f():
>>     print str(1.1)
>>
>> and disassemble the function, we get:
>>
> >>> dis.dis(f)
>   2           0 LOAD_GLOBAL              0 (str)
>               3 LOAD_CONST               1 (1.1000000000000001) # huh?
[Fredrik Lundh]
> huh h
[Joshua Luben]
> I thought I would post this here first before seeking more experienced ears
> for this particular strangeness.
>
> I have Python 2.4.2 installed from source on a dual processor dell server.
> These are x86_64 processors (verified by /bin/arch) (aka emt64 extensions).
>
> uname -a gi
[Bryan Olson]
>> Does no one care about an internal error in the regular expression
>> engine?
[Steve Holden]
> Not one that requires parsing a 100 kilobyte re that should be replaced
> by something more sensible, no.
I care: this is a case of not detecting information loss due to
unchecked down
[Claudio Grondi]
>> Python 2.4.2 (#67, Sep 28 2005, 12:41:11) [MSC v.1310 32 bit (Intel)]
>> on win32 - IDLE 1.1.2
>> >>> a=[]
>> >>> a.append(a)
>> >>> b=[]
>> >>> b.append(b)
>> >>> a==b
>>
>> Traceback (most recent call last):
>>File "", line 1, in -toplevel-
>> a==b
>> RuntimeError:
[Kay Schluehr]
> I concur and I wonder why CAS like e.g. Maple that represent floating
> point numbers using two integers [1] are neither awkward to use nor
> inefficient.
My guess is that it's because you never timed the difference in Maple
-- or, perhaps, that you did, but misinterpreted the res
[Grant Edwards]
>> ...
>> The low 32 bits match, so perhaps you should just use that
>> portion of the returned hash?
>>
>> >>> hex(12416037344)
>> '0x2E40DB1E0L'
>> >>> hex(-468864544 & 0xFFFFFFFFL)
>> '0xE40DB1E0L'
>>
>> >>> hex(12416037344 & 0xFFFFFFFFL)
>> '0xE40DB1E0L'
>> >>> hex
[ Boris Borcic]
> x.sort(cmp = lambda x,y : cmp(random.random(),0.5))
>
> pick a random shuffle of x with uniform distribution ?
Say len(x) == N. With Python's current sort, the conjecture is true
if and only if N <= 2.
> Intuitively, assuming list.sort() does a minimal number of comparisons to
...
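If a uniformly distributed shuffle is what's wanted, random.shuffle() already does that directly (untested):

    import random
    x = range(10)
    random.shuffle(x)    # uniform random permutation, in place
    print x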
[Jon Smirl]
> I know in advance how many items will be added to the dictionary. Most
> dictionary implementations I have previously worked with are more
> efficient if they know ahead of time how big to make their tables.
Richard Jones spent considerable time investigating whether
"pre-sizing
[EMAIL PROTECTED]
> Admittedly this problem causes no actual functional issues aside from
> an occasional error message when the program exits. The error is:
>
> Unhandled exception in thread started by
> Error in sys.excepthook:
> Original exception was:
>
> Yes all that info is blank.
That's ty
[Simen Haugen]
>>> How can I convert a python datetime to a timestamp? It's easy to convert
>>> a timestamp to datetime (datetime.datetime.fromtimestamp(), but the
>>> other way around...?)
[John Machin]
>> Is the timetuple() method what you want?
>>
>> #>>> import datetime
>> #>>> n = datetime.da
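Feeding the timetuple to time.mktime() then gives a seconds-since-the-epoch timestamp (local time, microseconds dropped); an untested sketch:

    import datetime, time
    n = datetime.datetime.now()
    stamp = time.mktime(n.timetuple())
    print stamp, datetime.datetime.fromtimestamp(stamp)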
[Peter Hansen]
>> I'm investigating a puzzling problem involving an attempt to
>> generate a constant containing an (IEEE 754) "infinity" value. (I
>> understand that special float values are a "platform-dependent
>> accident" etc...)
[also Peter]
> ...
> My guess about marshal was correct.
Yup.
[Tim Peters]
>...
>> It has a much better chance of working from .pyc in Python 2.5.
>> Michael Hudson put considerable effort into figuring out whether the
>> platform uses a recognizable IEEE double storage format, and, if so,
>> marshal and pickle take di
[Dan Christensen]
> My student and I are writing a C extension that produces a large
> integer in binary which we'd like to convert to a python long. The
> number of bits can be a lot more than 32 or even 64. My student found
> the function _PyLong_FromByteArray in longobject.h which is exactly
>
[Ben Finney]
>> I don't see why you're being so obtuse
[Terry Reedy]
> I think name calling is out of line here.
Name calling is always out of line on comp.lang.python. Unless it's
done by Guido. Then it's OK. Anyone else, just remind them that even
Hitler had better manners. That always calm
[EMAIL PROTECTED]
> Has anyone ever thought about a set which references its elements weakly?
Yes, and there are excruciating subtleties. I only implemented as
much of one as ZODB needed at the time:
# A simple implementation of weak sets, supplying just enough of Python's
# sets.Set interface for
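The general shape is something like this -- just an illustrative sketch built on weakref callbacks, not the ZODB code:

    import weakref

    class WeakSet(object):
        # Holds weak references; elements vanish when nothing else
        # keeps them alive.
        def __init__(self):
            self.data = {}    # id(obj) -> weakref to obj

        def add(self, obj):
            def _remove(wr, data=self.data, key=id(obj)):
                del data[key]    # called when obj is garbage-collected
            self.data[id(obj)] = weakref.ref(obj, _remove)

        def __contains__(self, obj):
            wr = self.data.get(id(obj))
            return wr is not None and wr() is obj

        def __iter__(self):
            for wr in self.data.values():
                obj = wr()
                if obj is not None:
                    yield obj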
[David Hirschfield]
> Question from a post to pygtk list...but it probably would be better
> answered here:
>
> I encountered a nasty problem with an external module conflicting with
> my python threads recently, and right now the only fix appears to be to
> turn off garbage collection while the cr
[Giovanni Bajo]
> I understand your concerns, but I have to remind you that most bug reports
> submitted by users go totally ignored for several years, or, better, forever.
> I do not have a correct statistic for this,
Indeed you do not.
> but I'm confident that at least 80% of the RFE or pa
[Martitza]
> Hi. I work for a small company (actually in process of forming)
> interested in embedding or extending python as part of our commercial
> non-open-source product. We have legal counsel, but are interested in
> the spirit as well as the letter of the law. Not much seems to have
> be
[Martitza]
> Mr. Peters:
Na, my father's dead -- you can call me Uncle Timmy ;-)
> Thank you for so kindly taking the time to resolve my misunderstandings
> and to elaborate on the intent of the PSF.
>
> In particular, thank you for explaining in plain language how the
> licenses stack. I'm sure
[EMAIL PROTECTED]
> ...
> G5-fiwihex:~ eur$ python
> Python 2.3.5 (#1, Mar 20 2005, 20:38:20)
> [GCC 3.3 20030304 (Apple Computer, Inc. build 1809)] on darwin
> Type "help", "copyright", "credits" or "license" for more information.
> >>> import time
> >>> time.time()
> 1160580871.258379
> >>>
>
> M
[Frederic Rentsch]
> Working with read and write operations on a file I stumbled on a
> complication when writes fail following a read to the end.
>
> >>> f = file ('T:/z', 'r+b')
> >>> f.write ('abcdefg')
> >>> f.tell ()
> 30L
> >>> f.seek (0)
> >>> f.read ()
> 'abcdefg'
> >>> f.flush ()
[Frederic Rentsch]
> Thanks a lot for your input. I seemed to notice that everything
> works fine without setting the cursor as long as it stops before the end
> of the file. Is that also a coincidence that may not work?
"if you want to read following a write, or write following a read, on
[John Henry]
> If I have a bunch of sets:
>
> a = set((1, 2, 3))
> b = set((2, 3))
> c = set((1, 3))
>
>
> What's the cleanest way to say:
>
> 1) Give me a list of the items that are in all of the sets? (3 in the
> above example)
list(a & b & c)
> 2) Give me a list of the items that are not
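For part 1 with an arbitrary number of sets, the same idea spelled once (untested):

    a, b, c = set((1, 2, 3)), set((2, 3)), set((1, 3))
    sets = [a, b, c]
    print list(reduce(lambda s, t: s & t, sets))    # [3] -- items in all of them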
[Petra Chong]
> I am using Python 2.3 and ZODB (without the rest of Zope) with the
> following pattern:
>
> * One process which writes stuff to a ZODB instance (call it test.db)
> * Another process which reads stuff from the above ZODB instance
> test.db
>
> What I find is that when the first proce
[Ernesto García García]
> it's very common that I have a list and I want to print it with commas
> in between. How do I do this in an easy manner, without having the
> annoying comma at the end?
>
>
>
> list = [1,2,3,4,5,6]
>
> # the easy way
> for element in list:
>     print element, ',',
>
> pr
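The usual spelling is str.join on the stringified elements (untested):

    lst = [1, 2, 3, 4, 5, 6]
    print ', '.join(map(str, lst))    # 1, 2, 3, 4, 5, 6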
[EMAIL PROTECTED]
> ...
> As I see it, reference copying is a very useful performance and memory
> optimization. But I don't think it should undermine the validity of
> assert(a==b) as a predictor of invariance under identical operations.
So, as Alex said last time,
Try concisely expressing
[Scott David Daniels]
>> For example, time timsort (Python's internal sort) on pre-sorted
>> data; you'll find it is handled faster than random data.
O(N) vs O(N log N), in fact.
[Lawrence D'Oliveiro]
> But isn't that how a reasonable sorting algorithm should behave? Less
> work to do if the data
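Easy to see directly; a rough (untested) timing sketch:

    import random, time
    data = [random.random() for i in xrange(10**6)]
    for label, seq in [('random', list(data)), ('sorted', sorted(data))]:
        start = time.time()
        sorted(seq)
        print label, time.time() - start    # the pre-sorted copy finishes much faster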
[Jim Segrave]
> Actually, presorted lists are not a bad case for heapsort - it's quite
> immune to any existing order or lack thereof,
Write a heapsort and time it. It's not a difference in O() behavior,
but more memory movement is required for a sorted list because
transforming the list into a m
[Tim Peters]
>> ...
>> O(N log N) sorting algorithms helped by pre-existing order are
>> uncommon, unless they do extra work to detect and exploit
>> pre-existing order.
[Lawrence D'Oliveiro]
> Shellsort works well with nearly-sorted data. It's basically a
[Wojciech Muła]
>> You have to use operator **, i.e. 34564323**456356
Or the builtin pow() instead of math.pow().
[Gary Herron]
> That's not very practical. That computation will produce a value with
> more than 3.4 million digits.
Yes.
> (That is, log10(34564323)*456356 = 3440298.) Python will
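The distinction in a nutshell (untested): ** and the builtin pow() work on unbounded Python integers, while math.pow() coerces to a C double:

    print 34564323**5            # an exact (long) integer
    import math
    print math.pow(34564323, 5)  # a float, good to only ~16 significant digits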
[Nick Maclaren]
Firstly, a FAR more common assumption is that integers wrap in two's
complement - Python does not do that.
[Grant Edwards]
>>> It used to
[Fredrik Lundh]
>> for integers ? what version was that ?
[Grant]
> Am I remembering incorrectly?
Mostly but not entirely.
> Didn'
[EMAIL PROTECTED]
> I would think every time you add an item to a list you must increase
> reference count of that item.
_Someone_ needs to. When the function called to add the item does the
incref itself, then it would be wrong for the caller to also incref
the item.
> http://docs.python.org/api
[Carl J. Van Arsdall]
> Hey everyone, cPickle is raising an ImportError that I just don't quite
> understand.
When that happens, the overwhelmingly most likely cause is that the
set of modules on your PYTHONPATH has changed since the pickle was
first created, in ways such that a module _referenced
[MTD]
> I've been messing about for fun creating a trial division factorizing
> function and I'm naturally interested in optimising it as much as
> possible.
>
> I've been told that iteration in python is generally more
> time-efficient than recursion. Is that true?
Since you heard it from me to b
[EP <[EMAIL PROTECTED]>]
> This inquiry may either turn out to be about the suitability of the
> SHA-1 (160 bit digest) for file identification, the sha function in
> Python ... or about some error in my script
It's your script. Always open binary files in binary mode. It's a
disaster on Windows
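Something along these lines (untested; the filename is made up) behaves the same on Windows and Unix:

    import sha
    f = open('somefile.bin', 'rb')    # 'rb', not 'r' -- crucial on Windows
    print sha.new(f.read()).hexdigest()
    f.close()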
[Kay Schluehr]
> You might use a separate prime generator to produce prime factors. The
> factorize algorithm becomes quite simple and configurable by prime
> generators.
Alas, yours was _so_ simple that it always takes time proportional to
the largest prime factor of n (which may be n) instead of
[Matthew Wilson]
> The random.jumpahead documentation says this:
>
> Changed in version 2.3: Instead of jumping to a specific state, n steps
> ahead, jumpahead(n) jumps to another state likely to be separated by
> many steps..
>
> I really want a way to get to the Nth value in a random
[MTD <[EMAIL PROTECTED]>]
> I've been testing my recursive function against your iterative
> function, and yours is generally a quite steady 50% faster on
> factorizing 2**n +/- 1 for 0 < n < 60.
If you're still not skipping multiples of 3, that should account for most of it.
> I think that, for
[Russell Warren]
> I'm guessing no, since it skips down through any Lock semantics,
Good guess :-) It's also unsafe because some internal conditions must
be notified whenever the queue becomes empty (else you risk deadlock).
> but I'm wondering what the best way to clear a Queue is then.
>
> Ese
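One way that stays inside the advertised Queue interface (untested):

    import Queue

    def drain(q):
        # Pull items until the queue reports empty; each get_nowait()
        # does its own locking and condition notification.
        try:
            while True:
                q.get_nowait()
        except Queue.Empty:
            pass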
[Russell Warren]
>>> I'm guessing no, since it skips down through any Lock semantics,
[Tim Peters]
>> Good guess :-) It's also unsafe because some internal conditions must
>> be notified whenever the queue becomes empty (else you risk deadlock).
[Fredrik Lundh]
>
[j.c.sackett]
> I'm using the threading module to accomplish some distributed processing on
> a project, and have a basic (I hope) question that I can't find an answer to
> elsewhere.
>
> I've noted that there's a lot of documentation saying that there is no
> external way to stop a thread,
True.
[bruce]
> perl has the concept of "die". does python have anything similar. how can a
> python app be stopped?
>
> the docs refer to a sys.stop.
Python docs? Doubt it ;-)
> but i can't find anything else... am i missing something...
>>> import sys
>>> print sys.exit.__doc__
exit([status])
Exit
[Nathan Bates]
> Are the Python developers running Python under Valgrind?
Please read Misc/README.valgrind (in your Python distribution).
[Chandrashekhar kaushik]
> Can an object pickled and saved on a little-endian machine be unpickled
> on a big-endian machine ?
Yes. The pickle format is platform-independent (native endianness
doesn't matter, and neither do the native sizes of C's various integer
types).
> Does python handle thi
[Claudio Grondi]
> I have a 250 Gbyte file (occupies the whole hard drive space)
Then where is Python stored ;-)?
> and want to change only eight bytes in this file at a given offset of appr.
> 200
> Gbyte (all other data in that file should remain unchanged).
>
> How can I do that in Python?
S
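Open the file in update mode, seek to the offset, and write; an (untested) sketch with a made-up filename:

    f = open('veryBigFile.dat', 'r+b')    # update mode: existing data stays put
    f.seek(200 * 1024**3)                 # absolute offset, about 200 GB in
    f.write('12345678')                   # overwrites exactly 8 bytes in place
    f.close()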
>> ... a floating point value which was "less than" the
>> value returned by the previous invocation. The computer was a pretty fast
>> one (P4 3GHz I think, running Windows XP), and this happened only between
>> very close invocations of time.clock().
[Terry Reedy]
> I se
[Claudio Grondi]
> Here an example of what I mean
> (Python 2.4.2, IDLE 1.1.2, Windows XP SP2, NTFS file system, 80 GByte
> large file):
>
> >>> f = file('veryBigFile.dat','r')
> >>> f = file('veryBigFile.dat','r+')
>
> Traceback (most recent call last):
>File "", line 1, in -toplevel-
>
[EMAIL PROTECTED]
>> The documentation for PyThreadState_SetAsyncExc says "To prevent naive
>> misuse, you must write your own C extension to call this". Anyone care
>> to list a few examples of such naive misuse?
[and again]
> No? I'll take that then as proof that it's impossible to misuse the
>
[Joachim Durchholz]
>>> Wikipedia says it's going from 2NlogN to N. If a sort is massively
>>> dominated by the comparison, that could give a speedup of up to 100%
>>> (approximately - dropping the logN factor is almost irrelevant, what
>>> counts is losing that factor of 2).
[Gabriel Genellina]
> ...you're interested in.
>>>
>>> If it's asymptotic behavior, then the O(logN) factor is a difference.
>>>
>>> If it's practical speed, a constant factor of 2 is far more relevant
>>> than any O(logN) factor.
[Tim Peters]
>> Nope. Even if you
[/T]
>> OTOH, current versions of Python (and Perl)
[/F]
> just curious, but all this use of (& Perl) mean that the Perl folks have
> implemented timsort ?
A remarkable case of independent harmonic convergence:
http://mail.python.org/pipermail/python-dev/2002-July/026946.html
Come to think
[Aahz]
>> Assuming you're talking about CPython, strings don't really participate
>> in garbage collection. Keep in mind that the primary mechanism for
>> reaping memory is reference counting, and generally as soon as the
>> refcount for an object goes to zero, it gets deleted from memory.
[Les S
[Licheng Fang]
> ...
> I want to know if there is some way to make Python RE behave like grep
> does,
Not in general, no. The matching strategies couldn't be more
different, and that's both deep and intentional. See Friedl's book
for details:
http://regex.info/
> or do I have to change to
[Licheng Fang]
> ...
> In fact, what I'm doing is handle a lot of regular expressions. I
> wanted to build VERY LONG regexps part by part and put them all into a
> file for easy modification and maintenance. The idea is like this:
>
> (*INT) = \d+
> (*DECIMAL) = (*INT)\.(*INT)
> (*FACTION) = (*DECI
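One low-tech way to get that effect is to build the big pattern with ordinary string substitution before compiling (untested sketch using the INT/DECIMAL names from your example):

    import re
    INT = r'\d+'
    DECIMAL = r'%s\.%s' % (INT, INT)    # (*DECIMAL) built from (*INT)
    pat = re.compile(DECIMAL)
    print pat.match('3.14').group()     # -> '3.14'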
[Licheng Fang]
>> Oh, please do have a look at the second link I've posted. There's a
>> table comparing the regexp engines. The engines you've tested probably
>> all use an NFA implementation.
[Bryan Olson]
> Unfortunately, the stuff about NFA's is wrong. Friedl's awful
> book
Strongly disagree:
[Licheng Fang]
>> Basically, the problem is this:
>>
>> >>> p = re.compile("do|dolittle")
>> >>> p.match("dolittle").group()
>> 'do'
...
>> The Python regular expression engine doesn't exhaust all the
>> possibilities, but in my application I hope to get the longest possible
>> match, starting fro
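Python's engine takes the leftmost alternative that lets the overall match succeed, so ordering the branches longest-first gets the longer match (untested):

    import re
    print re.match("do|dolittle", "dolittle").group()    # -> 'do'
    print re.match("dolittle|do", "dolittle").group()    # -> 'dolittle'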
[Bryan Olson]
>>> Unfortunately, the stuff about NFA's is wrong. Friedl's awful
>>> book
[Tim Peters]
>> Strongly disagree: [...]
[Bryan]
> I know I'm disagreeing with a lot of smart people in panning
> the book.
That's allowed :-)
>>>
[MRAB]
> Some time after reading about Python 2.5 and how the built-in functions
> 'min' and 'max' will be getting a new 'key' argument, I wondered how
> they would treat those cases where the keys were the same, for example:
>
> L = ["four", "five"]
> print min(L, key = len), max(L, key = len)
>
>
[Marc 'BlackJack' Rintsch]
>> What about:
>>
>> b = array.array('f', a)
[Diez B. Roggisch]
> AFAIK d and f are synonym for arrays, as python doesn't distinguish
> between these two on a type-level. And double it is in the end.
While Python has no type of its own corresponding to the native C
`flo
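The 'f' typecode really does store single-precision C floats, which shows up as extra rounding when the values come back out (untested):

    import array
    a = array.array('f', [1.1])    # stored as a 4-byte C float
    b = array.array('d', [1.1])    # stored as an 8-byte C double
    print a[0], b[0]               # the 'f' value comes back visibly rounded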
>>>
>>> L = ["four", "five"]
>>> print min(L, key = len), max(L, key = len)
>>>
>>> The result is:
>>>
>>> ('four', 'four')
[Tim Peters]
>> min() and max() both work left-to-right, and return t
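Both scan left to right and keep the first element whose key wins, so with all keys equal they return the same (first) element (untested):

    L = ["four", "five"]
    print min(L, key=len), max(L, key=len)    # four four -- ties go to the first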
[Tim Peters]
>> [...] The most valuable general technique [Friedl] (eventually ;-)
>> explained he called "unrolling", and consists of writing a regexp in
>> the form:
>>
>>normal* (?: special normal* )*
>>
>> where the sets of character
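The textbook illustration is a double-quoted string with backslash escapes, where "normal" is [^"\\] and "special" is a backslash followed by any character (untested):

    import re
    quoted = re.compile(r'"[^"\\]*(?:\\.[^"\\]*)*"')
    print quoted.search(r'say "hi \"there\"" now').group()    # -> "hi \"there\""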
[Wildemar Wildenburger]
>> I'm thinking of letting my program create hardlinks (or symlinks). I
>> know python allows doing this for ext, reiser and the like, but
>> apparently not for ntfs systems.
>> Is there any package out there that lets me create links in a platform
>> independent way?
[Calv
[Duncan Booth]
>> Windows is only smart enough to avoid duplicate entries if you tell it
>> to do that. e.g.
>>
>> PATH c:\python25;c:\python25\scripts;%PATH:c:\python25;c:\python25\scripts;=%
>>
>> will add the two Python 2.5 folders to the head of the path without
>> duplicating them.
[John Mach
[charlie strauss]
> Below is a simple program that will cause python to intermittently
> stop executing for a few seconds. it's 100% reproducible on my machine.
Any program that creates a great many long-lived container objects
will behave similarly during the creation phase. Others have
explain
[charlie strauss]
>>> Below is a simple program that will cause python to intermittently
>>> stop executing for a few seconds. it's 100% reproducible on my
>>> machine.
[Giovanni Bajo]
>> Confirmed with Python 2.4.2 on Windows.
[Jorgen Grahn]
> And Python 2.3.5 on Linux, amd64. In fact, it caus
[Steve Holden, "pins the blame" for pauses on periodic cyclic gc]
> ...
> So basically what you have here is a pathological example of why it's
> sometimes wise to disable garbage collection. Tim, did I miss anything?
Nope!
[charlie strauss]
> I want to clarify that, on my computer, the first instance of the gap occurs
> way
> before the memory is filled. (at about 20% of physical ram). Additionally the
> process monitor shows no page faults.
Python has no idea of how much RAM you have, or even of how much RAM
it's
[charlie strauss]
> Steve, digging into the gc docs a bit more, I think the behaviour I am seeing
> is still
> not expected. Namely, the program I offered has no obvious place where
> objects
> are deallocated. The way GC is supposed to work is that there are three
> levels of
> objects
>
> l
[Charlie Strauss]
>>> level0: newly created objects
>>> level1: objects that survived 1 round of garbage collection
>>> level2: objects that survived 2+ rounds of garbage collection
>>>
>>> Since all of my numerous objects are level2 objects, and none of
>>> them are ever deallocated, then I
[Michael B. Trausch]
>> Let's say that I want to work with the latitude 33.6907570. In Python,
>> that number can not be stored exactly without the aid of
>> decimal.Decimal().
>>
>> >>> 33.6907570
>> 33.690756999999998
>> >>>
>>
>> As you can see, it loses accuracy after the 6th decimal place.
[Matt Moriarity]
>> try surrounding your sum argument in brackets:
>>
>> sum([phi(x // ps[i+1], i) for i in range(a)])
>>
>> instead of:
>>
>> sum(phi(x // ps[i+1], i) for i in range(a))
[Michael Press]
> Thank you. That makes it work.
But is a wrong solution ;-) As others have suggested, it's a
[Aahz]
>>> Anyone else getting "Python-related" spam? So far, I've seen messages
>>> "from" Barry Warsaw and Skip Montanaro (although of course header
>>> analysis proves they didn't send it).
[Thomas Heller]
>> I'm getting spam not only from Barry, but also from myself ;-) with
>> forged headers
[Paddy]
>> http://en.wikipedia.org/wiki/Doctest
[Kaz Kylheku]
> I pity the hoplelessly anti-intellectual douche-bag who inflicted this
> undergraduate misfeature upon the programming language.
As a blind misshapen dwarf, I get far too much pity as it is, but I
appreciate your willingness to sha
[Simon Schuster]
> following this tutorial,
Which tutorial?
> I copied and pasted:
>
> from string import *
>
> cds = """atgagtgaacgtctgagcattagctccgtatatcggcgcacaaa
> tttcgggtgccgacctgacgcgcccgttaagcgataatcagtttgaacagctttaccatgcggtg
> ctgcgccatcaggtggtgtttctacgcgatcaagctattacgccgcagcagca
[Bill Atkins]
>> (Why are people from c.l.p calling parentheses "brackets"?)
[Kaz Kylheku]
> Because that's what they are often called outside of the various
> literate fields.
For example, the English are "outside of the various literate fields"?
FWIW, Python documentation consistently uses the
[EMAIL PROTECTED]
> Python dict is a hash table, isn't it?
Yup.
> I know that hashtable has the concept of "bucket size" and "min bucket
> count" stuff,
Some implementations of hash tables do. Python's does not. Python's
uses what's called "open addressing" instead.
> and they should be confi
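Very roughly, a colliding key is stored in some other slot of the same table, found by a probe sequence seeded by the hash; a simplified sketch of the idea (not the real code):

    def probe_sequence(table_size, h):
        # Yields the slots a CPython-style lookup would try, in order.
        # table_size must be a power of 2; h is the key's hash, treated
        # here as an unsigned 32-bit value for simplicity.
        mask = table_size - 1
        perturb = h & 0xFFFFFFFFL
        i = perturb & mask
        while True:
            yield i
            i = (5 * i + 1 + perturb) & mask
            perturb >>= 5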
[Tim Peters]
>> You should also note that copying a dict key or value (no matter of
>> what type) consists in its entirety of copying one machine address (a
>> 4- or 8-byte pointer, depending on platform).
[Lawrence D'Oliveiro]
> Actually, no. It also consists of updatin
[Tim Peters]
>>>> You should also note that copying a dict key or value (no matter of
>>>> what type) consists in its entirety of copying one machine address (a
>>>> 4- or 8-byte pointer, depending on platform).
[Lawrence D'Oliveiro]
>>> Actually,
[Tim Peters]
>> ...
>> Taking my response out of context to begin with doesn't really change
>> that I answered the question he asked ;-)
[Fredrik Lundh]
> welcome to comp.lang.python.
>
>
Thanks for the welcome! It's tough to be a newbie here ;-)
[followups set to comp.lang.python]
[Danny]
> I am just getting into OOP using Python and
> because of the floating point of large integer (L)
> square roots all the decimal expansions are truncated.
> This has created a problem because I need these
> decimal expansions in my algorithm other wise
[Rory Campbell-Lange]
>>> Is using the decimal module the best way around this? (I'm
>>> expecting the first sum to match the second). It seems
>>> anachronistic that decimal takes strings as input, though.
[Nick Maclaren]
>> As Dan Bishop says, probably not. The introduction to the decimal
>> mod
[Tim Peters]
...
>|> Well, just about any technical statement can be misleading if not
>|> qualified to such an extent that the only people who can still
>|> understand it knew it to begin with <0.8 wink>. The most dubious
>|> statement here to my eyes is the i
[Fredrik Lundh]
>> ...
>> for the OP's problem, a PIL-based solution would probably be ~100
>> times faster than the array solution, but that's another story.
[Tuvas]
> What do you mean by a PIL based solution? The reason I need to get the
> data into the string list is so I can pump it into PIL t
[Jens Theisen]
> ...
> Actually I'm not sure what this optimisation should give you anyway. The
> only circumstance under which files with only zeroes are meaningful is
> testing, and that's exactly when you don't want that optimisation.
In most cases, a guarantee that reading "uninitialized" file
[rtilley]
> When working with file and dir info recursively on Windows XP, I'm going
> about it like this:
>
> for root, dirs, files in os.walk(path):
> for f in files:
> ADD F to dictionary
> for d in dirs:
> ADD D to dictionary
>
> Is it possible to do something such a
[EMAIL PROTECTED]
> ...
> I work with Guido now and I'm conflicted. I'm still conditioned to say
> tuhple. Whenever he says toople, I just get a smile on my face. I
> think most of the PythonLabs guys pronounce it toople.
"tuhple" is a girly-man affectation. That's why Guido and I both say
the
[EMAIL PROTECTED]
> If I un-comment any line in this program below the line where I
> commented "all OK up to this point", this program locks up my
> computer.
>
> Windows task manager will show "Not Responding" for Python in the
> Applications tab and in the Performance tab the CPU usage will be
[john peter]
> what happens behind the scenes when i create a Queue.Queue() without
> specifying a maxsize? does a block of space gets allocated initially then
> dynamically "expanded" as needed?
Yes.
> if so, what is the default size of the initial space?
It's initially empty.
> is it always
[Chris McAloney]
> Okay, so I've been working on level seven for a LONG time now.
Hmm. I've been staring at that one 18 hours a day since last Friday,
and still don't have the foggiest idea. I've counted boxes, counted
pixels, broken it apart and rearranged it like a jigsaw puzzle, ran
"strings"
[Tiziano Bettio]
> PLEASE HELP...
>
> What the hell do i have to pronounce in puzzle 5
>
> Some useful hints would be awesome
That's a funny one: I didn't understand the "pronounce it" hint until
long after I solved that one. Then again, Guido & I implemented PEP
307, so I knew what to do th
[Tim Peters, whines about level 7]
[Dan Christensen, gives a huge hint]
The first time I looked at it, I thought "hmm, I should use PIL for
this". I kept thinking that too -- but for some reason wanted
to see if there was a clear way to do it without something that
"fancy"
[Skip Montanaro]
> I understand why the repr() of float("95.895") is "95.894999999999996".
> What I don't understand is why if I multiply the best approximation to
> 95.895 that the machine has by 1 I magically seem to get the lost
> precision back. To wit:
>
>% python
>Python 2.3.4 (#
[Dan]
>Dan> The floating-point representation of 95.895 is exactly
>Dan> 6748010722917089 * 2**-46.
[Skip Montanaro]
> I seem to recall seeing some way to extract/calculate fp representation from
> Python but can't find it now. I didn't see anything obvious in the
> distribution.
For Da
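math.frexp() will dig that out; e.g. (untested):

    import math
    m, e = math.frexp(95.895)       # 95.895 == m * 2**e, with 0.5 <= m < 1
    print int(m * 2**53), e - 53    # -> 6748010722917089 -46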