[issue13590] Prebuilt python-2.7.2 binaries for macosx can not compile c extensions
New submission from K Richard Pixley:

Install the Python-2.7.2 Mac installer for Lion on Lion, then attempt "easy_install -U psutil". I get:

    za-dc-dev/bin/easy_install -U psutil install_dir /Users/rich/projects/za-packages/za-dependency-checker/za-dc-dev/lib/python2.7/site-packages/
    Searching for psutil
    Reading http://pypi.python.org/simple/psutil/
    Reading http://code.google.com/p/psutil/
    Best match: psutil 0.4.0
    Downloading http://psutil.googlecode.com/files/psutil-0.4.0.tar.gz
    Processing psutil-0.4.0.tar.gz
    Running psutil-0.4.0/setup.py -q bdist_egg --dist-dir /tmp/easy_install-7euim1/psutil-0.4.0/egg-dist-tmp-QRoCe6
    unable to execute gcc-4.2: No such file or directory
    error: Setup script exited with error: command 'gcc-4.2' failed with exit status 1
    make: *** [za-dc-dev/lib/python2.7/site-packages/psutil-1.1.2-py2.7.egg] Error 1

There is no binary named "gcc-4.2" on my system. I'm running the latest Xcode (4.2.1), and the gcc in my PATH is a 4.2 binary:

    r...@fuji-land.noir.com> type gcc
    gcc is hashed (/usr/bin/gcc)
    r...@fuji-land.noir.com> gcc --version
    i686-apple-darwin11-llvm-gcc-4.2 (GCC) 4.2.1 (Based on Apple Inc. build 5658) (LLVM build 2336.1.00)
    Copyright (C) 2007 Free Software Foundation, Inc.
    This is free software; see the source for copying conditions. There is NO warranty; not even for MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.

I see no reference to "gcc-4.2" in psutil's source, nor in distutils. From this I guess that the Python configuration is looking for the same compiler that was used to produce the package (presumably on osx-10.6). Other developers tell me that they do have a "gcc-4.2" on osx-10.6. And indeed, downloading and building python-2.7.2 from source results in a python that can download and compile psutil.
--
assignee: ronaldoussoren
components: Extension Modules, Macintosh
messages: 149356
nosy: ronaldoussoren, teamnoir
priority: normal
severity: normal
status: open
title: Prebuilt python-2.7.2 binaries for macosx can not compile c extensions
type: compile error
versions: Python 2.7

___ Python tracker <http://bugs.python.org/issue13590> ___

___ Python-bugs-list mailing list Unsubscribe: http://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com
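The report's guess is right in spirit: distutils uses the compiler recorded when the interpreter itself was configured. A minimal sketch of how to confirm that, and a common workaround (the `CC=gcc` override is an assumption that the extension also builds with the system llvm-gcc, which it may not):

```python
# Inspect the compiler distutils will invoke; this value was baked in
# when the Python binary was built, not detected at install time.
import sysconfig  # on very old 2.x, use distutils.sysconfig instead

cc = sysconfig.get_config_var('CC')
print(cc)
# On the affected python.org 2.7.2 installer this prints "gcc-4.2",
# even on a machine where only /usr/bin/gcc exists.

# Workaround sketch: override the compiler via the environment before
# building, e.g. from a shell:
#     CC=gcc easy_install -U psutil
```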
[issue13749] socketserver can't stop
New submission from K Richard Pixley:

Once I've instantiated my server class along with a handler class, called server.serve_forever(), had handler.handle() called, done my work, and I'm ready to shut the whole thing down... how do I do that?

The doc says server.shutdown(), but if I call self.server.shutdown() from within handler.handle(), I seem to get a deadlock, which is exactly what I'd expect in a single-threaded system with no way to "signal" the server.serve_forever() loop, which is several frames up the stack. I've also tried sys.exit(), but it seems that the server object is catching that as an exception.

How is this expected to work? How do I terminate the server.serve_forever() loop? Both 3.2 and 2.7 appear to behave the same way. The documentation is confusing here, as it doesn't explain what is expected to happen in this case.

--
components: Library (Lib)
messages: 150965
nosy: teamnoir
priority: normal
severity: normal
status: open
title: socketserver can't stop
type: behavior
versions: Python 2.7, Python 3.2

___ Python tracker <http://bugs.python.org/issue13749> ___
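For context, a minimal sketch of the pattern the library does support: run serve_forever() in one thread and call shutdown() from another. Calling shutdown() from inside handle() deadlocks because shutdown() blocks until the serve_forever() loop exits, and that loop is itself waiting on the handler.

```python
# Sketch: shutdown() must be called from a different thread than the
# one running serve_forever().
import socket
import socketserver  # named SocketServer in Python 2
import threading

class EchoHandler(socketserver.BaseRequestHandler):
    def handle(self):
        # Echo one chunk back to the client.
        self.request.sendall(self.request.recv(1024))

server = socketserver.TCPServer(('127.0.0.1', 0), EchoHandler)
t = threading.Thread(target=server.serve_forever)
t.start()

# Do some work against the server...
client = socket.create_connection(server.server_address)
client.sendall(b'ping')
reply = client.recv(1024)
client.close()

server.shutdown()   # safe here: not the serve_forever() thread
t.join()
server.server_close()
```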
[issue13749] socketserver can't stop
K Richard Pixley added the comment:

It appears as though the problem is that shutdown() blocks waiting for the serve_forever loop to terminate, which won't happen as long as the process is blocked in shutdown(). I'd like to propose that the library be changed to eliminate the block: shutdown() can set the flag and then return. This should allow the handler to return and the serve_forever loop to notice that it has been asked to cease operations. Failing that, I think the library needs some other way to exit a socketserver.

--
___ Python tracker <http://bugs.python.org/issue13749> ___
[issue13749] socketserver can't stop
K Richard Pixley added the comment:

On second thought, my proposal is likely to break existing code, so I withdraw it. I don't know how to exit the server in a way that both works in all conditions and also continues to support existing semantics. I expect we'll need to create a new call. Perhaps "request_shutdown", which simply sets the flag without waiting?

--
___ Python tracker <http://bugs.python.org/issue13749> ___
[issue11203] gzip doc is behind
K Richard Pixley added the comment:

My point was for python-2.7. I haven't stumbled into the buffer protocol yet, so no, it doesn't really. I still think the documentation, especially the 2.7 doc, could be more explicit.

My concern here is with the use of close() becoming obscure, a second-class citizen, or an afterthought. While I greatly appreciate the context manager, there are times when I want an enduring open channel for which the context manager just isn't appropriate. Even in a world with context managers, open and close need to be available and presented as a pair. It isn't clear to me from reading the doc or looking at the examples that gzip is expected to support a close call. Yes, I concur that there is an implication, but I would prefer to see it stated explicitly, along with the explicit statement that it supports an open call.

--
___ Python tracker <http://bugs.python.org/issue11203> ___
[issue11203] gzip doc is behind
K Richard Pixley added the comment:

An interesting point, although I think that's only relevant if the documentation lists the ABC and a reference to it. (The python-3 doc essentially does this.) I see no such reference in the 2.7 GzipFile doc, which leads me to believe, from the doc alone, that it's an independent implementation of a "file-like object". This may not be important enough to even merit the time we've already put into it. Please feel free to close this ticket without change if you prefer.

--
___ Python tracker <http://bugs.python.org/issue11203> ___
[issue11203] gzip doc is behind
K Richard Pixley added the comment:

I didn't miss it. I think the close call needs equal treatment with the open call. The mention is certainly present, but it seems implicit to me; I would prefer to see it listed explicitly. But I also don't think it's important enough in the 2.7 docs to discuss much further. You've convinced me that it's not worth fixing. Let's drop it.

--
___ Python tracker <http://bugs.python.org/issue11203> ___
[issue11203] gzip doc is behind
K Richard Pixley added the comment:

I'm now convinced this isn't worth fixing in 2.x.

--
resolution: -> wont fix
status: open -> closed

___ Python tracker <http://bugs.python.org/issue11203> ___
[issue12021] mmap.read requires an argument
New submission from K Richard Pixley:

mmap.read requires an argument. Since most file-like objects do not, this breaks the file-like object illusion. The argument to mmap.read should be optional, presumably defaulting to the entire mmap'd area.

--
messages: 135362
nosy: rich-noir
priority: normal
severity: normal
status: open
title: mmap.read requires an argument
type: behavior
versions: Python 2.6, Python 2.7, Python 3.1, Python 3.2, Python 3.3

___ Python tracker <http://bugs.python.org/issue12021> ___
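A minimal sketch of the behavior and the workarounds available on the affected versions (on later 3.x releases the argument became optional, so the explicit forms below work everywhere):

```python
import mmap
import tempfile

with tempfile.TemporaryFile() as f:
    f.write(b'hello world')
    f.flush()
    mm = mmap.mmap(f.fileno(), 0)  # map the whole file

    # On the affected versions, mm.read() with no argument raises
    # TypeError. Workarounds: slice the mapping, or pass the length.
    data = mm[:]             # whole mapping via slicing
    also = mm.read(len(mm))  # whole mapping via an explicit length
    mm.close()
```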
[issue4489] shutil.rmtree is vulnerable to a symlink attack
K Richard Pixley added the comment:

How does "rm -rf" address this issue? Or does it? shutil.rmtree should probably do the same thing.

--
nosy: +teamnoir

___ Python tracker <http://bugs.python.org/issue4489> ___
[issue3860] GzipFile and BZ2File should support context manager protocol
K Richard Pixley added the comment:

Documentation needs to be updated to state that these are now context managers. This is important since they aren't in python-2.x. I'm not sure whether this should be added to the "new in python" blurbs.

--
nosy: +teamnoir

___ Python tracker <http://bugs.python.org/issue3860> ___
[issue11203] gzip doc is behind
New submission from K Richard Pixley:

The documentation for gzip should include the "close" method. Its use in the 2.7 documentation implies its existence, but it should also be stated explicitly that it exists. In the 3.x documentation, "close" is not shown in the examples, since the examples use the context manager. The 3.x documentation should contain both an explicit mention of the "close" method and an explicit mention that GzipFile supports the context manager protocol. Yes, the use of the context manager in the examples implies that this is true, but documentation for other modules states so explicitly, so this module should too.

--
assignee: docs@python
components: Documentation
messages: 128460
nosy: docs@python, teamnoir
priority: normal
severity: normal
status: open
title: gzip doc is behind
type: feature request
versions: Python 2.7, Python 3.3

___ Python tracker <http://bugs.python.org/issue11203> ___
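A minimal sketch of the two usages the report wants documented side by side, the explicit open/close pair and the context manager protocol (the latter available in 2.7 and 3.2+, per issue3860):

```python
import gzip
import os
import tempfile

path = os.path.join(tempfile.mkdtemp(), 'example.txt.gz')

# Explicit open/close pair: close() flushes the compressed trailer,
# so it must always be called when no context manager is used.
f = gzip.GzipFile(path, 'wb')
f.write(b'payload')
f.close()

# GzipFile also supports the context manager protocol:
with gzip.GzipFile(path, 'rb') as f:
    data = f.read()
```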
[issue13590] extension module builds fail with python.org OS X installers on OS X 10.7 and 10.6 with Xcode 4.2
K Richard Pixley added the comment:

I think a better solution than declaring it to be Apple's bug would be to release one binary for pre-10.7 (or maybe 10.6 with the current Xcode) and a different binary for 10.7 and later. This isn't an Apple "bug" in the sense that there's anything wrong, nor in the sense that they would ever "fix" it. It's simply a difference between Xcode versions. So the choices would seem to be (a) code around it or (b) release different binaries. I'm ok with either solution. I'm not sure what would be best, as I'm not sure I know all of the concerns involved.

--
___ Python tracker <http://bugs.python.org/issue13590> ___
[issue18744] pathological performance using tarfile
New submission from K Richard Pixley:

There's a problem with tarfile. Write a program to traverse the contents of a modest sized tar archive. Make sure your tar archive is compressed, then read the archive with your program.

I'm finding that letting tarfile read a compressed archive directly costs me somewhere on the order of a 60x performance penalty compared to opening the file with gzip and then passing the decompressed contents to tarfile. Programs that could take a few minutes are literally taking a few hours when using tarfile.

This seems stupid. The tarfile library could do the same thing I'm doing manually; in fact, I had assumed that it would, and was surprised by the performance I was seeing, so I ran with the profiler and saw millions of decompression calls. It's almost as though the tarfile library is decompressing the entire archive for every member extraction.

Note, you can get even worse performance if you sort the member names and then extract in that order. I'm not sure whether this "should" matter, since the tar file order is sequential.

--
components: Library (Lib)
messages: 195232
nosy: teamnoir
priority: normal
severity: normal
status: open
title: pathological performance using tarfile
type: performance
versions: Python 2.7

___ Python tracker <http://bugs.python.org/issue18744> ___
[issue18744] pathological performance using tarfile
K Richard Pixley added the comment:

New info... I see the degradation on most of the linux boxes I've tried:

* ubuntu-13.04 (raring), 64-bit
* rhel-5.4, 64-bit
* rhel-5.7, 64-bit
* suse-11, 64-bit

I see some degradation on MacOsX-10.8.4, but it's in the acceptable range, more like 2x than 60x. That is still suspicious, but not as problematic.

--
___ Python tracker <http://bugs.python.org/issue18744> ___
[issue18744] pathological performance using tarfile
K Richard Pixley added the comment:

Here's a script that tests for the problem.

--
Added file: http://bugs.python.org/file31303/tarproblem.py

___ Python tracker <http://bugs.python.org/issue18744> ___
[issue18744] pathological performance using tarfile
K Richard Pixley added the comment:

I see your point. The alternative would be to limit the size of archive that can be extracted to the size of virtual memory, which is essentially what I'm doing manually. Either way, someone will be surprised. I'm not sure which way will result in the least surprise, since I suspect that far more people will be extracting from compressed archives than will be extracting very large archives. The failure mode with limited file size seems much less frequent but also much more annoying. In comparison, the failure (and the pathological case is effectively a failure) reading compressed archives seems much more common to me, although granted, it is not a total failure.

I think this should be mentioned in the doc, because I, at least, was extremely surprised by this behavior and it cost me some time to track it down. I might suggest something along the lines of:

    Be careful when working with compressed archives. In order to support the largest file sizes possible, some approaches may result in pathological behavior causing the original archive to be decompressed, in full, many times. You should be able to avoid this behavior if you traverse the TarInfo items in file order. You might also consider decompressing the archive first, in memory, and then handing the memory copy to tarfile for processing.

--
status: pending -> open

___ Python tracker <http://bugs.python.org/issue18744> ___