[Python-Dev] Wanting to learn

2005-09-10 Thread Jason
Hi, my name is Jason and I have a great interest in programming, whether it
be Python or anything else. From my understanding, Python is written in C,
right? I am willing to do grunt work just to learn, and I am quick to catch
on given the right path to follow. Please let me know if you will let me
help in my endeavor to learn to program, or if not, maybe you could point me
in the right direction. I am eager to hear back. Thanks for your time.



Jason 



[Python-Dev] Python 3.3.4150

2014-02-28 Thread Burgoon, Jason
Good day Python Dev Team -

One of our users has reported the following:

I have installed the given MSI on 64-bit Windows as per the installation
instructions document.
One of the shortcuts, 'Start Menu\Programs\Python 3.3\Module Docs', does not
launch. When I click the shortcut, no window opens.
I have tried with both an admin user and a non-admin user.

Is this expected behavior?

Please advise and thanks for your help.

Jason

Jason Burgoon
Application Management Services - Coordinator
4333 Edgewood Rd NE, Cedar Rapids, IA 52499
Phone: (319) 355-4534
Email: [email protected]
AGT Website: http://intranet.ds.global/technology/Pages/default.aspx




Re: [Python-Dev] Another update for PEP 394 -- The "python" Command on Unix-Like Systems

2019-02-13 Thread Jason Swails
On Wed, Feb 13, 2019 at 10:26 AM Petr Viktorin  wrote:

> PEP 394 says:
>
>  > This recommendation will be periodically reviewed over the next few
>  > years, and updated when the core development team judges it
>  > appropriate. As a point of reference, regular maintenance releases
>  > for the Python 2.7 series will continue until at least 2020.
>
> I think it's time for another review.
> I'm especially worried about the implication of these:
>
> - If the `python` command is installed, it should invoke the same
>version of Python as the `python2` command
> - scripts that are deliberately written to be source compatible
>with both Python 2.x and 3.x [...] may continue to use `python` on
>their shebang line.
>
> So, to support scripts that adhere to the recommendation, Python 2
> needs to be installed :(
>

I literally just ran into this problem now.  Part of a software suite I've
written uses Python to fetch updates during the installation process.  Due
to the target audience, it needs to access the system Python (only), and
support systems as old as RHEL 5 (Python 2.4 and later, including Python
3.x in the same code base, using nothing but the stdlib).  The shebang line
was "#!/usr/bin/env python"

It's been working for years, but was only now reported broken by a user
that upgraded their Ubuntu distribution and suddenly had no "python"
executable anywhere.  But they had python3.

I suspect suddenly not having any "python" executable in a Linux system
will screw up a lot more people than just me.  The workaround was ugly.
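
(The post doesn't say what that workaround was; one well-known shape for this
kind of stub is a sh/Python polyglot along the following lines, shown purely
for illustration.)

    #!/bin/sh
    # sh runs the next line: it re-executes this same file under whichever
    # interpreter it finds, and never reaches the Python code below.
    # Python parses that same line as a harmless string-literal concatenation.
    "exec" "$(command -v python3 || command -v python2 || command -v python)" "$0" "$@"

    import sys
    sys.stdout.write("running under %s\n" % sys.version)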

I'd like to see there always be a `python` executable available if any
version of Python is installed.

Thanks,
Jason

-- 
Jason M. Swails


Re: [Python-Dev] Another update for PEP 394 -- The "python" Command on Unix-Like Systems

2019-02-14 Thread Jason Swails
 

> On Feb 14, 2019, at 3:44 AM, Antoine Pitrou  wrote:
> 
> On Thu, 14 Feb 2019 00:57:36 -0500
> Jason Swails  wrote:
>> 
>> I literally just ran into this problem now.  Part of a software suite I've
>> written uses Python to fetch updates during the installation process.  Due
>> to the target audience, it needs to access the system Python (only), and
>> support systems as old as RHEL 5 (Python 2.4 and later, including Python
>> 3.x in the same code base, using nothing but the stdlib).  The shebang line
>> was "#!/usr/bin/env python"
>> 
>> It's been working for years, but was only now reported broken by a user
>> that upgraded their Ubuntu distribution and suddenly had no "python"
>> executable anywhere.  But they had python3.
>> 
>> I suspect suddenly not having any "python" executable in a Linux system
>> will screw up a lot more people than just me.  The workaround was ugly.
> 
> I'm not sure what you mean.  Isn't the workaround to install Python 2
> in this case?

I release the software, so the problem is not my machine, it’s others’. The 
installation process also fetches a local miniconda distribution for the Python 
utilities that are part of the program suite (and the python programs are 
optional and typically not installed when this suite is deployed on a 
supercomputer, for instance). But the software needs to check for updates 
before it does any of that (hence my concern — this script needs to be able to 
run before the user does *anything* else, including installing dependencies). 

This would also be the first time we’d have to give different installation 
instructions for different versions of the same Linux distro. 

The workaround from a user’s perspective is simple for me, but I can’t make that 
same assumption for all of my users. This is an impediment to keeping the user 
experience as simple as possible. 

Thanks,
Jason


[Python-Dev] __getattribute__'s error is not available in __getattr__

2017-05-01 Thread Jason Maldonis
Hi everyone,

If this should be asked in learn python I apologize -- please just tell me
without answering.

I'm working on a large class architecture and I find myself often
overloading __getattr__.  I am continuously running into the issue where I
want __getattr__ to have access to the error that was raised in
__getattribute__, but it seems completely unavailable. Is that true?

One simple case that I'm guessing others have run into, is if __getattr__
fails, the error from __getattribute__ isn't in the stack trace that gets
printed to screen.  To fix this (on occasion) I'll even re-call
__getattribute__ within __getattr__ just to get the error so I can properly
"raise from" the __getattibute__'s error -- although that's probably bad
practice in general.
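
(A minimal sketch of the pattern described above; the class and error messages
are invented for illustration.)

    class Lazy(object):
        @property
        def value(self):
            # The informative error below is swallowed before __getattr__ runs.
            raise AttributeError("value is not ready yet (backend not connected)")

        def __getattr__(self, name):
            # __getattr__ only receives the name; the AttributeError raised
            # inside __getattribute__ is not passed along.  Re-running the
            # default lookup is the (ugly) way to recover it for "raise from".
            try:
                return object.__getattribute__(self, name)
            except AttributeError as original:
                raise AttributeError("lookup of %r failed" % name) from original

    # Lazy().value now raises a chained AttributeError whose __cause__ is the
    # original "value is not ready yet" error.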

I'd like to be able to access the error that was raised in __getattribute__
when __getattr__ is called.

Two more quick context comments: python is awesome, thank you all for your
hard work; and I've been writing python almost every day for ~ 5 years now
and I can do all the "black magic" jazz, so I'll be okay with an
implementation that requires that type of stuff if necessary.

Thanks!
Jason


[Python-Dev] Re: Make __mro_entries__ mandatory for non-types

2022-03-05 Thread Jason Madden
Steven D'Aprano wrote:
> On Sat, Mar 05, 2022 at 11:27:44AM +0200, Serhiy Storchaka wrote:
> > Currently the class can inherit from arbitrary objects, not only types.
> > Is that intentionally supported?
> I know that metaclasses do not have to be actual classes, they can be 
> any callable with the correct signature, but I didn't know that classes 
> can inherit from non-classes.

zope.interface relies on this behaviour.

py> from zope.interface import Interface
py> class IFoo(Interface):
...     def a_method():
...         """Does stuff"""
...
py> type(Interface)
<class 'zope.interface.interface.InterfaceClass'>
py> type(IFoo)
<class 'zope.interface.interface.InterfaceClass'>
py> isinstance(IFoo, type)
False
py> isinstance(Interface, type)
False


[Python-Dev] Re: Make __mro_entries__ mandatory for non-types

2022-03-05 Thread Jason Madden
Steven D'Aprano wrote:
> On Sat, Mar 05, 2022 at 04:42:55PM -0000, Jason Madden wrote:
> > zope.interface relies on this behaviour.
> > The example you give shows that Interface is a class. It merely has a 
> metaclass which is not `type`. (I presume that is what's going on 
> behind the scenes.)

I don't think that's *quite* correct. `Interface` is an ordinary object, an 
instance of `InterfaceClass`. `InterfaceClass`, despite the name, is not 
actually a metaclass — `type` isn't anywhere in the MRO. The original example 
is similar in that 1 is an instance of `int`, and `int` isn't a metaclass either.

py> InterfaceClass.__mro__
(,
 ,
 ,
 ,
 ,
 ,
 ,
 )

The simplified version of what zope.interface is doing is this:

py> class SomethingBase:
...     def __init__(self, name, bases, ns):
...         pass
py> Something = SomethingBase('Something', (), {})
py> class MySomething(Something):
...     pass
py> MySomething
<__main__.SomethingBase object at 0x100c81910>
py> isinstance(MySomething, type)
False

Contrast with a true metaclass:

py> class Meta(type):
...     pass
py> class WithMeta(metaclass=Meta):
...     pass
py> type(WithMeta)
<class '__main__.Meta'>
py> type(WithMeta).__mro__
(<class '__main__.Meta'>, <class 'type'>, <class 'object'>)

> I'm asking about the example that Serhiy shows, where a class inherits 
> from something which is not a class at all. In his example, the base is 
> 1, although it gives a TypeError. I'm asking if that sort of thing is 
> supposed to work, and if so, how?

Python accepts anything callable with the right number of arguments as a 
metaclass. If you specify it explicitly, it can be just a function:

py> def Meta(*ignored):
...     return 42
...
py> class X(metaclass=Meta):
...     pass
...
py> X
42

If you don't specify it explicitly, Python uses the type of the first base as 
the metaclass (ignoring conflicts and derived metaclasses) and calls that. In 
neither case does it have to be, or return, a class/type. We can actually write 
a class statement using a number as a base if we  define an appropriate 
`__new__` method:

py> class MyInt(int):
...     def __new__(cls, *args):
...         if len(args) == 1:
...             return int.__new__(cls, *args)
...         return int.__new__(cls, 42)
py> one = MyInt(1)
py> one
1
py> isinstance(one, int)
True
py> isinstance(one, type)
False
py> class A(one):
... pass
...
py> A
42
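
(Since the subject line concerns __mro_entries__: PEP 560's hook is the
supported way for a non-type to stand in as a base, as in this minimal sketch.)

    class Marker:
        """Not a type, but usable as a base thanks to __mro_entries__."""
        def __init__(self, real_base):
            self._real_base = real_base

        def __mro_entries__(self, bases):
            # Called at class-creation time; the returned tuple replaces this
            # object in the bases of the class being defined.
            return (self._real_base,)

    base = Marker(dict)

    class MyMapping(base):       # really inherits from dict
        pass

    assert MyMapping.__mro__[1] is dict
    assert MyMapping.__orig_bases__ == (base,)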


[Python-Dev] Restricted Entry Point from PEP-551/578

2019-11-21 Thread Jason Killen
I recently sent in a couple of PRs, accepted and merged (thanks!), that
switch to using io.open_code when appropriate.  In the process of making
those PRs I spent a bit of time reading the two related PEPs.  In PEP-551
there's a suggestion that people use a restricted entry point in production
environments.  I googled around a bit and couldn't find any evidence that
there was an existing implementation, at least not made public, that people
were using in a general sense.  So I created a branch from my fork and over
the last few days have implemented part of what's suggested in PEP-551.
Specifically, my changes remove most of the command line options, ignore
envvars (except for a possible logging filename for the audit hooks), and
register an audit hook that logs everything to the file named by that envvar
when provided, or to stderr if not.
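
(At the Python level, the registration being described looks roughly like the
sketch below; the environment variable name is made up, and the actual patch
does this in C at interpreter startup.)

    import os
    import sys

    # Open the log target once, up front, so the hook itself never triggers
    # further auditable operations such as open().
    _target = os.environ.get("PYTHONAUDITLOG")   # made-up variable name
    _stream = open(_target, "a") if _target else sys.stderr

    def _log_event(event, args):
        # Called for every auditing event raised by the interpreter and stdlib.
        _stream.write("audit: %s %r\n" % (event, args))

    sys.addaudithook(_log_event)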

Now the questions:
1) Does anybody care?  Is anyone currently doing this or planning on doing
this?
2) Do we want to provide an "official" version of a restricted entry point
that could be used as-is or easily modified per specific needs?  Seems
kinda silly to make everyone roll their own version but I'm happy to yield
to the will of the people.
3) What's the chance we wanna merge something like this into the official
master branch?  I accomplished what I wanted to do using a few #ifdef's and
some funky makefile magic.  I think it would merge easily.  Maintaining a
fork sounds like a lot of work to me.

And here's the code:
I'm very open to suggestions.  I basically have no idea what I'm doing.
I haven't touched C in about 7 years so don't expect the Mona Lisa.
https://github.com/python/cpython/compare/master...jsnklln:PEP551_restricted_entry_point

-- 
Jason Killen   [email protected]
Pain is inevitable, misery is optional.


[Python-Dev] Re: Restricted Entry Point from PEP-551/578

2019-11-21 Thread Jason Killen
Thanks.  Yea if you bring all the backchannels together at some point I
would like to be included mostly as a listener.  I totally agree about not
wanting people to think we've "solved" security.  I expected that might be
the reason this hadn't been done.  On the other hand I think some sort of
starting point or examples are very useful so we don't all reinvent the
wheel.  Your spython branch seems to be what I was looking for but couldn't
find.  I'm gonna clone it and poke around a bit.  How can we make your fork
easier to find?  Googling for spython doesn't even get you in the right
neighborhood.  Can we include a link in the PEP?

On Thu, Nov 21, 2019 at 1:09 PM Steve Dower  wrote:

> On 21Nov2019 0927, Jason Killen wrote:
> > I sent in a couple of PRs, accepted and merged (Thanks!), lately that
> > switch to using io.open_code when appropriate.  In the process of making
> > those PRs I spent a bit of time reading the two related PEPs.  In
> > PEP-551 there's a suggestion that people use a restricted entry point in
> > production environments.  I googled around a bit and couldn't find any
> > evidence that there was an existing implementation, at least not made
> > public, that people were using in a general sense.  So I created a
> > branch from my fork and over the last few days have implemented part of
> > what's suggested in PEP-551.  Specifically my changes remove most of the
> > command line options, ignore envvars (except for a possible logging
> > filename for the audit hooks), and registers an audit hook that logs
> > everything to the defined envvar when provided or stderr if not.
>
> Hi Jason. Great to see that you're interested!
>
> Right now there isn't an existing implementation, mainly because the
> value is very limited unless you also integrate into other system
> security services. Since CPython runs on so many platforms, there was no
> way for us to support all the possible combinations upstream, so instead
> we made it easy to extend so that distributors can customise it for
> their own platform (including sysadmins who want to customise for their
> own internal setups).
>
> You can watch the presentations I gave on this recently (top few links
> at https://stevedower.id.au/speaking), or read my whitepaper at
> https://aka.ms/sys.audit (which will be rolled into PEP 551 soon).
>
> I have also some samples at https://github.com/zooba/spython (that are
> shown and discussed in the whitepaper).
>
> > Now the questions:
> > 1) Does anybody care?  Is anyone currently doing this or planning on
> > doing this?
>
> Yes, quite a few people care. It's mostly being discussed with me on
> backchannels right now though, and I haven't even connected everyone
> together yet. If you're interested, I'll include you when I do?
>
> > 2) Do we want to provide an "official" version of a restricted entry
> > point that could be used as-is or easily modified per specific needs?
> > Seems kinda silly to make everyone roll their own version but I'm happy
> > to yield to the will of the people.
>
> As I said above, an "official" one can only go so far. I'd prefer to see
> the Linux distros have their own official one (e.g. if I'm on RHEL then
> I use Red Hat's restricted entry point, etc.)
>
> > 3) What's the chance we wanna merge something like this into the
> > official master branch?  I accomplished what I wanted to do using a few
> > #ifdef's and some funky makefile magic.  I think it would merge easily.
> > Maintaining a fork sounds like a lot of work to me.
>
> The fork is very minimal now that the hooks are available in core. I've
> been involved in maintaining a true fork (based on 3.6 without the
> hooks) and *that* is a lot of work, but now that it can all be done from
> the entry-point it's actually pretty simple.
>
> > And here's the code:
> > I'm very open to suggestions.  I basically have no idea what I'm doing.
> > I haven't touched C in about 7 years so don't expect the Mona Lisa.
> >
> https://github.com/python/cpython/compare/master...jsnklln:PEP551_restricted_entry_point
>
> I think this looks almost exactly like what we would merge if we were
> going to merge anything. My concern is that I think if we offer anything
> at all it will discourage people/distros from actually implementing it
> properly for their context, and so we make things worse. Making it easy
> to extend without actually doing it seems like a better place to be.
>
> And I'm totally in favour of publishing ready-to-build samples (again,
> see https://g

[Python-Dev] Re: Restricted Entry Point from PEP-551/578

2019-11-21 Thread Jason Killen
I knew the audit hooks were new but didn't realize they were quite that
new.  I didn't mean to come across as pejorative asking if people cared
about this.  The fact that I had trouble finding more information made me
think this good stuff had been left by the wayside.  It's new, so I'll pump
the brakes a little and let things take their course.

I wasn't aware of the configuration system for 3.8.  I'll look into that.
When I was in there poking around I kept wondering why we didn't have a
fancy configuration system.  The good news is we do; I just didn't know
about it.

You're completely right about my implementation logging to a file.  I fully
admit my code wasn't camera ready.  Once I learned that the hooks existed I
wanted to give them a try.  Once I got it working I wanted to get some
feedback.

Thank you for the feedback.  I'm glad somebody is thinking about these
types of things and I'd love to help if I can.

On Thu, Nov 21, 2019 at 2:49 PM Christian Heimes 
wrote:

> On 21/11/2019 18.27, Jason Killen wrote:
> > I sent in a couple of PRs, accepted and merged (Thanks!), lately that
> > switch to using io.open_code when appropriate.  In the process of making
> > those PRs I spent a bit of time reading the two related PEPs.  In
> > PEP-551 there's a suggestion that people use a restricted entry point in
> > production environments.  I googled around a bit and couldn't find any
> > evidence that there was an existing implementation, at least not made
> > public, that people were using in a general sense.  So I created a
> > branch from my fork and over the last few days have implemented part of
> > what's suggested in PEP-551.  Specifically my changes remove most of the
> > command line options, ignore envvars (except for a possible logging
> > filename for the audit hooks), and registers an audit hook that logs
> > everything to the defined envvar when provided or stderr if not.
>
> As Steve already pointed out, https://github.com/zooba/spython contains
> examples for Windows and Linux. The linux_xattr example is a PoC that I
> developed for our joint talk at EuroPython 2019.
>
> > Now the questions:
> > 1) Does anybody care?  Is anyone currently doing this or planning on
> > doing this?
> > 2) Do we want to provide an "official" version of a restricted entry
> > point that could be used as-is or easily modified per specific needs?
> > Seems kinda silly to make everyone roll their own version but I'm happy
> > to yield to the will of the people.
> > 3) What's the chance we wanna merge something like this into the
> > official master branch?  I accomplished what I wanted to do using a few
> > #ifdef's and some funky makefile magic.  I think it would merge easily.
> > Maintaining a fork sounds like a lot of work to me.
>
> Yes, we still care. Steve has been more active on the Windows side than
> me on the Linux side. Please keep in mind that the feature is new to
> Python 3.8 and 3.8.0 just came out a little more than a month ago. It's
> going to take a while until major vendors will make use of the feature.
>
> PEP 578 features require distribution and vendor-specific adjustments.
> For example it makes sense to disable byte code compilation and usage on
> Windows. On the other hand RPM based platforms like RHEL always want to
> use precompiled byte code because PYC files are shipped with RPM packages.
>
> > And here's the code:
> > I'm very open to suggestions.  I basically have no idea what I'm doing.
> > I haven't touched C in about 7 years so don't expect the Mona Lisa.
> >
> https://github.com/python/cpython/compare/master...jsnklln:PEP551_restricted_entry_point
>
> Thanks for your work! Prototypes like this are useful to figure out how
> Python's configuration should be improved. Eventually it should not be
> necessary to modify any code in CPython to create an spython
> interpreter. I like to reach a state in which it is possible to
> configure all these flags by providing a custom configuration. Victor's
> new config system in 3.8 can accomplish almost everything. It is
> currently not possible to modify argument parsing, though.
>
> One comment about your implementation:
> You are logging to a file. It's not very useful to log to a target in
> the same security context except for debugging. If an attacker is able
> to compromise the interpreter, then the attacker most likely gains
> enough privileges to wipe the file. That's why Steve uses the Windows
> event log in his examples and I'm going for syslog and journald. These
> logging systems run as a different user and log fil
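
(For reference, the syslog variant Christian describes can be sketched on Unix
roughly as follows; illustrative only, not taken from the spython repo.)

    import sys
    import syslog

    syslog.openlog("spython", syslog.LOG_PID, syslog.LOG_AUTH)

    def _audit_to_syslog(event, args):
        if event.startswith("syslog."):
            return  # avoid recursing if the syslog calls themselves raise events
        # syslogd/journald run outside the interpreter's security context, so
        # a compromised process cannot simply truncate its own audit trail.
        syslog.syslog(syslog.LOG_INFO, "%s %r" % (event, args))

    sys.addaudithook(_audit_to_syslog)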

[Python-Dev] Re: Restricted Entry Point from PEP-551/578

2019-11-21 Thread Jason Killen
I'm good, not discouraged.  Thank you for the explanation; I've got my
bearings now.  I will try and figure out what's missing with the new config
system.  If you have tips or reading material or anything else I should
know, just send it on; otherwise I'll start googling.

On Thu, Nov 21, 2019 at 4:30 PM Christian Heimes 
wrote:

> On 21/11/2019 21.19, Jason Killen wrote:
> > I knew the audit hooks were new but didn't realize they were quite that
> > new.  I didn't mean to come across as pejorative asking if people cared
> > about this.  The fact that I had trouble finding more information made
> > me think this good stuff had been left by the wayside.  It's new, I'll
> > to pump the brakes a little and let things take their course.
> >
> > I wasn't aware of the configuration system for 3.8.  I'll look into
> > that.  When I was in there poking around I kept wondering why we didn't
> > have a fancy configuration system.  The good news is we do I just didn't
> > know about it.
> >
> > You're completely right about my implementation logging to a file.  I
> > fully admit my code wasn't camera ready.  Once I learned that the hooks
> > existed I wanted to give them a try.  Once I got it working I wanted to
> > get some feedback.
> >
> > Thank you for the feedback.  I'm glad somebody is thinking about these
> > types of things and I'd love to help if I can.
>
> I'm sorry, I wasn't trying to stop you or discourage you in any way. It
> was my intent to provide an explanation why there is not much adoption
> of the new hooks yet. Please go ahead, put the paddle to the metal and
> play around with the new features!
>
> For example you could look into the new config system and figure out
> what is missing to build an spython interpreter.
>
> Christian


-- 
Jason Killen   [email protected]
Pain is inevitable, misery is optional.


[Python-Dev] Re: Restricted Entry Point from PEP-551/578

2019-11-22 Thread Jason Killen
I did a quick hack up of letting configs control what command line options
were available.  I'm not sure y'all wanted it but here it is.  I'm happy to
take suggestions including tossing the whole thing and chalking it up to
experience.
It's lightly tested with the syslog implementation from the spython repo.

https://github.com/python/cpython/compare/master...jsnklln:cmdline_options_controled_by_config

On Thu, Nov 21, 2019 at 9:36 PM Terry Reedy  wrote:

> On 11/21/2019 4:46 PM, Steve Dower wrote:
> > (though some won't be raised until 3.8.1... we should probably mark
> > those, or at least update that page to warn that events may have been
> > added over time).
>
> I included this in a new audit doc issue.
> https://bugs.python.org/issue38892
>
> --
> Terry Jan Reedy


-- 
Jason Killen   [email protected]
Pain is inevitable, misery is optional.


[Python-Dev] 2.6.3 unittest change breaks nose (issue 6418)

2009-07-05 Thread jason pellerin
Bringing python-dev into the discussion at Barry's request. The
summary is that a recent change to unittest.TestProgram breaks nose by
moving self.testRunner initialization from it's old home in
TestProgram.runTests to TestProgram.__init__. The very small patch
attached to the ticket moves it back to runTests.

Here's the ticket: http://bugs.python.org/issue6418
And a link to the testing in python list discussion:

http://lists.idyll.org/pipermail/testing-in-python/2009-July/002032.html

JP (primary author of nose)




Re: [Python-Dev] PyPI comments and ratings, *really*?

2009-11-12 Thread Jason Baker
On Thu, Nov 12, 2009 at 8:06 AM, Jesse Noller  wrote:
> On Thu, Nov 12, 2009 at 8:38 AM, Steven D'Aprano  wrote:
>> On Thu, 12 Nov 2009 08:44:32 pm Ludvig Ericson wrote:
>>> Why are there comments on PyPI? Moreso, why are there comments which
>>> I cannot control as a package author on my very own packages? That's
>>> just absurd.
>>
>> No, what's absurd is thinking that the act of publishing software
>> somehow gives you the right to demand control over what others say
>> about your software.
>>
>> I don't suppose that this rant of yours has something to do with the
>> comment posted today?
>
> Frankly, I agree with him. As implemented, I *and others* think this
> is broken. I've taken the stance of not publishing things to PyPi
> until A> I find the time to contribute to make it better or B> It
> changes.

I'm not sure I see the utility of ratings, but I think comments can be
useful as long as they don't carry over from release to release.  For
instance, suppose there's a bug in my package and someone leaves a
comment about it.  I don't want that comment still hanging around 3
years after I've already fixed the bug.


Re: [Python-Dev] PyPI comments and ratings, *really*?

2009-11-12 Thread Jason Baker
On Thu, Nov 12, 2009 at 10:19 AM, Antoine Pitrou  wrote:
> (more seriously, the problem with a comment system is that once it takes off,
> you need a whole array of functionalities to maintain a good S/N ratio. Just
> allowing people to "comment" without any sort of moderation, filtering or
> community building doesn't work)

Why not allow ratings on comments as well?


Re: [Python-Dev] PyPI governance

2009-11-13 Thread Jason Baker
On Fri, Nov 13, 2009 at 6:44 PM, Chris Withers  wrote:
> PS: While I'm sure a lot of python-dev people are interested in this topic,
> I'm pretty sure this whole huge sprawling thread belongs on catalog-sig...

+100


Re: [Python-Dev] [Distutils] At least one package management tool for 2.7

2010-03-24 Thread Jason Baker
On Wed, Mar 24, 2010 at 12:53 PM, Darren Dale  wrote:

> On Wed, Mar 24, 2010 at 1:19 PM, Ian Bicking  wrote:
> > On Wed, Mar 24, 2010 at 7:27 AM, Olemis Lang  wrote:
> >> My experience is that only `install_requires` is needed (unless you
> >> want to create app bundles AFAICR) , but in practice I've noticed that
> >> *some* easy_installable packages are not pip-able (though I had no
> >> time to figure out why :-/ )
> >
> > Usually this is because Setuptools is poking at objects to do its
> > work, while pip tries to work mostly with subprocesses.  Though to
> > complicate things a bit, pip makes sure the Setuptools monkeypatches
> > to distutils are applied, so that it's always as though the setup.py
> > says "from setuptools import setup".  easy_install *also* does this.
> >
> > But then easy_install starts calling methods and whatnot, while pip just
> does:
> >
> >  setup.py install --single-version-externally-managed --no-deps
> > --record some_tmp_file
> >
> > The --no-deps keeps Setuptools from resolving dependencies
>
> Seeking clarification: how can pip recursively install dependencies
> *and* keep Setuptools from resolving dependencies?
>
>
Using the --no-deps option to setup.py


[Python-Dev] Document performance requirements?

2006-07-21 Thread Jason Orendorff
On 7/21/06, Nick Coghlan <[EMAIL PROTECTED]> wrote:
> However, I'm also struggling to think of a case other than list vs deque where
> the choice of a builtin or standard library data structure would be dictated
> by big-O() concerns.

OK, but that doesn't mean the information is unimportant.  +1 on
making this something of a priority.  People looking for this info
should find it in the obvious place.  Some are unobvious. (How fast is
dict.__eq__ on average? Worst case?)

-j


Re: [Python-Dev] Caching float(0.0)

2006-09-29 Thread Jason Orendorff
On 9/29/06, Fredrik Lundh <[EMAIL PROTECTED]> wrote:
> (I just checked the program I'm working on, and my analysis tells me
> that the most common floating point value in that program is 121.216,
> which occurs 32 times.  from what I can tell, 0.0 isn't used at all.)

*bemused look*  Fredrik, can you share the reason why this number
occurs 32 times in this program?  I don't mean to imply anything by
that; it just sounds like it might be a fun story.  :)

Anyway, this kind of static analysis is probably more entertaining
than relevant.  For your enjoyment, the most-used float literals in
python25\Lib, omitting test directories, are:

1e-006: 5 hits
4.0: 6 hits
0.05: 7 hits
6.0: 8 hits
0.5: 13 hits
2.0: 25 hits
0.0: 36 hits
1.0: 62 hits

There are two hits each for -1.0 and -0.5.

In my own Python code, I don't even have enough float literals to bother with.
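
(A rough way to reproduce this kind of count with the present-day stdlib; a
sketch, not the script actually used.)

    import collections
    import pathlib
    import tokenize

    def count_float_literals(root):
        counts = collections.Counter()
        for path in pathlib.Path(root).rglob("*.py"):
            try:
                with tokenize.open(path) as f:
                    for tok in tokenize.generate_tokens(f.readline):
                        s = tok.string.lower()
                        # Crude float test: decimal point or exponent, not hex,
                        # not a complex literal.
                        if (tok.type == tokenize.NUMBER and not s.startswith("0x")
                                and ("." in s or "e" in s) and not s.endswith("j")):
                            counts[s] += 1
            except (UnicodeDecodeError, SyntaxError, tokenize.TokenError):
                pass  # skip files that do not tokenize cleanly
        return counts

    # count_float_literals("Lib").most_common(10)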

-j


Re: [Python-Dev] PEP 355 status

2006-10-02 Thread Jason Orendorff
On 9/30/06, Giovanni Bajo <[EMAIL PROTECTED]> wrote:
> Guido van Rossum wrote:
> > OK. Pronouncement: PEP 355 is dead. [...]
>
> It would be terrific if you gave us some clue about what is
> wrong in PEP355, [...]

Here are my guesses.  I believe Guido rejected this PEP for a lot of reasons.

By the way, what I'm about to do is known as "channeling Guido
(badly)" and I'm pretty sure it annoys him.  Sorry, Guido.  Please
don't treat the following as authoritative; I have never met Guido and
obviously I cannot speak for him.

- I don't think Guido ever saw much benefit from "path objects".  That
is, the Motivation was not compelling.  I think the main motivation is
to eliminate some clutter and add a handful of useful methods to the
stdlib, so it's easy to see how this could be the case.

- Guido just flat-out didn't like the looks of the PEP.  Too much
weirdness.  (path.py contains more weirdness, including some stuff
Guido particularly disliked, and I think it's fair to say that PEP355
suffered somewhat by association.)

- Any proposal to add a Second Way To Do It has to meet a very high
standard.  PEP355 was too big to be considered an incremental change.
Yet it didn't even attempt to fix all the perceived problems with the
existing APIs.  A more thorough job would have had a better chance.

- Nobody liked the API design--too many methods.

- Now we're hearing rumors of better ideas out there, which comes as a relief.

I suspect any one of these could have scuttled the proposal.

-j


Re: [Python-Dev] PyFAQ: thread-safe interpreter operations

2006-11-27 Thread Jason Orendorff
Way back on 11/22/06, "Martin v. Löwis" <[EMAIL PROTECTED]> wrote:
> Nick Coghlan schrieb:
> > Martin v. Löwis wrote:
> >> I personally consider it "good style" to rely on implementation details
> >> of CPython;
> >
> > Is there a 'do not' missing somewhere in there?
>
> No - I really mean it. I can find nothing wrong with people relying on
> reference counting to close files, for example. It's a property of
> CPython, and not guaranteed in other Python implementations - yet it
> works in a well-defined way in CPython. Code that relies on that feature
> is not portable, but portability is only one goal in software
> development, and may be irrelevant for some projects.

It's not necessarily future-portable either.  Having your software not
randomly break over time is relevant for most nontrivial projects.

> Similarly, it's fine when people rely on the C type "int" to have
> 32-bits when used with gcc on x86 Linux.

Relying on behavior that's implementation-defined in a particular way
for a reason (like int being 32 bits on 32-bit hardware) is one thing.
 Relying on behavior that even the implementors might not be
consciously aware of (or consider important to retain across versions)
is another.
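
(For concreteness, the two styles being contrasted; the file name is made up.)

    # Leans on CPython's reference counting: the temporary file object is
    # closed as soon as it is collected, which is prompt in CPython but is an
    # implementation detail rather than a language guarantee.
    data = open("settings.cfg").read()

    # Portable style: closing is guaranteed regardless of the implementation.
    f = open("settings.cfg")
    try:
        data = f.read()
    finally:
        f.close()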

-j


Re: [Python-Dev] PyFAQ: thread-safe interpreter operations

2006-11-27 Thread Jason Orendorff
On 11/27/06, Aahz <[EMAIL PROTECTED]> wrote:
> On Mon, Nov 27, 2006, Jason Orendorff wrote:
> > Way back on 11/22/06, "Martin v. Löwis" <[EMAIL PROTECTED]> wrote:
> >> [...] I can find nothing wrong with people relying on
> >> reference counting to close files, for example. It's a property of
> >> CPython, and not guaranteed in other Python implementations - yet it
> >> works in a well-defined way in CPython. [...]
> >
> > [Feh.]
>
> We recently had this discussion at my day job.  We ended up agreeing
> that using close() was an encouraged but not required style, because to
> really avoid breakage we'd have to go with a full-bore try/except style
> for file handling, and that would require too many changes (especially
> without upgrading to 2.5, and we're still using 2.2/2.3).

Well, CPython's refcounting is something Python-dev is
(understatement) very conscious of.  I think I've even heard
assurances that it won't change Any Time Soon.  But this isn't the
case for every CPython implementation detail.  Remember what brought
all this up.  If it's obscure enough that Fredrik Lundh has to ask
around, I wouldn't bet the ranch on it.

-j


Re: [Python-Dev] Encouraging developers

2007-03-06 Thread Jason Orendorff
On 3/5/07, A.M. Kuchling <[EMAIL PROTECTED]> wrote:
> Any ideas for fixing this problem?

The current developer FAQ says:

  2.4   How can I become a developer?
  There's only one way to become a developer, and that's through
  the School of Hard Knocks.
  http://mail.python.org/pipermail/python-dev/2002-September/028725.html

That's a little glib.  And maybe inaccurate.  That message (by Raymond
Hettinger and probably not originally intended to be the first thing
developers-aspirant see) seems at odds with Martin's characterization,
in this thread, of Raymond's own experience.

I would submit a doc patch, but what's the use.  ;)

I should be explicit-- I greatly admire the python-dev community and
its processes.  I don't get the feeling much happens in private
e-mail.  Quite the opposite: it feels like important decisions are
regularly made on python-dev.  I don't think it's hard to contribute.
I don't think the stdlib is a huge mess of brokenness.  And I don't
think the community is either.

-j


[Python-Dev] [ 1669539 ] Change (fix!) os.path.isabs() semantics on Win32

2007-03-07 Thread Jason Orendorff
On 3/7/07, "Martin v. Löwis" <[EMAIL PROTECTED]> wrote:
> Terry Jones schrieb:
> > I do think the behavior can be improved, and that it should be fixed, but
> > at a place where other incompatible changes will also be being made,
>
> Indeed, 2.6 is such a place. Any feature release can contain
> incompatible behavior, and any feature release did contain incompatible
> behavior. Just look at the "porting to" sections of past whatsnew files.

While we're at it, patch 1669539 makes a similar incompatible change
to ntpath.isabs().  On Windows there are:

  - true relative paths, like Lib\ntpath.py
  - true absolute paths, like C:\Python25 and \\server\share
  - oddities, like C:ntpath.py and \Python25

isabs() is inconsistent about oddities:

  >>> ntpath.isabs(r'C:ntpath.py')
  False
  >>> ntpath.isabs(r'\Python25')
  True

I don't think there's any logic behind this behavior.  The current
documentation for isabs() is:

  isabs(path)
Return True if path is an absolute pathname (begins with a slash).

The patch makes isabs(oddity) return False.

I don't think existing code is a huge concern here.  Google Code
Search suggests that no one thinks about the oddities.  Most existing
code using isabs() has acceptable-but-slightly-odd behavior for
oddities, and that kind of code would have different
acceptable-but-slightly-odd behavior under the proposed change.  And
oddities are rare.

The patch is incomplete (no docs) but ripe for a note of encouragement
(or summary rejection) from a committer.

-j


Re: [Python-Dev] Patch 1644818: Allow importing built-in submodules

2007-03-12 Thread Jason Orendorff
On 3/12/07, Miguel Lobo <[EMAIL PROTECTED]> wrote:
> Anyway, I'm intrigued about this "review 5 other patches" procedure you
> suggest.  What exactly would be involved in such a review?  Please note that
> I hadn't touched CPython code before I wrote my patch and I haven't been
> following CPython development closely.

Hi Miguel,

This is how we suck you in...  ;)

You don't have to be an expert to review patches.  The following
procedure would qualify you:

1.  Find a patch that it appears no one has ever touched (0 comments,
assigned to nobody, etc.)

2.  Pretty much every patch should include a unit test and
documentation.  If something is missing from the patch you're looking
at, post a comment that says "Incomplete, no docs/tests".

3.  Repeat until you've commented on five patches.

If you find such clerical work beneath you, you can go further--build
Python from source, apply patches, and verify that they work.  It's
not hard (google "python developer faq").  But it's not required.

-j


[Python-Dev] minidom and DOM level 2

2007-03-22 Thread Jason Orendorff
The lib ref claims that minidom supports DOM Level 1.  Does anyone
know what parts of Level 2 are not implemented?  I wasn't able to find
anything offhand.  It seems to be more a matter of what's not
documented, or what's not covered by the regression tests.

So.  I'd be happy to do some diffing between the implementation,
documentation, tests, and the Recommendation, and submit patches for
whatever needs it.  If anyone thinks that's worthwhile.  Anyone?

-j


Re: [Python-Dev] minidom and DOM level 2

2007-03-23 Thread Jason Orendorff
On 3/23/07, "Martin v. Löwis" <[EMAIL PROTECTED]> wrote:
> Jason Orendorff schrieb:
> > The lib ref claims that minidom supports DOM Level 1.  Does anyone
> > know what parts of Level 2 are not implemented?  I wasn't able to find
> > anything offhand.
>
> I now looked at it closely, and the only thing missing from DOM Level
> 2 Core (that I could find) is the EntityReference interface, and
> Document::createEntityReference. I'm not sure what semantics goes with it.

OK, I think this is worthwhile then.  :)  I'll read the spec and submit
a patch.

-j


[Python-Dev] Hindsight on Py_UNICODE_WIDE?

2007-03-23 Thread Jason Orendorff
Scheme is adding Unicode support in an upcoming standard:
(DRAFT) http://www.r6rs.org/document/lib-html/r6rs-lib-Z-H-3.html

I have two questions for the python-dev team about Python's Unicode
experiences.  If it's convenient, please take a moment to reply.
Thanks in advance.

1.  In hindsight, what do you think about PEP 261, the Py_UNICODE_WIDE
build option?  On balance, has this been good, bad, or indifferent?
What's good/bad about it?

2.  The idea of multiple string representations has come up (that is,
where all strings are Unicode, but in memory some are 8-bit, some
16-bit, and some 32-bit--each string uses the narrowest possible
representation).  This has been discussed here for Python 3000.  My
question is:  Is this for real?  How far along is it?  How likely is
it?

Thanks,
Jason


Re: [Python-Dev] minidom and DOM level 2

2007-04-07 Thread Jason Orendorff
On 4/7/07, Andrew Clover <[EMAIL PROTECTED]> wrote:
> Jason Orendorff wrote:
> > OK, I think this is worthwhile then. :) I'll read the spec and submit
> > a patch.
>
> You're planning to implement EntityReference in minidom? That'll be fun!
> :-) One of the nastier corners of DOM and XML in general.

Mmm.  So I'm finding.  EntityReferences seem to force detailed knowledge
of entity handling into the DOM implementation.  expat doesn't expose
the information in a particularly helpful way.  In a word, blaargh.

I'd be happy to set this aside and work on Level 1 compliance:

> Incidentally minidom falls far short of passing even Level 1 Core for
> more reasons than omission of EntityReference. I noted the main known
> problems with it here:
>http://pyxml.sourceforge.net/topics/compliance.html

Very nice.  Thanks for posting this.  I don't suppose you'd be willing to
update it for Python 2.5, would you?

Martin, have you looked at this?

Some of these might be hard to fix, given expat.

-j


Re: [Python-Dev] minidom and DOM level 2

2007-04-07 Thread Jason Orendorff
On 4/7/07, "Martin v. Löwis" <[EMAIL PROTECTED]> wrote:
> In any case, the *claim* certainly is that minidom supports
> level 2 core. Any proof to the contrary indicates a bug;
> patches are welcome.

OK-- I'll work on this.  I can fix the easy ones, anyway.

-j


Re: [Python-Dev] minidom and DOM level 2

2007-04-13 Thread Jason Orendorff
On 4/13/07, Andrew Clover <[EMAIL PROTECTED]> wrote:
> Jason Orendorff wrote:
> > I don't suppose you'd be willing to update it for Python 2.5, would you?
>
> Can do, but at this point I'm not aware of any work having been done on
> the issues listed there between the 2.3 and 2.5 releases.

I've been running the DOM test suite against trunk, using your test
harness.  It's kind of alarming at first that over 100 tests fail.  :)
But many of the failures involve entity references.

An even larger portion involve error cases: we accept things we should
check and reject.  For example, doc.createElement('\t') should fail.
These are certainly bugs, and they're easy to fix.  I'm working
through them.

> The danger is people may be used to the "wrong" minidom behaviours,
> given they have been static for so long and are in many cases central to
> how minidom works.

When I get to these, I'll post about it.

-j


[Python-Dev] minidom -> new-style classes?

2007-04-17 Thread Jason Orendorff
I'm working on minidom's DOM Level 1 compliance, targeting Python 2.6.
We have some bugs involving DOM property behavior.  For example,
setting the nodeValue attribute of an Element is supposed to have no
effect.  We don't implement this.

The right way to implement these quirks is using new-style classes and
properties.  Right now minidom uses old-style classes and lots of
hackery, and it's pretty broken.  (Another example--there is an
Attr._set_prefix method, but it is *not* called from __setattr__.)
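
(A minimal sketch of the property-based approach; not the actual minidom code.)

    class Element(object):
        """Sketch only: the DOM says writes to an Element's nodeValue are ignored."""

        @property
        def nodeValue(self):
            # Per the DOM spec, nodeValue of an Element is always null.
            return None

        @nodeValue.setter
        def nodeValue(self, value):
            # Silently ignore assignment instead of rebinding the attribute.
            pass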

Surely nobody is subclassing these classes.  You don't subclass DOM
interfaces--the DOM doesn't work that way.  So this change should be
OK.  Right?

-j


Re: [Python-Dev] minidom -> new-style classes?

2007-04-18 Thread Jason Orendorff
On 4/17/07, Guido van Rossum <[EMAIL PROTECTED]> wrote:
> Perhaps a rewrite could target 3.0 and 2.6 could use a backported
> version of this *if* py3k compatibility mode is enabled? I'd love to
> see at least the 3.0 version cleaned up.

A lot of these bugs can be fixed without forking.  I've been
conservative so far.  I looked at a diff this morning.  Even
pickled documents shouldn't break.

If this patch lands, I can look at further cleanup after that.

Thanks everyone,
-j


[Python-Dev] Patch reviews and request

2007-04-23 Thread Jason Orendorff
OK, here's the patch I'd like to direct attention to:

http://python.org/sf/1704134
[ 1704134 ] minidom Level 1 DOM compliance
  This is only the first step toward DOM Level 1 compliance.  It fixes
  the stuff that's easy to fix.

Here are the patch reviews.  I put more detailed comments in the SF
tracker.

http://python.org/sf/1704547
[ 1704547 ] Use MoveFileEx() to implement os.rename() on windows
  -1.  This changes the documented behavior of a commonly used
  function.

http://python.org/sf/1678345
[ 1678345 ] A fix for the bug #1528074 [warning: quite slow]
  This can be rejected.

http://python.org/sf/1673007
[ 1673007 ] urllib2 requests history + HEAD support
  urllib2.urlopen() object seems like the wrong place for history to
  be attached.  These objects are pretty ephemeral, in my code anyway.
  Patch is extremely rough.  -1.

http://python.org/sf/1665292
[ 1665292 ] Datetime enhancements
  The patch here adds __int__ and __float__ to datetime.timedelta.
  I'm -1 on that.  It also implements > < == comparison between
  timedelta objects and numbers, which is right out--they don't have
  compatible hash codes.

http://python.org/sf/1652328
[ 1652328 ] stream writing support in wave.py
  This can be rejected.  (Update: After my comments on this one, Neal
  closed it.)

The following patches look good, but I didn't attempt to run them.  I
just read the source code.

http://python.org/sf/1669481
[ 1669481 ] subprocess: Support close_fds on Win32
  Looks good, and a definite +1.

http://python.org/sf/1704621
[ 1704621 ] interpreter crash when multiplying large lists
  Yep, it crashes.  Patch looks good.  +1.

http://python.org/sf/1692664
[ 1692664 ] warnings.py gets filename wrong for eval/exec
  Nice to have.  +1.

http://python.org/sf/1676135
[ 1676135 ] Remove trailing slash from --prefix
  Also nice to have.  +0.

-j


Re: [Python-Dev] Wither PEP 335 (Overloadable Boolean Operators)?

2007-05-19 Thread Jason Orendorff
On 5/18/07, Guido van Rossum <[EMAIL PROTECTED]> wrote:
> While reviewing PEPs, I stumbled over PEP 335 ( Overloadable Boolean
> Operators) by Greg Ewing.

-1.  "and" and "or" affect the flow of control.  It's a matter
of taste, but I feel the benefit is too small here to add
another flow-control quirk.  I like that part of the language
to be simple.
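
(To make the flow-of-control point concrete:)

    def probe(label, value):
        print("evaluated " + label)
        return value

    probe("left", False) and probe("right", True)   # prints only "evaluated left"
    probe("left", True) or probe("right", True)     # prints only "evaluated left"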

Anyway, if this *is* done, logically it should cover
"(... if ... else ...)" as well.  Same use cases.

-j


Re: [Python-Dev] Order of operations

2007-08-30 Thread Jason Orendorff
On 8/29/07, Dirkjan Ochtman <[EMAIL PROTECTED]> wrote:
> Alexandre Vassalotti wrote:
> > C doesn't have an exponentiation operator. You use the pow() function, 
> > instead:
>
> Wouldn't it make more sense, then, to have unary +/- have higher
> precedence than the ** operator, so that -3**2 == 9?

No, that would have been really bad.  Anyone who's had high school
algebra expects -x**2 to be -(x**2) and not (-x)**2.

I think the weirdness comes from parsing -a/b as (-a)/b rather than
-(a/b).  It should be the latter, if compatibility with math notation
is more important than compatibility with C.   Oh well.  Maybe in
Python 4.  :)
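
(Concretely, checked against a current CPython:)

    >>> -3 ** 2      # parsed as -(3 ** 2): ** binds tighter than unary minus
    -9
    >>> (-3) ** 2
    9
    >>> -7 // 2      # parsed as (-7) // 2: unary minus binds tighter than //
    -4
    >>> -(7 // 2)
    -3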

-j


Re: [Python-Dev] Removing the GIL (Me, not you!)

2007-09-12 Thread Jason Orendorff
On 9/12/07, "Martin v. Löwis" <[EMAIL PROTECTED]> wrote:
> Now we are getting into details: you do NOT have to lock
> an object to modify its reference count. An atomic
> increment/decrement operation is enough.

One could measure the performance hit incurred by using atomic
operations for refcounting by hacking a few macros -- right?

Deferred reference counting (DRC for short) might help...
http://www.memorymanagement.org/glossary/d.html#deferred.reference.counting

I can explain a little more how this works if anyone's interested.
DRC basically eliminates reference counting for locals--that is,
pointers from the stack to an object.  An object becomes refcounted
only when some other object gets a pointer to it.  The drawback is
that destructors aren't called quite as promptly as in true
refcounting.  (They're still called in the right order,
though--barring cycles, an object's destructor is called before its
children's destructors.)

What counts as "stack" is up to the implementation; typically it means
"the C stack".  This could be used to eliminate most refcounting in C
code, although listobject.c would keep it.  The amount of per-platform
assembly code needed is surprisingly small (and won't change, once
you've written it--the Tamarin JavaScript VM does this).

You could go further and treat the Python f_locals and interpreter
stack as "stack". I think this would eliminate all refcounting in the
interpreter.  Of course, it complicates matters that f_locals is
actually an object visible from Python.

Just a thought, not a demand, please don't flame me,
-j


Re: [Python-Dev] Removing the GIL (Me, not you!)

2007-09-13 Thread Jason Orendorff
On 9/13/07, Justin Tulloss <[EMAIL PROTECTED]> wrote:
> 1. Use message passing and transactions.  [...]
> 2. Do it perl style. [...]
> 3. Come up with an elegant way of handling multiple python processes. [...]
> 4. Remove the GIL, use transactions for python objects, [...]

The SpiderMonkey JavaScript engine takes a very different approach,
described here:
http://developer.mozilla.org/en/docs/SpiderMonkey_Internals:_Thread_Safety

The SpiderMonkey C API threading model should sound familiar:  C code
can assume that simple operations, like dictionary lookups, are atomic
and thread-safe.  C code must explicitly JS_SuspendRequest() before
doing blocking I/O or number-crunching (just like
Py_BEGIN_ALLOW_THREADS).  The main difference is that SpiderMonkey's
"requests" are not mutually exclusive, the way the GIL is.

SpiderMonkey does fine-grained locking for mutable objects to avoid
race conditions.  The clever bit is that SpiderMonkey's per-object
locking does *not* require a context switch or even an atomic
instruction, in the usual case where an object is *not* shared among
threads.  (Programs that embed SpiderMonkey therefore run faster if
they manage to ensure that threads share relatively few mutable
objects.  JavaScript doesn't have modules.)

Suppose Python went this route.  There would still have to be a
"stop-the-world" global lock, because the cycle collector won't work
if other threads are going about changing pointers.  (SpiderMonkey's
GC does the same thing.)  Retaining such a lock has another advantage:
this change could be completely backward-compatible to extensions.
Just use this global lock as the GIL when entering a non-thread-safe
extension (all existing extensions would be considered
non-thread-safe).

This means non-thread-safe extensions would be hoggish (but not much
worse than they are already!).  Making an existing extension
thread-safe would require some thought, but it wouldn't be terribly
hard.  In the simplest cases, the extension writer could just add a
flag to the type saying "ok, I'm thread-safe".

Refcounting is another major issue.  SpiderMonkey uses GC instead.
CPython would need to do atomic increfs/decrefs.  (Deferred
refcounting could mitigate the cost.)

The main drawback (aside from the amount of work) is the patent.
SpiderMonkey's license grants a worldwide, royalty-free license, but
not under the Python license.  I think this could be wrangled, if the
technical approach looks worthwhile.

-j


Re: [Python-Dev] [poll] New name for __builtins__

2007-11-29 Thread Jason Orendorff
On Nov 29, 2007 11:54 AM, Guido van Rossum <[EMAIL PROTECTED]> wrote:
> But then I thought, what if we renamed the __builtin__ module instead
> to builtins, and left __builtins__ alone?

Hmm.  __builtins__ is a magic hook, but __builtin__-the-module isn't
the thing it hooks, exactly, not the way __import__ hooks import or
__iter__ hooks iter().  Really the __builtin__ module *implements* the
__builtins__ hook protocol.  It would be cool to have a name for
__builtin__ the module that suggests that.

I suggest sys.builtins.  The builtins module feels both central enough
and magical enough to belong in sys.  And a lot of other stuff in sys
has the same "it's fun but slightly crazy to tweak this knob" vibe.
And, for sandboxers, mysandbox.builtins seems like a nice parallel to
sys.builtins, with "sys" serving the bonus role of suggesting
"unrestricted access".

-j


[Python-Dev] How to change path at compile time?

2008-01-11 Thread Jason Garber
Hello,

Is there any reasonable way to change the default sys.path at compile
time?  (ie. add a directory).

(I am aware of $PYTHONPATH for runtime)

--
Best Regards,

Jason Garber
Senior Systems Engineer
IonZoft, Inc.

(814) 941-2390
[EMAIL PROTECTED]



Re: [Python-Dev] __eq__ vs hash

2008-04-04 Thread Jason Orendorff
On Fri, Apr 4, 2008 at 9:38 AM, Guido van Rossum <[EMAIL PROTECTED]> wrote:
>  What specific code breaks? Maybe we need to turn this into a warning
>  in order to be more backwards compatible?

I looked at Mercurial.

It doesn't use __hash__ at all.  It uses __eq__ in two files, three total uses:
http://hg.intevation.org/mercurial/crew/file/6c4e12682fb9/mercurial/commands.py
http://hg.intevation.org/mercurial/crew/file/6c4e12682fb9/mercurial/context.py

-j


Re: [Python-Dev] ',' precedence in documentation

2008-09-24 Thread Jason Orendorff
What I really want is for the need to be less common.  What if assert
recognized certain commonly used expression types and actually
generated appropriate error messages?

  >>> assert foo.answer == 42
  AssertionError: expected foo.answer == 42; actual: 'a suffusion of yellow'

Maybe that's too magical.  :(  Failing that, I wish the message could
look sort of like a comment.

assert cond || message

Yes, I know that symbol is used for something else in other languages...
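 
For contrast, here is the existing comma spelling and the precedence trap
that gives this thread its title (a parenthesized condition-and-message
pair is a tuple, so it is always true):

    x = 43

    # The form everyone writes today:
    assert isinstance(x, int), "expected an int, got %r" % type(x)

    # The trap: wrapping the pair in parentheses makes it a 2-tuple,
    # which is truthy, so this assertion can never fire.
    assert (x == 42, "expected 42, got %r" % x)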

-j


Re: [Python-Dev] opcode dispatch optimization

2008-12-31 Thread Jason Orendorff
On Wed, Dec 31, 2008 at 11:44 AM, Christian Heimes  wrote:
> The patch makes use of a GCC feature where labels can be used as values:
> http://gcc.gnu.org/onlinedocs/gcc/Labels-as-Values.html . I didn't know
> about the feature and got confused by the unary && operator.

Right.  SpiderMonkey (Mozilla's JavaScript interpreter) does this, and
it was good for a similar win on platforms that use GCC.  (It took me
a while to figure out why it was so much faster, so I think this patch
would be better with a few very specific comments!)

SpiderMonkey calls this optimization "threaded code" too, but this
isn't the standard meaning of that term. See:
http://en.wikipedia.org/wiki/Threaded_code

-j


[Python-Dev] Another Anonymous Block Proposal

2005-04-27 Thread Jason Diamond
It would be ideal if we could even lose the "do" keyword. I think that
might make the grammar ambiguous, though. If it were possible, we could
do this:

    process_file(path):
        process(file):
            for line in file:
                print line
        success():
            print 'file processed successfully!'
        error(exc):
            print 'an exception was raised during processing:', exc

Now the only difference between a normal call and a call with anonymous 
block parameters would be the presence of the trailing colon. I could 
live with the "do" keyword if this can't be done, however.

The only disadvantage to this syntax that I can see is that the simple 
case of opening a file and processing it is slightly more verbose than 
it is in Ruby. This is Ruby:

    File.open_and_process("testfile", "r") do |file|
        while line = file.gets
            puts line
        end
    end

This would be the Python equivalent:

    do open_and_process("testfile", "r"):
        process(file):
            for line in file:
                print line

It's one extra line in Python (I'm not counting lines that contain 
nothing but "end" in Ruby) because we have to specify the name of the 
block parameter. The extra flexibility that the proposed syntax has 
(being able to pass in multiple blocks) is worth this extra line, in my 
opinion.

If we wanted to optimize even further for this case, however, we could 
allow for an alternate form of the "do" statement that lets you only 
specify one anonymous block parameter. Maybe it would look like this:

    do open_and_process("testfile", "r") process(file):
        for line in file:
            print line

I don't really think this is necessary. I don't mind being verbose if it 
makes things clearer and simpler.

Here's some other ideas: use "def" instead of "with". They'd have to be 
indented to avoid ambiguity, though:

    do process_file(path):
        def process(file):
            for line in file:
                print line
        def success():
            print 'file processed successfully!'
        def error(exc):
            print 'an exception was raised during processing:', exc

The presence of the familiar def keyword should help people understand 
what's happening here.

Note that I didn't include an example but there's no reason why an 
anonymous block parameter couldn't return a value which could be used in 
the function calling the block.

Please, be gentle.
--
Jason


Re: [Python-Dev] Another Anonymous Block Proposal

2005-04-27 Thread Jason Diamond
Paul Svensson wrote:
> You're not mentioning scopes of local variables, which seems to be
> the issue where most of the previous proposals lose their balance
> between hairy and pointless...
My syntax is just sugar for nested defs. I assumed the scopes of local 
variables would be identical when using either syntax.

Do you have any pointers to discussions that go into the issues I'm probably missing?
Thanks.
--
Jason


[Python-Dev] bytes.from_hex() [Was: PEP 332 revival in coordination with pep 349?]

2006-02-15 Thread Jason Orendorff
Instead of byte literals, how about a classmethod bytes.from_hex(), which works like this:

  # two equivalent things
  expected_md5_hash = bytes.from_hex('5c535024cac5199153e3834fe5c92e6a')

  expected_md5_hash = bytes([92, 83, 80, 36, 202, 197, 25, 145, 83, 227, 131, 79, 229, 201, 46, 106])

It's just a nicety; the former fits my brain a little better.  This would work fine both in 2.5 and in 3.0.

I thought about unicode.encode('hex'), but obviously it will continue
to return a str in 2.x, not bytes.  Also the pseudo-encodings
('hex', 'rot13', 'zip', 'uu', etc.) generally scare me.  And now
that bytes and text are going to be two very different types, they're
even weirder than before.  Consider:

  text.encode('utf-8') ==> bytes
  text.encode('rot13') ==> text
  bytes.encode('zip') ==> bytes
  bytes.encode('uu') ==> text (?)

This state of affairs seems kind of crazy to me.

Actually users trying to figure out Unicode would probably be better served if bytes.encode() and text.decode() did not exist.

-j



Re: [Python-Dev] bytes.from_hex() [Was: PEP 332 revival in coordination with pep 349?]

2006-02-17 Thread Jason Orendorff
On 2/15/06, Guido van Rossum <[EMAIL PROTECTED]> wrote:
> > Actually users trying to figure out Unicode would probably be better served
> > if bytes.encode() and text.decode() did not exist.
> [...]
> It would be better if the signature of text.encode() always returned a
> bytes object. But why deny the bytes object a decode() method if text
> objects have an encode() method?

I agree, text.encode() and bytes.decode() are both swell.  It's the
other two that bother me.

> I'd say there are two "symmetric" API flavors possible (t and b are
> text and bytes objects, respectively, where text is a string type,
> either str or unicode; enc is an encoding name):
> - b.decode(enc) -> t; t.encode(enc) -> b
> - b = bytes(t, enc); t = text(b, enc)
> I'm not sure why one flavor would be preferred over the other,
> although having both would probably be a mistake.

I prefer the constructor flavor; the word "bytes" feels more concrete than
"encode".  But I worry about constructors being too overloaded.

>>> text(b, enc)  # decode
>>> text(mydict)  # repr
>>> text(b)   # uh... decode with default encoding?

-j



Re: [Python-Dev] Path PEP: some comments (equality)

2006-02-22 Thread Jason Orendorff
On 2/20/06, Mark Mc Mahon <[EMAIL PROTECTED]> wrote:
> It seems that the Path module as currently defined leaves equality
> testing up to the underlying string comparison. My guess is that this
> is fine for Unix (maybe not even) but it is a bit lacking for Windows.
>
> Should the path class implement an __eq__ method that might do some of
> the following things:
>  - Get the absolute path of both self and the other path
>  - normcase both
>  - now see if they are equal

This has been suggested to me many times.  Unfortunately, since Path is
a subclass of string, this breaks stuff in weird ways.  For example:

    'x.py' == path('x.py') == path('X.PY') == 'X.PY', but 'x.py' != 'X.PY'

And hashing needs to be consistent with __eq__:

    hash('x.py') == hash(path('X.PY')) == hash('X.PY') ???

Granted these problems would only pop up in code where people are mixing
Path and string objects.  But they would cause really obscure bugs in
practice, very difficult for a non-expert to figure out and fix.  It's
safer for Paths to behave just like strings.
-j


Re: [Python-Dev] Pre-PEP: The "bytes" object

2006-02-23 Thread Jason Orendorff
On 2/22/06, Neil Schemenauer <[EMAIL PROTECTED]> wrote:
>     @classmethod
>     def fromhex(self, data):
>         data = re.sub(r'\s+', '', data)
>         return bytes(binascii.unhexlify(data))

If it's to be a classmethod, I guess that should be
"return self(binascii.unhexlify(data))".

-j


Re: [Python-Dev] Pre-PEP: The "bytes" object

2006-02-27 Thread Jason Orendorff
Neil Schemenauer wrote:
> Ron Adam <[EMAIL PROTECTED]> wrote:
>> Why was it decided that the unicode encoding argument should be ignored
>> if the first argument is a string?  Wouldn't an exception be better
>> rather than give the impression it does something when it doesn't?
>
> From the PEP:
>
> There is no sane meaning that the encoding can have in that
> case.  str objects *are* byte arrays and they know nothing about
> the encoding of character data they contain.  We need to assume
> that the programmer has provided str object that already uses
> the desired encoding.
>
> Raising an exception would be a valid option.  However, passing the
> string through unchanged makes the transition from str to bytes
> easier.

Does it?

I am quite certain the bytes PEP is dead wrong on this.  It should be changed.

Suppose I have code like this:

def faz(s):
return s.encode('utf-16be')

If I want to transition from str to bytes, how should I change this code?

def faz(s):
return bytes(s, 'utf-16be')  # OOPS - subtle bug

This silently does the wrong thing when s is a str.  If I hadn't read
the PEP, I would confidently assume that bytes(str, encoding) ==
bytes(unicode, encoding), modulo the default encoding.  I'd be wrong. 
But there's a really good reason to think this.  Wherever a unicode
argument is expected in Python 2.x, you can pass a str and it'll be
silently decoded.  This is an extremely strong convention.  It's even
embedded in PyArg_ParseTuple().  I can't think of any exceptions to
the rule, offhand.

Is this special case special enough to break the rules?  Arguable.  I
suspect not.  But even if so, allowing the breakage to pass silently
is surely a mistake.  It should just refuse the temptation to guess,
and throw an exception--right?

Now you may be thinking:  the str/unicode duality of text, and the
bytes/text duality of the "str" type, are *bad* things, and we're
trying to get rid of them.  True.  My view is, we'll be rid of them in
3.0 regardless.  In the meantime, there is no point trying to pretend
that 2.0 "str" is bytes and not text.  It just ain't so; you'll only
succeed in confusing people and causing bugs.  (And in 3.0 you're
going to turn around and tell them "str" *is* text!)

Good APIs make simple, sensible, comprehensible promises.  I like
these promises:
  - bytes(arg) works like array.array('b', arg)
  - bytes(arg1, arg2) works like bytes(arg1.encode(arg2))

I dislike these promises:
  - bytes(s, [ignored]), where s is a str, works like array.array('b', s)
  - bytes(u, [encoding]), where u is a unicode,
works like bytes(u.encode(encoding))

It seems more Pythonic to differentiate based on the number of
arguments, rather than the type.

-j

P.S.  As someone who gets a bit agitated when the word "Pythonic" or
the Zen of Python is taken in vain, I'd like to know if anyone feels
I've done so here, so I can properly apologize.  Thanks.


[Python-Dev] bytes thoughts

2006-03-01 Thread Jason Orendorff
1.  Maybe there should be a more obvious way to spell "bytes([0])*N". 
I went through "bytes([0]*N)" and "bytes('\0'*N)" before I realized
there was a memory-efficient way to do it.

1a. Likewise, slice-assignment nicely handles memmove(), but there's
no memset().  (See the sketch after this list.)

2.  Having a plural noun as a type name is awkward.  I wish we could
call it "buffer" (which, conveniently, also tells you that it's
mutable, even if you don't know the word "mutable" :-).  Alas.

3. I wrote a toy BytesIO class to go with the toy bytes object:
  http://wiki.python.org/moin/BytesIO
(I hope this isn't considered wiki abuse -- it seemed as worthy and
relevant as most of what's in there.)
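 
To make items 1 and 1a concrete, a quick sketch using bytearray (which is
how this eventually shipped): passing an integer allocates that many zero
bytes directly, and slice assignment stands in for memmove/memset:

    N = 1024

    buf_a = bytearray([0] * N)    # builds a temporary N-element list first
    buf_b = bytearray(b'\0' * N)  # builds a temporary N-byte string first
    buf_c = bytearray(N)          # N zero bytes, no temporary object

    buf_c[0:4] = b'\xde\xad\xbe\xef'   # memmove-style slice assignment
    buf_c[4:8] = b'\xff' * 4           # memset-style fill of a range

    assert buf_a == buf_b and len(buf_c) == N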

-j


Re: [Python-Dev] String initialization (was: The "i" string-prefix: I18n'ed strings)

2006-04-12 Thread Jason Orendorff
A compiler hook on string initialization, eh?

I have a distantly related story--this isn't important, just another
random Python use case for the file.  (The i"xyzzy" proposal wouldn't
help this case.)

In scons, your SConscripts (makefiles, essentially) are Python source
code.  You typically have SConscripts throughout your source tree. 
Any SConscript could have something like this:

  sort_exe = Program('sort', ['main.c', 'timsort.c'])

The problem is dealing with relative filenames.  The only sane way to
resolve "main.c" to an abspath is relative to the source file that
physically contains that string literal token.[1]  But that's
impossible to determine at run time.

scons uses some cleverness to guess the directory.  It's always right,
except when it's wrong.  Maddening.

So, what does this have to do with string initialization hooks?  If
scons could "decorate" string constants as part of SConscript
compilation/execution, this problem could actually be solved.

-j

[1] Well, this is my opinion, but it's the right one.


Re: [Python-Dev] PEP 355 (object-oriented paths)

2006-04-20 Thread Jason Orendorff
Talin, everything you wrote is really compelling.  If path.py weren't
so ridiculously useful to me, I would be completely convinced.  :)

For example, I agree 100% with this:

> Another reason why I am a bit dubious about a class-based approach
> is that it tends to take anything that is related to a filepath and lump
> them into a single module.

...and this:

> one thing that irks me (and others) about the Path class in Java is
> that it makes no distinction between methods that are merely textual
> conversions, and methods which actually go out and touch the disk.

...until I remember that in practice, d.parent and d.files('*.txt') on
the same object; or f.ext and f.isfile(); are things I do all the time
without thinking.  I think I can see why.
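 
Concretely, the kind of mixing meant here, in path.py terms (d.parent and
f.ext are purely textual; d.files() and f.isfile() touch the disk):

    from path import path        # third-party path.py, assumed importable

    d = path('/var/log')
    print d.parent               # textual: no filesystem access
    for f in d.files('*.txt'):   # filesystem: list matching files in d
        print f.ext, f.isfile()  # .ext is textual, .isfile() hits the disk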

Separate modules only make sense for separate use cases.  In
real-world code where you're "doing stuff with files and directories",
you're going to randomly need os.remove(), shutil.copyfile(),
os.path.isdir(), and/or glob.glob().  I have one big mental junk
drawer with all this stuff in it.  The way the stdlib partitions them
does not fit my brain.  I have trouble believing some other
theoretical partition would be much better, though I'd love to see
someone try.

Lastly-- Is nontrivial path manipulation really rare?  Practically
every program I write "does stuff with files and directories". 
Scripts often do little else; in larger programs, main() often does 5
or 50 lines of this kind of stuff, while the rest of the program is
mostly filesystem-unaware.

-j


Re: [Python-Dev] PEP 3102: Keyword-only arguments

2006-05-01 Thread Jason Orendorff
On 4/30/06, Edward Loper <[EMAIL PROTECTED]> wrote
(referring to keyword-only arguments):
> I see two possible reasons:
>
>- A function's author believes that calls to the function will be
>  easier to read if certain parameters are passed by name, rather
>  than positionally; and they want to enforce that calling
>  convention on their users.  This seems to me to go against the
>  "consenting adults" principle.
>
>- A function's author believes they might change the signature in the
>  future to accept new positional arguments, and they will want to put
>  them before the args that they declare keyword-only.
>
> Both of these motivations seem fairly weak.  Certainly, neither seems to
> warrant a significant change to function definition syntax.

I disagree.  I think the use cases are more significant than you
suggest, and the proposed change less significant.

Readability and future-compatibility are key factors in API design. 
How well a language supports them determines how sweet its libraries
can be.

Even relatively simple high-level functions often have lots of clearly
inessential "options". When I design this kind of function, I often
wish for keyword-only arguments.  path.py's write_lines() is an
example.
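 
A sketch of the sort of signature meant, written with the keyword-only
syntax PEP 3102 proposes (the parameter names here are illustrative, not
path.py's exact API):

    def write_lines(filename, lines, *, encoding=None, errors='strict',
                    append=False, linesep='\n'):
        # Everything after the bare * can only be passed by keyword.
        mode = 'a' if append else 'w'
        with open(filename, mode, encoding=encoding, errors=errors) as f:
            for line in lines:
                f.write(line + linesep)

    write_lines('out.txt', ['a', 'b'], append=True)   # reads clearly
    # write_lines('out.txt', ['a', 'b'], True)        # TypeError: too many
    #                                                 # positional arguments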

In fact... it feels as though I've seen "keyword-only" arguments in a
few places in the stdlib.  Am I imagining this?

Btw, I don't think the term "consenting adults" applies.  To me, that
refers to the agreeable state of affairs where you, the programmer
about to do something dangerous, know it's dangerous and indicate your
consent somehow in your source code, e.g. by typing an underscore. 
That underscore sends a warning.  It tells you to think twice.  It
tells you the blame is all yours if this doesn't work.  It makes
consent explicit (both mentally and syntactically).

I'm +1 on the use cases but -0 on the PEP.  The proposed syntax isn't
clear; I think I want a new 'explicit' keyword or something.  (Like
that'll happen.  Pfft.)

-j


Re: [Python-Dev] total ordering.

2006-05-16 Thread Jason Orendorff
On 5/11/06, Vladimir 'Yu' Stepanov <[EMAIL PROTECTED]> wrote:
> If for Python-3000 similar it will be shown concerning types
> str(), int(), complex() and so on, and the type of exceptions
> will strongly vary, it will make problematic redefinition of
> behavior of function of sorting.

I don't see what you mean by "redefinition of behavior of function of
sorting".  Is this something a Python programmer might want to do?
Can you give an example?


On 5/16/06, Vladimir 'Yu' Stepanov <[EMAIL PROTECTED]> wrote:
> It will be possible it conveniently to use as exception of
> management by a stream, for indication of necessity to involve
> `.__r(eq|ne|le|lt|ge|gt|cmp)__()' a method. This kind of a class
> can carry out function, similarly to StopIteration for `.next()'.

There are no .__r(eq|ne|le|lt|ge|gt|cmp)__() methods, for a logical
reason which you might enjoy deducing yourself...

> At present time similar function is carried out with exception
> NotImplemented. This exception is generated in a number of
> mathematical operations. For this reason I ask to consider an
> opportunity of creation of a new class.

Can you explain this?  NotImplemented isn't an exception.
(NotImplementedError is, but that's something quite different.)
NotImplemented has exactly one purpose in Python, as far as I can
tell.  What mathematical operations do you mean?

-j


Re: [Python-Dev] total ordering.

2006-05-18 Thread Jason Orendorff
Vladimir,

Your examples seem to indicate that you've misunderstood the change
that's proposed for Python 3000.  Especially this:

On 5/17/06, Vladimir 'Yu' Stepanov <[EMAIL PROTECTED]> wrote:
> # BEGIN: Emulation python3000
> if type(a) is not type(b) and (
> not operator.isNumberType(a) or
> not operator.isNumberType(b)
> ):
> raise TypeError("python3000: not-comparable types", 
> (a,b))
> # END: Emulation python3000

Python 3000 will not do anything like this.  It'll try a.__cmp__(b),
and failing that b.__cmp__(a) (but imagine this using tp_ slots
instead of actual Python method calls), and if both return
NotImplemented, it'll throw a TypeError (rather than guess, which is
what it does now).

There's a lot of legacy oddness in PyObject_RichCompare() and its many
helper functions; presumably they'll delete some of that, but it won't
be anything you care about.

Comparison with None should also continue to work as it does now,
unless I missed something.

-j


Re: [Python-Dev] A Horrible Inconsistency

2006-05-26 Thread Jason Orendorff
On 5/26/06, Facundo Batista <[EMAIL PROTECTED]> wrote:
> I think that we can do one of the following, when we found "-1 * (1, 2, 3)":
>
> - Treat -1 as 0 and return an empty tuple (actual behavior).
> - Treat the negative as a reverser, so we get back (3, 2, 1).
> - Raise an error.

No, no, no.  The important invariant is that n * seq is
loop(seq)[:n*len(seq)] where loop(seq) is an endless loop of the
elements of seq.

So obviously, if n is negative, the result should be an infinite
sequence that's == to loop(seq).

-j


Re: [Python-Dev] partition() (was: Remove str.find in 3.0?)

2005-08-30 Thread Jason Orendorff
Concerning names for partition(), I immediately thought of break(). 
Unfortunately it's taken.

So, how about snap()?

head, sep, tail = line.snap(':')

-j


Re: [Python-Dev] Adding a conditional expression in Py3.0

2005-09-20 Thread Jason Orendorff
On 9/20/05, Guido wrote:
> On 9/20/05, Jason Orendorff <[EMAIL PROTECTED]> wrote:
> > return (if q: q.popleft() else: None)
> > return (if q then q.popleft() else None)
> > return q ? q.popleft() : None
> >
> > Hmmm.  Score one for ?:.
>
> Why? Just because it's shorter?

Just a gut response to the look.  The verbose forms strike me as
cluttered in this particular case.

In the multiline case, it doesn't look like clutter because the
if/elif/else bits line up, which fits the way Python has already
trained my brain.

> (Oh, and a way to decide between colon or no colon: we're not using
> colons in list comps and genexprs either.)

(grin) Easily fixed:

print "average weight:", avg(for c in chefs: c.weight)
rdict = dict(for k, v in D.iteritems(): v, k)

Honestly, I think I would prefer this syntax.  Examples from real
code, before and after:

        lines = [line for line in pr.block.body
                 if line.logical_line.strip() != '']
        lines = [for line in pr.block.body:
                     if line.logical_line.strip() != '':
                         line]

        row.values = \
            [line[col.start:col.end].strip() for col in columns]
        row.values = \
            [for col in columns: line[col.start:col.end].strip()]

        return [p for p in self.listdir(pattern) if p.isdir()]
        return [for p in self.listdir(pattern): if p.isdir(): p]

-j


[Python-Dev] PEP 343 and __with__

2005-10-03 Thread Jason Orendorff
I'm -1 on PEP 343.  It seems ...complex.  And even with all the
complexity, I *still* won't be able to type

with self.lock: ...

which I submit is perfectly reasonable, clean, and clear.  Instead I
have to type

with locking(self.lock): ...

where locking() is apparently either a new builtin, a standard library
function, or some 6-line contextmanager I have to write myself.
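 
For reference, a sketch of the six-line helper being referred to, using the
@contextmanager decorator from the PEP (it later landed as
contextlib.contextmanager):

    from contextlib import contextmanager

    @contextmanager
    def locking(lock):
        lock.acquire()
        try:
            yield lock
        finally:
            lock.release()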

So I have two suggestions.

1.  I didn't find any suggestion of a __with__() method in the
archives.  So I feel I should suggest it.  It would work just like
__iter__().

class RLock:
@contextmanager
def __with__(self):
self.acquire()
try:
yield
finally:
self.release()

__with__() always returns a new context manager object.  Just as with
iterators, a context manager object has "cm.__with__() is cm".

The 'with' statement would call __with__(), of course.

Optionally, the type constructor could magically apply @contextmanager
to __with__() if it's a generator, which is the usual case.  It looks
like it already does similar magic with __new__().  Perhaps this is
too cute though.

2.  More radical:  Let's get rid of __enter__() and __exit__().  The
only example in PEP 343 that uses them is Example 4, which exists only
to show that "there's more than one way to do it". It all seems fishy
to me.  Why not get rid of them and use only __with__()?  In this
scenario, Python would expect __with__() to return a coroutine (not to
say "iterator") that yields exactly once.

Then the "@contextmanager" decorator wouldn't be needed on __with__(),
and neither would any type constructor magic.

The only drawback I see is that context manager methods implemented in
C will work differently from those implemented in Python.  Since C
doesn't have coroutines, I imagine there would have to be enter() and
exit() slots.  Maybe this is a major design concern; I don't know.

My apologies if this is redundant or unwelcome at this date.

-j


Re: [Python-Dev] PEP 343 and __with__

2005-10-03 Thread Jason Orendorff
Phillip J. Eby writes:
> You didn't offer any reasons why this would be useful and/or good.

It makes it dramatically easier to write Python classes that correctly
support 'with'.  I don't see any simple way to do this under PEP 343;
the only sane thing to do is write a separate @contextmanager
generator, as all of the examples do.

Consider:

# decimal.py
class Context:
...
def __enter__(self):
???
def __exit__(self, t, v, tb):
???

DefaultContext = Context(...)

Kindly implement __enter__() and __exit__().  Make sure your
implementation is thread-safe (not easy, even though
decimal.getcontext/.setcontext are thread-safe!).  Also make sure it
supports nested 'with DefaultContext:' blocks (I don't mean lexically
nested, of course; I mean nested at runtime.)

The answer requires thread-local storage and a separate stack of saved
context objects per thread.  It seems a little ridiculous to me.

Whereas:

class Context:
...
def __with__(self):
old = decimal.getcontext()
decimal.setcontext(self)
try:
yield
finally:
decimal.setcontext(old)

As for the second proposal, I was thinking we'd have one mental model
for context managers (block template generators), rather than two
(generators vs. enter/exit methods).  Enter/exit seemed superfluous,
given the examples in the PEP.

> [T]his multiplies the difficulty of implementing context managers in C.

Nonsense.

    static PyObject *
    lock_with()
    {
        return PyContextManager_FromCFunctions(self, lock_acquire,
                                                lock_release);
    }

There probably ought to be such an API even if my suggestion is in
fact garbage (as, admittedly, still seems the most likely thing).

Cheers,
-j


Re: [Python-Dev] PEP 343 and __with__

2005-10-04 Thread Jason Orendorff
The argument I am going to try to make is that Python coroutines need
a more usable API.

> Try to explain the semantics of the with statement without referring to the
> __enter__ and __exit__ methods, and then see if you still think they're
> superfluous ;)
>
> The @contextmanager generator decorator is just syntactic sugar [...]
> [T]he semantics of the with statement itself can
> only be explained in terms of the __enter__ and __exit__ methods.

That's not true.  It can certainly use the coroutine API instead.

Now... as specified in PEP 342, the coroutine API can be used to
implement 'with', but it's ugly.  I think this is a problem with the
coroutine API, not the idea of using coroutines per se.  Actually I
think 'with' is a pretty tame use case for coroutines.  Other Python
objects (dicts, lists, strings) have convenience methods that are
strictly redundant but make them much easier to use.  Coroutines
should, too.

This:

with EXPR as VAR:
BLOCK

expands to this under PEP 342:

_cm = contextmanager(EXPR)
VAR = _cm.next()
try:
BLOCK
except:
try:
_cm.throw(*sys.exc_info())
except:
pass
raise
finally:
try:
_cm.next()
except StopIteration:
pass
except:
raise
else:
raise RuntimeError

Blah.  But it could look like this:

_cm = (EXPR).__with__()
VAR = _cm.start()
try:
BLOCK
except:
_cm.throw(*excinfo)
else:
_cm.finish()

I think that looks quite nice.

Here is the proposed specification for start() and finish():

class coroutine:  # pseudocode
...
def start(self):
""" Convenience method -- exactly like next(), but
assert that this coroutine hasn't already been started.
"""
if self.__started:
raise ValueError  # or whatever
return self.next()

def finish(self):
""" Convenience method -- like next(), but expect the
coroutine to complete without yielding again.
"""
try:
self.next()
except (StopIteration, GeneratorExit):
pass
else:
raise RuntimeError("coroutine didn't finish")

Why is this good?

  - Makes coroutines more usable for everyone, not just for
implementing 'with'.
  - For example, if you want to feed values to a coroutine, call
start() first and then send() repeatedly.  Quite sensible.
  - Single mental model for 'with' (always uses a coroutine or
lookalike object).
  - No need for "contextmanager" wrapper.
  - Harder to implement a context manager object incorrectly
    (it's quite easy to screw up with __enter__ and __exit__).

-j


Re: [Python-Dev] PEP 343 and __with__

2005-10-04 Thread Jason Orendorff
Right after I sent the preceding message I got a funny feeling I'm
wasting everybody's time here.  I apologize.  Guido's original concern
about speedy C implementation for locks stands.  I don't see a good
way around it.

By the way, my expansion of 'with' using coroutines (in previous
message) was incorrect.  The corrected version is shorter; see below.

-j


This:

with EXPR as VAR:
BLOCK

would expand to this under PEP 342 and my proposal:

_cm = (EXPR).__with__()
VAR = _cm.next()
try:
BLOCK
except:
_cm.throw(*sys.exc_info())
finally:
try:
_cm.next()
except (StopIteration, GeneratorExit):
pass
else:
raise RuntimeError("coroutine didn't finish")


Re: [Python-Dev] __doc__ behavior in class definitions

2005-10-07 Thread Jason Orendorff
Martin,

These two cases generate different bytecode.

def foo(): # foo.func_code.co_flags == 0x43
print x# LOAD_FAST 0
x = 3

class Foo: # .co_flags == 0x40
print x# LOAD_NAME 'x'
x = 3

In functions, local variables are just numbered slots. (co_flags bits
1 and 2 indicate this.)  The LOAD_FAST opcode is used.  If the slot is
empty, LOAD_FAST throws.

In other code, the local variables are actually stored in a
dictionary.  LOAD_NAME is used.  This does a locals dictionary lookup;
failing that, it falls back on the globals dictionary; and failing
that, it falls back on builtins.
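 
A small example of the visible difference: the function's x is an empty
slot, so it raises; the class body falls back to the global.

    x = 'global'

    def foo():
        print x    # LOAD_FAST: raises UnboundLocalError (slot never filled)
        x = 3

    class Foo:
        print x    # LOAD_NAME: local dict misses, prints the global
        x = 3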

Why the discrepancy?  Beats me.  I would definitely implement what
CPython does up to this point, if that's your question.

Btw, functions that use 'exec' are in their own category way out
there:

def foo2(): # foo2.func_code.co_flags == 0x42
print x # LOAD_NAME 'x'
exec "x=3"  # don't ever do this, it screws everything up
print x

Pretty weird.  Jython seems to implement this.

-j


Re: [Python-Dev] Proposed changes to PEP 343

2005-10-11 Thread Jason Orendorff
On 10/7/05, Fredrik Lundh <[EMAIL PROTECTED]> wrote:
> the whole concept might be perfectly fine on the "this construct corre-
> sponds to this code" level, but if you immediately end up with things that
> are not what they seem, and names that don't mean what the say, either
> the design or the description of it needs work.
>
>  ("yes, I know you can use this class to manage the context, but it's not
> really a context manager, because it's that method that's a manager, not
> the class itself.  yes, all the information that belongs to the context are
> managed by the class, but that doesn't make... oh, shut up and read the
> PEP")

Good points... Maybe it is the description that needs work.

Here is a description of iterators, to illustrate the parallels:
An object that has an __iter__ method is iterable.  It can plug
into the Python 'for' statement.  obj.__iter__() returns an
iterator.  An iterator is a single-use, forward-only view of a
sequence.  'for' calls __iter__() and uses the resulting
iterator's next() method.

(This is just as complicated as PEP343+changes, but not as
mindboggling, because the terminology is better.  Also because
we're used to iterators.)

Now contexts, per PEP 343 with Nick's proposed changes:
An object that has a __with__ method is a context.  It can plug
into the Python 'with' statement.  obj.__with__() returns a
context manager.  A context manager is a single-use object that
manages a single visit into a context.  'with' calls __with__()
and uses the resulting context manager's __enter__() and
__exit__() methods.

A contextmanager is a function that returns a new context manager.

Okay, that last bit is weird.  But note that PEP 343 has this oddness
even without the proposed changes.  Perhaps either "context manager"
or contextmanager should be renamed, regardless of whether Nick's
changes are accepted.

With the changes, context managers will be (conceptually) single-use.
So maybe a different term might be appropriate.  Perhaps "ticket".
"A ticket is a single-use object that manages a single visit into a
context."

-j


Re: [Python-Dev] Divorcing str and unicode (no more implicit conversions).

2005-10-23 Thread Jason Orendorff
-1 on keeping the source encoding of string literals.  Python should
definitely decode them at compile time.

-1 on decoding implicitly "as needed".  This causes decoding to happen
late, in unpredictable places.  Decodes can fail; they should happen
as early and as close to the data source as possible.

-j


Re: [Python-Dev] Jython and CPython

2005-12-13 Thread Jason Orendorff
On 12/13/05, "Martin v. Löwis" <[EMAIL PROTECTED]> wrote:
> Fredrik Lundh wrote:
> > BTW, what's the policy wrt. Jython-specific modules in the standard library?
> I don't think there is enough precedence to have a policy. So far, the
> only places that explicitly support Jython is the test suite, pickle,
> and platform (I wouldn't really count in site here).

Actually there's some Jython-specific code in
xml/sax/__init__.py.  Two places, both questionable.  One of
them refers to sys.registry.  The other appears to be a workaround
for Jython not having 4-argument __import__.

> If the portability problem can be solved by checking things into Jython
> instead, I think I would prefer that.

Yes, it can be solved that way: Jython could implement pyexpat.  I
don't know just how crazy that idea is; my impression is that it could
be done, perhaps imperfectly, as a wrapper around SAX.

-j



Re: [Python-Dev] PEP 8 updates/clarifications

2005-12-13 Thread Jason Orendorff
Barry Warsaw wrote:
>   - If your class is intended to be subclassed, and you have attributes
> that you do not want subclasses to use, consider naming them with
> double leading underscores and no trailing underscores.  This invokes
> Python's name mangling algorithm, where the name of the class is
> mangled into the attribute name.  This helps avoid attribute name
> collisions should subclasses inadvertently contain attributes with the
> same name.
>
> Note 1: Note that only the simple class name is used in the mangled
> name, so if a subclass chooses both the same class name and attribute
> name, you can still get name collisions.
>
> Note 2: Name mangling can make certain uses, such as debugging, less
> convenient.  However the name mangling algorithm is well documented
> and easy to perform manually.

Hmm.  How about just:  "Put two leading underscores on an attribute's
name to strongly discourage code outside the class from accessing it."
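 
A tiny sketch of the mangling being discussed, and the collision it avoids:

    class Base(object):
        def __init__(self):
            self.__token = 1       # stored as self._Base__token

    class Sub(Base):
        def __init__(self):
            Base.__init__(self)
            self.__token = 2       # stored as self._Sub__token, no clash

    s = Sub()
    print s._Base__token, s._Sub__token   # 1 2  (mangling done by hand)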

-j


Re: [Python-Dev] ElementTree in stdlib

2005-12-13 Thread Jason Orendorff
On 12/13/05, Walter Dörwald <[EMAIL PROTECTED]> wrote:
> Guido van Rossum wrote:
> > I don't think that SAX is unpythonic, but it's pretty low-level and
> > mostly of use to people writing higher-level XML parsers (my parsexml
> > module uses it).
>
> Having to define classes that conform to a certain API and registering
> instances of those classes as callbacks with the parser doesn't look
> that pythonic to me. An iterator API seems much more pythonic.

Strongly agree.  This very morning I wrote a long tirade about how I
wish Python had true coroutines, for the sole reason that I could wrap
SAX in an iterator-based API.

Eventually I decided it was SAX's fault for having such a crummy API,
so I didn't post it.

-j


Re: [Python-Dev] ElementTree in stdlib

2005-12-14 Thread Jason Orendorff
Guido van Rossum wrote:
> On 12/13/05, Walter Dörwald <[EMAIL PROTECTED]> wrote:
> > Having to define classes that conform to a certain API and registering
> > instances of those classes as callbacks with the parser doesn't look
> > that pythonic to me. An iterator API seems much more pythonic.
>
> Perhaps. Although the SAX API lets you leave a callback undefined if
> you don't have a need to handle those events; that's a bit trickier to
> do with an iterator.

Well, suppose you want to dump the text of a document.

    for e in iterparse(filename):
        if e.isText():
            out.write(e.data)

Not tricky.

> > Also the different callbacks have different signatures.

True.  With SAX I always have to look up the signatures.  The iterator
yields Node-like objects in document order.  I don't have to remember
signatures.

But the biggest advantage of an iterator-based API would be: when you
hit an element, you can easily pass control to a function that knows
how to parse that particular element.  parsePlay() can call
parseAct(), which can call parseScene().  To do anything like that
with SAX, you have to write a bunch of dispatch code.

-j


Re: [Python-Dev] Ph.D. dissertation ideas?

2006-01-14 Thread Jason Orendorff
Brett,

You could create a downloadable corpus of Python source code, and
maybe a web site through which people can easily browse/search it,
contribute to it, and maintain it.  The point would be to support
language designers, tool developers, and researchers.  Several
python-dev folks have their own corpuses; I think other people would
be happy to use a free one if it were out there.

Of course there's no need to limit it to Python...

Creating a really *good* corpus is maybe not super-easy; I imagine
there are myriad linguistics papers explaining the nuances.  Hey,
cross-discipline research--cool points!

Once this exists, there's no shortage of research questions you can
quickly and easily answer with it.  What percentage of Python programs
use functional programming techniques?  How often are list
comprehensions used?  What do people use generators for?

And if you do something web-based, you can certainly work XML in there
somewhere.  :)

-j


Re: [Python-Dev] str with base

2006-01-17 Thread Jason Orendorff
It seems dumb to support *parsing* integers in weird bases, but not
*formatting* them in weird bases.  Not a big deal, but if you're going
to give me a toy, at least give me the whole toy!

The %b idea is a little disappointing in two ways.  Even with %b,
Python is still dumb by the above criterion.  And, I suspect users
that don't know about %b are unlikely to find it when they want it.  I
know I've never looked for it there.

I think a method 5664400.to_base(13) sounds nice.
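 
A sketch of what such a conversion might do, written as a plain function
since int has no such method (the digit alphabet is an arbitrary choice):

    def to_base(n, base, digits='0123456789abcdefghijklmnopqrstuvwxyz'):
        if not 2 <= base <= len(digits):
            raise ValueError('base out of range')
        if n == 0:
            return digits[0]
        sign = '-' if n < 0 else ''
        n = abs(n)
        out = []
        while n:
            n, r = divmod(n, base)
            out.append(digits[r])
        return sign + ''.join(reversed(out))

    print to_base(5664400, 13)                      # '1234321'
    assert int(to_base(5664400, 13), 13) == 5664400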

-j


Re: [Python-Dev] str with base

2006-01-18 Thread Jason Orendorff
On 1/18/06, Donovan Baarda <[EMAIL PROTECTED]> wrote:
> I think supporting arbitrary bases for floats is way overkill and not
> worth considering.

If you mean actual base-3 floating-point arithmetic, I agree.  That's
outlandish.

But if there were a stdlib function to format floats losslessly in hex
or binary, Tim Peters would use it at least once every six weeks to
illustrate the finer points of floating point arithmetic. <0.00390625
wink>
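 
(Such a formatter did appear later as float.hex()/float.fromhex(); the kind
of lossless output meant looks like this:)

    >>> (0.1).hex()
    '0x1.999999999999ap-4'
    >>> float.fromhex('0x1.999999999999ap-4') == 0.1
    True
    >>> (0.1 + 0.2).hex() == (0.3).hex()
    False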

+1.0

-j


Re: [Python-Dev] basenumber redux

2006-01-18 Thread Jason Orendorff
On 1/17/06, "Martin v. Löwis" <[EMAIL PROTECTED]> wrote:
> Alex Martelli wrote:
> > But this doesn't apply to the Python Standard Library, for example see
> > line 1348 of imaplib.py: "if isinstance(date_time, (int, float)):".
> [...]
> > Being able to change imaplib to use basenumber instead of (int, float)
> > won't make it SIMPLER, but it will surely make it BETTER -- why should
> > a long be rejected, or a Decimal, for that matter?
>
> Right. I think this function should read
>
>   if isinstance(date_time, str) and \
>      (date_time[0], date_time[-1]) == ('"', '"'):
>       return date_time    # Assume in correct format
>
>   if isinstance(date_time, (tuple, time.struct_time)):
>       tt = date_time
>   else:
>       tt = time.localtime(date_time)

So... arbitrary number-like objects should work, but arbitrary
sequence-like objects should fail?  Hmmm.  Maybe that "if
isinstance()" line should say "if hasattr(date_time, '__getitem__'):".
 Am I sure?  No.  The original author of imaplib apparently got it
wrong, and Martin got it wrong, and they're both smarter than me.

Really this is just further proof that type-checking is a royal pain
in Python.  Or rather, it's not hard to cover the builtin and stdlib
types, but it's hard to support "duck typing" too.  Are we going about
this the right way?  Options:

1.  Redesign the API so each parameter has a clearly defined set of
operations it must support, thus eliminating the need for
type-checking.  Drawback:  An annoying API might be worse than the
problem we're trying to solve.

2.  Write a set of imprecise, general-purpose type-checking functions
(is_real_number(v), is_sequence(v), ...) and use those.  (They are
imprecise because the requirements are vague and because it's not
really possible to pin them down.)  Drawback:  Error-prone, compounded
by deceptively clean appearance.  (See the sketch after this list.)

3.  Write very specific custom type-checking code each time you need
it (the imaplib approach).  Drawbacks:  Error-prone (as we've seen),
precarious, tedious, unreadable.

4.  Use the "better-to-ask-forgiveness-than-permission" idiom. 
Drawback:  Potential bad behavior on error, again potentially worse
than the original problem.
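 
For concreteness, a sketch of what option 2 tends to look like, and why
"imprecise" is the right word (the helper names are made up):

    def is_real_number(v):
        # "Will it behave like a number where a number is expected?"
        try:
            v + 0
            return True
        except TypeError:
            return False

    def is_sequence(v):
        return hasattr(v, '__getitem__')

    # Already fuzzy at the edges:
    #   is_real_number(True)   -> True  (bool adds like an int)
    #   is_real_number(1 + 2j) -> True  (complex isn't "real")
    #   is_sequence({})        -> True  (dicts have __getitem__ too)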

Yuck.  Does anyone have the answer to this one?  Or is the problem not
as bad as it looks?

-j


[Python-Dev] PEP 343 and __context__()

2006-01-19 Thread Jason Orendorff
I just noticed that my name is in PEP 343 attached to the idea of the
__context__() method, and I'm slightly queasy over it.

The rationale was to help e.g. decimal.DecimalContext support 'with'. 
Maybe that's a bad idea.

DecimalContext has a few problems.  In code where it matters, every
function you write has to worry about it. (That is, you can't just
write __decimal_context__ = ... at the top of the file and be done
with it, the way you can with, say, __metaclass__.)  And
DecimalContext doesn't fit in with generators.

sys.stdout has similar problems.

It feels like PEP 343 will make people want to follow this model. 
That is, we'll see more global behavior-controlling variables in the
future.  There are grizzlier fates; I just wondered if anyone had
thought of this.

Cheers,
-j


Re: [Python-Dev] PEP 343 and __context__()

2006-01-20 Thread Jason Orendorff
On 1/20/06, Nick Coghlan <[EMAIL PROTECTED]> wrote:
> Jason Orendorff wrote:
> > DecimalContext has a few problems.  In code where it matters, every
> > function you write has to worry about it. (That is, you can't just
> > write __decimal_context__ = ... at the top of the file and be done
> > with it, the way you can with, say, __metaclass__.)
>
> No, you write "decimal.setcontext(...)" instead.

You seem to be implying these are roughly equal in convenience; I
disagree.  Suppose I have banking.py, in which it's important to use a
particular precision and rounding.  Now I have to put context-munging
code in every single function that banking.py exposes.  And around
every 'yield'.  Even with 'with', that's a lot of extra lines of code.
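 
A sketch of the per-function boilerplate being described; BANKING_CONTEXT is
a hypothetical module-level decimal.Context, and decimal.localcontext() is
the save/restore helper:

    import decimal

    BANKING_CONTEXT = decimal.Context(prec=28,
                                      rounding=decimal.ROUND_HALF_EVEN)

    def post_interest(balance, rate):
        with decimal.localcontext(BANKING_CONTEXT):
            return balance * (1 + rate)

    def monthly_payment(principal, rate, periods):
        with decimal.localcontext(BANKING_CONTEXT):
            return principal * rate / (1 - (1 + rate) ** -periods)

    # ...and so on in every public function, which is the complaint.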

I'd much prefer to put a one-liner at the top of the file, if it were
possible (...but I don't see how, yet).

Again, none of this is likely to matter--unless you're interleaving
banking and heavy scientific calculations, which I try to avoid.  So,
not a big deal.  Thanks for the response.

> >  And
> > DecimalContext doesn't fit in with generators.
>
> It does fit actually - you simply have to remember to restore the original
> context around any invocations of yield.

Feh!  "Fit" is to "can be made to work with a bit of effort, just
don't forget to follow the rules" as Python is to C++.

-j


Re: [Python-Dev] The path module PEP

2006-01-24 Thread Jason Orendorff
Thanks for doing this.  I'm not sure anyone that matters here is
actually keen on path, but I guess we'll see.  A few comments:

On 1/24/06, BJörn Lindqvist <[EMAIL PROTECTED]> wrote:
> The following points summarizes the design:
>
> - Path extends from string, therefore all code which expects
>   string pathnames need not be modified and no existing code will
>   break.

Actually, I would prefer a Path that *didn't* subclass string, and a
new "%p" format-thingy in PyArg_ParseTuple().  %p would expect either
a Path object or a string.  Stdlib C functions that take paths would
be changed from using %s or %u to %p.  This would pretty much
eliminate the need for path objects to act like strings (except where
__cmp__, __hash__, and common sense dictate).

The only reason I didn't do this in path.py is that I don't have the
required write access to the Python source tree. ;)  Subclassing
str/unicode seemed like the next best thing.


> [...]omitted:
> * Function for opening a path - better handled by the builtin
>   open().

Aside:  I added this to support a few people who liked the idea of
"openable objects", meaning anything that has .open(), analogous to
"writeable objects" being anything with .write().  I don't use it
personally.

Examples 1 and 2 have errors.  In example 1, the "after" code should be:

  d = path('/usr/home/guido/bin')
  for f in d.files('*.py'):
      f.chmod(0755)

In example 2, the "before" code is missing a line -- the call to
os.path.walk().  (Actually it should probably use os.walk(), which
looks much nicer.)

I suspect you'll be asked to change the PEP to remove __div__ for
starters, in which case I propose using the Path constructor as the
replacement for os.path.join().  In that case, Path.joinpath can be
dropped.

-j


Re: [Python-Dev] The path module PEP

2006-01-25 Thread Jason Orendorff
On 1/25/06, Toby Dickenson <[EMAIL PROTECTED]> wrote:
> On Tuesday 24 January 2006 20:22, BJörn Lindqvist wrote:
> > #Replacing glob.glob
> > glob.glob("/lib/*.so")
> > ==>
> > Path("/lib").glob("*.so")
>
> This definition seems confusing because it splits the glob pattern string in
> two ('/lib', and '*.so'). [...]

Well, let's make this look more like real code:

#line 1
LIB_DIR = "/lib"
==>
LIB_DIR = Path("/lib")

#line 296
libs = glob.glob(os.path.join(LIB_DIR, "*.so"))
==>
libs = LIB_DIR.files("*.so")

Clearer?  In d.files(pattern), d is simply the root directory for the
search.  The same is true of all the searching methods: dirs(),
walkfiles(), walkdirs(), etc.

I actually never use path.glob().  For example, here files() is
actually more accurate, and the word "files" is surely clearer than
"glob".  Given files(), dirs(), and listdir(), I have never found a
real use case for glob().

-j


Re: [Python-Dev] The path module PEP

2006-01-25 Thread Jason Orendorff
On 1/24/06, Ian Bicking <[EMAIL PROTECTED]> wrote:
> There's kind of a lot of methods in here, which is a little bothersome.
> It also points towards the motivation for the class -- too many
> options in too many places in the stdlib.  But throwing them *all* in
> one class consolidates but doesn't simplify, especially with duplicate
> functionality.

I agree.

Let me explain why there's so much cruft in path.py.  The design is
heavily skewed toward people already very familiar with the existing
stdlib equivalents, because that is the market for third-party
modules.  I think my users want to type p.methodname() for whatever
method name they already know, and have it *just work*.   A sloppy API
you've already learned is easier to pick up than a clean API you've
never seen before.  Ergo, cruft.

A stdlib Path should have different design goals.  It should have less
redundancy, fewer methods overall, and PEP-8-compliant names.

-j
___
Python-Dev mailing list
[email protected]
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] / as path join operator

2006-01-27 Thread Jason Orendorff
It's controversial that Path subclasses str.  Some people think it's
flat-out wrong.  Even Bjorn argues that it's a practicality-vs-purity
tradeoff.  But a strong argument can be made that Path *should* be a
string subclass, practicality be damned.  Proof follows.

I. Here's an example of the sort of thing you might say if you did
*not* think of paths as strings:

On 1/25/06, Stephen J. Turnbull <[EMAIL PROTECTED]> wrote:
> I think it's logical to expect that
> Path('home') / 'and/or'
> points to a file named "and/or" in directory "home", not to a file
> named "or" in directory "home/and".

This makes no sense whatsoever.  Ergo, by reductio ad absurdum, paths
are strings.

II. And here is the sort of thing you'd say if you thought of paths
*solely* as strings:

> (2) Note that '/' is also the path separator used by URIs, which RFC
> 2396 gives different semantics from Unix.  Most of my Python usage to
> date has been heavily web-oriented, and I'd have little use for /
> unless it follows RFC 2396.

The quandary is resolved by pointing out that URIs are not paths (in
the sense of os.path and generally this whole horrible thread).  Thus
not all strings are paths.

Hence the paths are a proper subset of the strings.  By the existence
of os.path, they have their own commonly-used operations.  By
definition, then, Path is a subclass of string, QED.


Do I really buy all this?  I dunno.  To say "paths aren't strings" is
all very well, and in a very abstract sense I almost agree--but you
have to admit it sort of flies in the face of, you know, reality. 
Filesystem paths are in fact strings on all operating systems I'm
aware of.  And it's no accident or performance optimization.  It's
good design.

-j
___
Python-Dev mailing list
[email protected]
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] / as path join operator

2006-01-30 Thread Jason Orendorff
On 1/28/06, Stephen J. Turnbull <[EMAIL PROTECTED]> wrote:
> Please note that my point was entirely different from trying to decide
> whether to subclass strings.

Noted -- sorry I took you out of context there; that was careless.

> Jason> Filesystem paths are in fact strings on all operating
> Jason> systems I'm aware of.
>
> I have no idea what you could mean by that.  The data structure used
> to represent a filesystem on all OS filesystems I've used is a graph
> of directories and files.  A filesystem object is located by
> traversing a path in that graph.

You seem to think that because I said "operating systems", I'm talking
about kernel algorithms and such.  I'm not.  By "on all operating
systems" I really mean systems, not kernels:  system APIs, standard
tools, documentation, the conventions everyone follows--that sort of
thing.  Userspace.

Thought experiment:  How are filesystem paths used?  Well, programs
pass them into system calls like open() and chmod().  Programs use
them to communicate with other programs.  Users pass them to programs.
 Compare this to how you'd answer the question "How are integers
used?":  I think paths are used more for communication, less for
computation.  Their utility for communication is tightly bound to
their string-nature.

Essentially all APIs involving filesystem paths treat them as strings.
 It's not just that they take string parameters.  The way they're
designed, they encourage users to think of paths as strings, not
graph-paths.  Java's stdlib is the only API that even comes close to
distinguishing paths from strings.  The .NET class library doesn't
bother.  Many many people much smarter than I have thought about
creating non-string-oriented filesystem APIs.  Somehow it hasn't
caught on.

Essentially all users expect to see a filesystem path as a string of
characters in the conventional format.  Display it any other way (say,
as a sequence of edge-names) and you risk baffling them.

My position is (a) the convention that paths are strings really does
exist, embedded in the design and culture of the dominant operating
systems--in fact it's overwhelming, and I'm surprised anyone can miss
it; (b) there might be a reason for it, even aside from momentum.

-j
___
Python-Dev mailing list
[email protected]
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


[Python-Dev] Another case for frozendict

2014-07-13 Thread Jason R. Coombs
I repeatedly run into situations where a frozendict would be useful, and every 
time I do, I go searching and find the (unfortunately rejected) PEP-416. I'd 
just like to share another case where having a frozendict in the stdlib would 
be useful to me.

I was interacting with a database and had a list of results from 206 queries:

>>> res = [db.cases.remove({'_id': doc['_id']}) for doc in fives]
>>> len(res)
206

I can see that the results are the same for the first two queries.

>>> res[0]
{'n': 1, 'err': None, 'ok': 1.0}
>>> res[1]
{'n': 1, 'err': None, 'ok': 1.0}

So I'd like to test to see if that's the case, so I try to construct a 'set' on 
the results, which in theory would give me a list of unique results:

>>> set(res)
Traceback (most recent call last):
  File "", line 1, in 
TypeError: unhashable type: 'dict'

I can't do that because dict is unhashable. That's reasonable, and if I had a 
frozen dict, I could easily work around this limitation and accomplish what I 
need.

>>> set(map(frozendict, res))
Traceback (most recent call last):
  File "", line 1, in 
NameError: name 'frozendict' is not defined

PEP-416 mentions a MappingProxyType, but that's no help.

>>> res_ex = list(map(types.MappingProxyType, res))
>>> set(res_ex)
Traceback (most recent call last):
  File "", line 1, in 
TypeError: unhashable type: 'mappingproxy'

I can achieve what I need by constructing a set on the 'items' of the dict.

>>> set(tuple(doc.items()) for doc in res)
{(('n', 1), ('err', None), ('ok', 1.0))}

But that syntax would be nicer if the result had the same representation as the 
input (mapping instead of tuple of pairs). A frozendict would have readily 
enabled the desirable behavior.
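
For illustration, even a minimal sketch along these lines would have done the 
job (this is my sketch, not PEP-416's implementation):

class frozendict(dict):
    # minimal sketch: a dict that refuses mutation and is hashable
    def __hash__(self):
        return hash(frozenset(self.items()))

    def _readonly(self, *args, **kwargs):
        raise TypeError('frozendict is immutable')

    __setitem__ = __delitem__ = _readonly
    clear = pop = popitem = setdefault = update = _readonly

# with which the de-duplication above becomes simply:
# unique = set(map(frozendict, res))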

Although hashability is mentioned in the PEP under constraints, there are many 
use-cases that fall out of the ability to hash a dict, such as the one 
described above, which are not mentioned at all in use-cases for the PEP.

If there's ever any interest in reviving that PEP, I'm in favor of its 
implementation.
___
Python-Dev mailing list
[email protected]
https://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


[Python-Dev] Windows buildbots may be broken

2021-07-30 Thread Jason R. Coombs
Jeremy informed me that, due to a race condition on a test file and the CPython 
repo configuration for newline conversion, buildbots on Windows may now be 
failing.

If you run such a buildbot, please consider running this command on your repo 
to bypass the issue:

git rm -r :/ ; git checkout HEAD -- :/


You may want to consider adding this command after every update to the repo to 
avoid the stale state.


___
Python-Dev mailing list -- [email protected]
To unsubscribe send an email to [email protected]
https://mail.python.org/mailman3/lists/python-dev.python.org/
Message archived at 
https://mail.python.org/archives/list/[email protected]/message/EZPAFN3BSVDBZC7CRAJEFIQVM6JOSGU5/
Code of Conduct: http://python.org/psf/codeofconduct/


[Python-Dev] Re: Windows buildbots may be broken

2021-08-03 Thread Jason R. Coombs
That command causes a re-initialization of all of the files in the working 
copy. I got the recipe from here: https://stackoverflow.com/a/56457412/70170. 
There are more details in the comment I linked originally and the subsequent 
response.


From: Senthil Kumaran 
Sent: Tuesday, August 3, 2021 09:29
To: Jason R. Coombs 
Cc: [email protected] 
Subject: Re: [Python-Dev] Windows buildbots may be broken

On Fri, Jul 30, 2021 at 02:28:08PM +, Jason R. Coombs wrote:

> If you run such a buildbot, please consider running this command on
> your repo to bypass the issue:
>
> git rm -r :/ ; git checkout HEAD -- :/
>
> You may want to consider adding this command after every update to the
> repo to avoid the stale state.

What does this do? Especially the first command. Is this Windows specific?

--
Senthil



___
Python-Dev mailing list -- [email protected]
To unsubscribe send an email to [email protected]
https://mail.python.org/mailman3/lists/python-dev.python.org/
Message archived at 
https://mail.python.org/archives/list/[email protected]/message/TSQBGQRRQFQFVV57PT5JZBEXTSB4ZLIJ/
Code of Conduct: http://python.org/psf/codeofconduct/


[Python-Dev] Descriptions in unittest and avoiding confusion

2022-04-03 Thread Jason R. Coombs
For the edification of all involved, this post summarizes a somewhat surprising 
behavior in unittest around docstrings.

In bpo-46126, I reported an issue where I’d 
observed that CPython developers were avoiding the use of docstrings in 
unittests due to what was perceived as a masking of crucial information.

On further investigation, it turns out that when the “descriptions” setting of 
the TextTestRunner is set to True (the default), unittest will emit the first 
line of a test’s docstring (if present) in addition to the location of the 
test. As a result, for tests that have docstrings, the output varies depending 
on this setting. Verbose output with docstrings and descriptions enabled:

test_entry_points_unique_packages 
(test.test_importlib.test_metadata_api.APITests)
Entry points should only be exposed for the first package ... ERROR


Without docstrings or with descriptions disabled:

test_entry_points_unique_packages 
(test.test_importlib.test_metadata_api.APITests) ... ERROR


The output with the docstrings is more descriptive, providing more context and 
detail about the intention of the failed test. Because of the additional 
detail, however, unittest has elected to use a newline between the test 
location and the description, so the test result no longer appears on the 
same line as the test location. As a consequence, grepping for the test name 
omits the result, and grepping for ERROR omits the test name.
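
A minimal way to reproduce both behaviors side by side (the test below is made 
up purely for demonstration):

import io
import unittest

class DescriptionDemo(unittest.TestCase):
    def test_entry_points(self):
        """Entry points should only be exposed for the first package."""
        self.fail('demonstration failure')

suite = unittest.defaultTestLoader.loadTestsFromTestCase(DescriptionDemo)

for descriptions in (True, False):
    stream = io.StringIO()
    unittest.TextTestRunner(stream=stream, descriptions=descriptions,
                            verbosity=2).run(suite)
    print('descriptions=%s:' % descriptions)
    # the first lines show whether the result lands on the same line as
    # the test location or on the docstring line below it
    print('\n'.join(stream.getvalue().splitlines()[:2]))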

As part of the investigation, I published “In Python, use docstrings or 
comments?”, exploring the motivations for and value added by allowing 
docstrings in test functions.

It’s still an open consideration whether 
the unittest UX should format descriptions in the output differently to provide 
more consistent output while still allowing docstrings.

Based on this information, CPython will most likely (pending 
GH-32128) continue to use the 
default behavior (descriptions enabled) but will also allow for docstrings in 
its own tests.
___
Python-Dev mailing list -- [email protected]
To unsubscribe send an email to [email protected]
https://mail.python.org/mailman3/lists/python-dev.python.org/
Message archived at 
https://mail.python.org/archives/list/[email protected]/message/DAJSMSVL4S2EQVJD3HCG3KI6AIIXTURM/
Code of Conduct: http://python.org/psf/codeofconduct/


[Python-Dev] Re: [python-committers] [RELEASE] Python 3.8.1rc1 is now available for testing

2019-12-10 Thread Jason R. Coombs
I think I missed the announcement of the cutoff date for 3.8.1; I was hoping to 
get some bug fixes in for 
importlib.metadata.

These aren’t crucial bugfixes, but it would be nice not to have them linger for 
months. Would you consider including these, especially as the code changes are 
pre-vetted in the backport (released 12-01)? Or maybe only if there’s another 
RC for another reason?

If it’s too disruptive, that’s no big deal. Your call.

Thanks for the release work.

On 10 Dec, 2019, at 04:22, Łukasz Langa <[email protected]> wrote:


Python 3.8.1rc1 is the release candidate of the first maintenance release of 
Python 3.8.

The Python 3.8 series is the newest feature release of the Python language, and 
it contains many new features and optimizations. You can find Python 3.8.1rc1 
here:

https://www.python.org/downloads/release/python-381rc1/

Assuming no critical problems are found prior to 2019-12-16, the scheduled 
release date for 3.8.1 as well as Ned Deily's birthday, no code changes are 
planned between this release candidate and the final release.

That being said, please keep in mind that this is a pre-release of 3.8.1 and as 
such its main purpose is testing.

See the “What’s New in Python 
3.8” document for more 
information about features included in the 3.8 series. Detailed information 
about all changes made in 3.8.0 can be found in its change log.

Maintenance releases for the 3.8 series will continue at regular bi-monthly 
intervals, with 3.8.2 planned for February 2020.

We hope you enjoy Python 3.8!

Thanks to all of the many volunteers who help make Python Development and these 
releases possible! Please consider supporting our efforts by volunteering 
yourself or through organization contributions to the Python Software 
Foundation.

https://www.python.org/psf/

___
python-committers mailing list -- 
[email protected]
To unsubscribe send an email to 
[email protected]
https://mail.python.org/mailman3/lists/python-committers.python.org/
Message archived at 
https://mail.python.org/archives/list/[email protected]/message/IGJ6ZOAOT2WFY5ZIPRQNTHOSUMPUAO2H/
Code of Conduct: https://www.python.org/psf/codeofconduct/

___
Python-Dev mailing list -- [email protected]
To unsubscribe send an email to [email protected]
https://mail.python.org/mailman3/lists/python-dev.python.org/
Message archived at 
https://mail.python.org/archives/list/[email protected]/message/YX4DTT26DNYQFUBW5EHPODYMRGT4ZOQX/
Code of Conduct: http://python.org/psf/codeofconduct/


Re: [Python-Dev] Python 2.6.2 and 3.0.2

2009-03-14 Thread Jason R. Coombs
I'm still holding my breath for Python 2.6.2, which fixes a Windows DLL linking 
issue that was already fixed in 3.0.1.  Obviously, the proposed schedule has 
passed, but I would prefer a release sooner than later.

Of course, that's just my preference.

Regards,
Jason

> -BEGIN PGP SIGNED MESSAGE-
> Hash: SHA1
>
> Thinking again about 3.0.2.
>
> If we'd like to do bug fix releases before Pycon, I suggest Monday March
> 9th for code freeze and tagging.  That would mean a Tuesday March 10th
> release.
>
> What do you think?

___
Python-Dev mailing list
[email protected]
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


[Python-Dev] Mercurial, linefeeds, and Visual Studio

2009-06-04 Thread Jason R. Coombs
I just wanted to share my experience with the mercurial checkout.  I cloned 
http://code.python.org/hg/branches/py3k to continue work on 
http://bugs.python.org/issue1578269 but I found that when I click on 
PC/VS8.0/pcbuild.sln, nothing happens.

This appears to be due to a bug/limitation in vslauncher in that it doesn't 
recognize LF as a line separator.  vslauncher is the default association for 
sln files and its purpose is to parse out the .sln file and launch it with the 
appropriate Visual Studio version based on the header.  What makes matters 
worse is that if vslauncher fails to recognize the format, it does nothing, so 
it just appears as if the file fails to launch anything.

It seems that within the hg repository, everything has been converted to LF for 
line endings.  I suspect this is because HG provides no integrated support for 
line-ending conversions and because the hg to svn bridge is probably running on 
a Unix OS.

So converting the pcbuild.sln file to CRLF line endings resolved the problem 
and the file would launch normally.  Also, without conversion, it was possible 
to open the .sln file in Visual Studio explicitly.
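
For anyone who wants to script the workaround, a rough equivalent of the manual 
conversion (back up the file first):

fn = 'PC/VS8.0/pcbuild.sln'
data = open(fn, 'rb').read()
# normalize, then write back with CRLF so vslauncher recognizes it
data = data.replace(b'\r\n', b'\n').replace(b'\n', b'\r\n')
open(fn, 'wb').write(data)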

I wanted to share this with the community in case anyone else runs into this 
issue.  Also, if there's a recommended procedure for addressing this issue (and 
others that might arise due to non-native line endings), I'd be interested to 
hear it.

Regards,
Jason
___
Python-Dev mailing list
[email protected]
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


[Python-Dev] functools.compose to chain functions together

2009-08-14 Thread Jason R. Coombs
I'd like to express additional interest in python patch 1660179, discussed
here:

 

http://mail.python.org/pipermail/patches/2007-February/021687.html

 

On several occasions, I've had the desire for something like this.  I've
made do with lambda functions, but as was mentioned, the lambda is clumsy
and harder to read than functools.compose would be.

 

A potentially common use-case is when a library has a multi-decorator use
case in which they want to compose a meta decorator out of one or more
individual decorators.

 

Consider the hypothetical library.

 

# we have three decorators we use commonly

def dec_register_function_for_x(func):
    # do something with func
    return func

def dec_alter_docstring(func):
    # do something to func.__doc__
    return func

def dec_inject_some_data(data):
    def dec_inject_data(func):
        func.data = data  # this may not be legal,
                          # but assume it does something useful
        return func
    return dec_inject_data

# we could use these decorators explicitly throughout our project

@dec_register_function_for_x
@dec_alter_docstring
@dec_inject_some_data('foo data 1')
def our_func_1(params):
    pass

@dec_register_function_for_x
@dec_alter_docstring
@dec_inject_some_data('foo data 2')
def our_func_2(params):
    pass

 

For two functions, that's not too onerous, but if it's used throughout the
application, it would be nice to abstract the collection of decorators.  One
could do this with lambdas.

 

def meta_decorator(data):
    return lambda func: dec_register_function_for_x(
        dec_alter_docstring(dec_inject_some_data(data)(func)))

 

But to me, a compose function is much easier to read and much more
consistent with the decorator usage syntax itself.

 

def meta_decorator(data):
    return compose(dec_register_function_for_x, dec_alter_docstring,
                   dec_inject_some_data(data))

 

The latter implementation seems much more readable and elegant.  One doesn't
even need to know the decorator signature to effectively compose
meta_decorators.
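
For concreteness, the kind of compose I have in mind is tiny (my own sketch; 
the patch's implementation may differ):

def compose(*funcs):
    # compose(f, g, h)(x) == f(g(h(x)))
    def composed(arg):
        for f in reversed(funcs):
            arg = f(arg)
        return arg
    return composed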

 

I've heard it said that Python is not a functional language, but if that
were really the case, then functools would not exist. In addition to the
example described above, I've had multiple occasions where having a general
purpose function composition function would have simplified the
implementation by providing a basic functional construct. While Python isn't
primarily a functional language, it does have some functional constructs,
and this is one of the features that makes Python so versatile; one can
program functionally, procedurally, or in an object-oriented way, all within
the same language.

 

I admit, I may be a bit biased; my first formal programming course was
taught in Scheme.

 

Nevertheless, I believe functools is the ideal location for a very basic and
general capability such as composition.

 

I realize this patch was rejected, but I'd like to propose reviving the
patch and incorporating it into functools.

 

Regards,

Jason



___
Python-Dev mailing list
[email protected]
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] functools.compose to chain functions together

2009-08-16 Thread Jason R. Coombs
Steven D'Aprano wrote:
> Sent: Sunday, 16 August, 2009 08:15
>
> On Sat, 15 Aug 2009 04:39:03 am Jason R. Coombs wrote:
>
> >
> > def meta_decorator(data):
> > return compose(dec_register_function_for_x, dec_alter_docstring,
> > dec_inject_some_data(data))
>
> Surely that's better written as:
>
> meta_decorator = compose(dec_register_function_for_x,
> dec_alter_docstring, dec_inject_some_data)

I agree. The former looks unnecessarily complicated.

I purposely chose a non-trivial use case, one which involves a decorator that 
requires a parameter and thus must be called first before the actual decorator 
is returned.  I think for this reason, the former syntax must be used so that 
the meta_decorator also takes the data parameter and constructs the proper 
"inject" decorator.  Put another way, both dec_inject_some_data and 
meta_decorator are more like decorator factories.

I suspect a simpler, and more common use-case would be like the one you 
described, where either data is global or the "inject" decorator is not used:

meta_decorator = compose(dec_register_function_for_x, dec_alter_docstring)

>
> Mine wasn't -- I've never even used Scheme, or Lisp, or any other
> functional language. But I've come to appreciate Python's functional
> tools, and would like to give a +0.5 to compose(). +1 if anyone can
> come up with additional use-cases.

Thanks for the interest.  I decided to search through some of my active code 
for lambdas and see if there are areas where I would prefer to be using a 
compose function instead of an explicit lambda/reduce combination.

I only found one such application; I attribute this limited finding to the fact 
that I probably elected for a procedural implementation when the functional 
implementation might have proven difficult to read, esp. with lambda.

1) Multiple string substitutions.  You have a list of functions that operate on 
a string, but you want to collect them into a single operator that can be 
applied to a list of strings.

sub_year = lambda s: s.replace("%Y", "2009")

fix_strings_with_substituted_year = compose(
    str.strip, textwrap.dedent, sub_year)
map(fix_strings_with_substituted_year, target_strings)

Moreover, it would be great to be able to accept any number of substitutions.

substitutions = [sub_year, sub_month, ...]
fix_strings_with_substitutions = compose(
    str.strip, textwrap.dedent, *substitutions)



I did conceive of another possibly interesting use case: vector translation.

Consider an application that performs mathematical translations on 
n-dimensional vectors.  While it would be optimal to use optimized matrix 
operations to perform these translations, for the sake of this example, all we 
have are basic Python programming constructs.

At run-time, the user can compose an experiment to be conducted on his series 
of vectors. To do this, he selects from a list of provided translations and can 
provide his own.  These translations can be tagged as named translations and 
thereafter used as translations themselves.  The code might look something like:

translations = selected_translations + custom_translations
meta_translation = compose(*translations)
save_translation(meta_translation, "My New Translation")

def run_experiment(translation, vectors):
    result = map(translation, vectors)
    # do something with result

Then, run_experiment can take a single translation or a meta-translation such 
as the one created above. This use-case highlights that a composed function 
must take and return exactly one value, but that the value need not be a 
primitive scalar.



I'm certain there are other, more obscure examples, but I feel these two 
demonstrate some fairly common potential use cases for something like a 
composition function.

Jason
___
Python-Dev mailing list
[email protected]
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] functools.compose to chain functions together

2009-08-16 Thread Jason R. Coombs


> Raymond Hettinger wrote:
> Sent: Sunday, 16 August, 2009 12:42
>
> [Antoine Pitrou]
> > I also think it would be a nice addition.
> > (but someone has to propose a patch :-))

The patch was proposed and rejected here: http://bugs.python.org/issue1660179; 
my reason for mentioning it here is because the functionality isn't YAGNI for 
me; It seems like a fundamental capability when employing a functional 
programming paradigm.


> I agree with Martin's reasons for rejecting the feature request
> (see the bug report for his full explanation).  IIRC, the compose()
> idea had come-up and been rejected in previous discussions as well.
>
> At best, it will be a little syntactic sugar (though somewhat odd
> because
> the traditional mathematical ordering of a composition operator is the
> opposite of what intuition would suggest).  At worst, it will be slower
> and less flexible than our normal ways of linking functions together.
>
> IMO, its only virtue is that people coming from functional languages
> are used to having compose.  Otherwise, it's a YAGNI.

Right.  I have great respect for your and Martin's original conclusion.

The reason I came across the old patch was because I was searching for 
something that did exactly what compose does. That is to say, I had a use case 
that was compelling enough that I thought there should be something in 
functools to do what I wanted.  I've encountered this pattern often enough that 
I expected it to be in the stdlib.

As it turns out, it isn't.  For this reason, I wanted to voice my opinion that 
contradicts the conclusion of the previous patch discussion.  Specifically, 
YAGNI doesn't apply to my experiences, and it does seem to have broad, 
fundamental application, especially with respect to functional programming.

I'm not arguing that just because Jason needs it, it should be in the standard 
library.  Rather, I just wanted to express that, like Chris AtLee, I would find 
this function quite useful.

As Steven pointed out, this functionality is desirable even for those without a 
functional programming background.  I'd like to mention also that even though I 
learned to program in Scheme in 1994, I haven't used it since, and I've been 
using Python since 1996, so my affinity for this function is based almost 
entirely from experiences programming in Python and not in a primarily 
functional language.

If the Python community still concurs that 'compose' is YAGNI or otherwise 
undesirable, I understand.  I just wanted to share my experiences and 
motivations as they pertain to the discussion.  If it turns out that it's 
included in the stdlib later, all the better.

Respectfully,
Jason
___
Python-Dev mailing list
[email protected]
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] new LRU cache API in Py3.2

2010-11-17 Thread Jason R. Coombs
I see now that my previous reply went only to Stefan, so I'm re-submitting,
this time to the list.

> -Original Message-
> From: Stefan Behnel
> Sent: Saturday, 04 September, 2010 04:29
>
> What about adding an intermediate namespace called "cache", so that 
> the new operations are available like this:
> 
>  print get_phone_number.cache.hits
>  get_phone_number.cache.clear()

I agree. While the function-based implementation is highly efficient, the
pure use of functions has the counter-Pythonic effect of obfuscating the
internal state (the same way the 'private' keyword does in Java). A
class-based implementation would be capable of having its state introspected
and could easily be extended. While the functional implementation is a
powerful construct, it fails to generalize well. IMHO, a stdlib
implementation should err on the side of transparency and extensibility over
performance.

That said, I've adapted Hettinger's Python 2.5 implementation to a
class-based implementation. I've tried to keep the performance optimizations
in place, but instead of instrumenting the wrapped method with lots of
cache_* functions, I simply attach the cache object itself, which then
provides the interface suggested by Stefan. This technique allows access to
the cache object and all of its internal state, so it's also possible to do
things like:

get_phone_number.cache.maxsize += 100

or

if get_phone_number.cache.store:
    do_something_interesting()

These techniques are nearly impossible in the functional implementation, as
the state is buried in the locals() of the nested functions.

I'm most grateful to Raymond for contributing this to Python; On many
occasions, I've used the ActiveState recipes for simple caches, but in
almost every case, I've had to adapt the implementation to provide more
transparency. I'd prefer to not have to do the same with the stdlib.

Regards,
Jason R. Coombs

# modified from
# http://code.activestate.com/recipes/498245-lru-and-lfu-cache-decorators/

import collections
import functools
from itertools import ifilterfalse
from heapq import nsmallest
from operator import itemgetter

class Counter(dict):
    'Mapping where default values are zero'
    def __missing__(self, key):
        return 0

class LRUCache(object):
    '''
    Least-recently-used cache decorator.

    Arguments to the cached function must be hashable.
    Cache performance statistics stored in .hits and .misses.
    Clear the cache with .clear().
    Cache object can be accessed as f.cache, where f is the decorated
    function.
    http://en.wikipedia.org/wiki/Cache_algorithms#Least_Recently_Used

    '''

    def __init__(self, maxsize=100):
        self.maxsize = maxsize
        self.maxqueue = maxsize*10
        self.store = dict()               # mapping of args to results
        self.queue = collections.deque()  # order that keys have been used
        self.refcount = Counter()         # times each key is in the queue

        self.hits = self.misses = 0

    def decorate(self, user_function,
            len=len, iter=iter, tuple=tuple, sorted=sorted,
            KeyError=KeyError):

        sentinel = object()     # marker for looping around the queue
        kwd_mark = object()     # separate positional and keyword args

        # lookup optimizations (ugly but fast)
        store = self.store
        queue = self.queue
        refcount = self.refcount
        queue_append, queue_popleft = queue.append, queue.popleft
        queue_appendleft, queue_pop = queue.appendleft, queue.pop

        @functools.wraps(user_function)
        def wrapper(*args, **kwds):
            # cache key records both positional and keyword args
            key = args
            if kwds:
                key += (kwd_mark,) + tuple(sorted(kwds.items()))

            # record recent use of this key
            queue_append(key)
            refcount[key] += 1

            # get cache entry or compute if not found
            try:
                result = store[key]
                self.hits += 1
            except KeyError:
                result = user_function(*args, **kwds)
                store[key] = result
                self.misses += 1

            # purge least recently used cache entry
            if len(store) > self.maxsize:
                key 

Re: [Python-Dev] [Python-checkins] cpython (2.7): PDB now will properly escape backslashes in the names of modules it executes.

2011-12-06 Thread Jason R. Coombs
Éric, These are all good suggestions. I'll make them at some point.

Thanks.

> -Original Message-
> From: [email protected] [mailto:python-
> [email protected]] On Behalf Of Éric Araujo
> Sent: Friday, 18 November, 2011 10:10
> To: [email protected]
> Subject: Re: [Python-Dev] [Python-checkins] cpython (2.7): PDB now will
> properly escape backslashes in the names of modules it executes.
> 
> Hi Jason,
> 
> > http://hg.python.org/cpython/rev/f7dd5178f36a
> > branch:  2.7
> > user:Jason R. Coombs 
> > date:Thu Nov 17 18:03:24 2011 -0500
> > summary:
> >   PDB now will properly escape backslashes in the names of modules it
> > executes. Fixes #7750
> 
> > diff --git a/Lib/test/test_pdb.py b/Lib/test/test_pdb.py
> > +class Tester7750(unittest.TestCase):
> I think we have an unwritten rule that test class and method names should
> tell something about what they test.  (We do have things like TestWeirdBugs
> and test_12345, but I don’t think it’s a useful pattern to follow :)  Not a 
> big
> deal anyway.
> 
> > +# if the filename has something that resolves to a python
> > +#  escape character (such as \t), it will fail
> > +test_fn = '.\\test7750.py'
> > +
> > +msg = "issue7750 only applies when os.sep is a backslash"
> > [email protected](os.path.sep == '\\', msg)
> > +def test_issue7750(self):
> > +with open(self.test_fn, 'w') as f:
> > +f.write('print("hello world")')
> > +cmd = [sys.executable, '-m', 'pdb', self.test_fn,]
> > +proc = subprocess.Popen(cmd,
> > +stdout=subprocess.PIPE,
> > +stdin=subprocess.PIPE,
> > +stderr=subprocess.STDOUT,
> > +)
> > +stdout, stderr = proc.communicate('quit\n')
> > +self.assertNotIn('IOError', stdout, "pdb munged the
> > + filename")
> Why not check for assertIn(filename, stdout)?  (In other words, check for
> intended behavior rather than implementation of the erstwhile bug.)
> 
> BTW, I’ve just tested that giving a message argument to assertNotIn (the
> third argument), unittest still displays the other arguments to allow for 
> easier
> debugging.  I didn’t know that, it’s cool!
> 
> > +def tearDown(self):
> > +if os.path.isfile(self.test_fn):
> > +os.remove(self.test_fn)
> In my own tests, I’ve become fond of using “self.addCleanup(os.remove,
> filename)”: It’s shorter that a tearDown and is right there on the line that
> follows or precedes the file creation.
> 
> >  if __name__ == '__main__':
> >  test_main()
> > +unittest.main()
> This looks strange.
> 
> Regards
> ___
> Python-Dev mailing list
> [email protected]
> http://mail.python.org/mailman/listinfo/python-dev
> Unsubscribe: http://mail.python.org/mailman/options/python-
> dev/jaraco%40jaraco.com


___
Python-Dev mailing list
[email protected]
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


[Python-Dev] Script(s) for building Python on Windows

2012-01-16 Thread Jason R. Coombs
The current scripts for building Python lack some things to be desired.

 

The first thing I notice when I try to build Python on Windows is the
scripts expect to be run inside of a Visual Studio environment, the
environment of which is only defined inside of a cmd.exe context. This means
the scripts can't be executed from within Powershell (my preferred shell on
Windows). One must first shell out to cmd.exe, which disables any
Powershell-specific features the developer might have installed (aliases,
functions, etc).

 

The second thing I notice is the scripts assume Visual Studio 2008. And
while I recognize that Python is specifically built against Visual Studio
2008 for the official releases and that Visual Studio 2008 may be the only
officially-supported build environment, later releases, such as Visual
Studio 2010 are also adequate for testing purposes. I've been developing
Python against Visual Studio 2010 for quite a while and it seems to be more
than adequate. And while it's not the responsibility of the scripts to
accommodate such environments, if the scripts could allow for such
environments, that would be nice. Furthermore, having scripts that codify
the process to upgrade will facilitate the migration should someone make the
decision to officially upgrade to Visual Studio 2010.

 

The third thing that I notice is that the command-line argument handling by
the batch scripts is clumsy (compared to argparse, for example). This
clumsiness is not a criticism of the authors, who have done well with the
tools they had. However, batch programming is probably one of the least
powerful ways to automate builds these days.

 

So to ease my experience, I've developed my own library of functions and
commands to facilitate building Python that aren't subject to the above
limitations. Of course, I built these in Python, so they do require Python
to build Python (not a huge burden, but worth mentioning). All of these
modules are open-source and part of the jaraco.develop package
<http://pypi.python.org/pypi/jaraco.develop> .

 

The first of these modules is jaraco.develop.vstudio
<https://bitbucket.org/jaraco/jaraco.develop/src/b7263c9d9c93/jaraco/develop
/vstudio.py> . It exposes a class for locating Visual Studio in the usual
locations, loading the environment for that instance of Visual Studio, and
upgrading a project or solution file to that version. This class in
particular enables running Visual Studio commands (including msbuild) from
within a Visual Studio environment without actually requiring a cmd.exe
context with that environment.

 

Another module is jaraco.develop.python
<https://bitbucket.org/jaraco/jaraco.develop/src/b7263c9d9c93/jaraco/develop
/python.py> , which includes build_python, a function (and command) to build
Python using whatever version of Visual Studio can be found (9 or 10
required). It has no environmental requirements except that Visual Studio be
installed. Simply run build-python (part of jaraco.develop's console
scripts) and it will build PCbuild.sln from the current directory to
whatever targets are specified (or all of them if none are specified). The
builder currently makes some assumptions (such as always building the 64-bit
Release targets), but those could easily be customized using argparse
parameters.

 

This package and these modules have been tested and run on Python 2.7+.
These tools solve the three shortcomings I mentioned above and make the
development process so much smoother, IMO. If these modules were built into
the repository, building Python could be as simple as "hg clone; cd
cpython/pcbuild; ./build.py" (assuming only Visual Studio and Python
available).

 

I'd like to propose migrating this functionality (mainly these two modules)
into the CPython heads for Python 2.7, 3.1, 3.2, and default as
PCbuild/build.py (or similar). This functionality doesn't necessarily need
to supersede the existing scripts (env, build_env, build), though it
certainly could (and would as far as my usage is concerned).

 

If there are no objections, I'll work to extract the aforementioned
functionality from the jaraco.develop modules and into a portable script and
put together a proof-of-concept in the default branch. The build script
should not interfere with any build bots or other existing build processes,
but should enable another more powerful technique for producing builds.

 

I look forward to your comments and feedback.

 

Regards,

Jason



___
Python-Dev mailing list
[email protected]
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] Script(s) for building Python on Windows

2012-01-16 Thread Jason R. Coombs
> From: Brian Curtin [mailto:[email protected]]
> Sent: Monday, 16 January, 2012 15:20
>
> 2010 is adequate for limited use but the test suite doesn't pass, so I
would be
> hesitant to add support and/or documentation for building with it until we
> actually support it the same as or in place of 2008.

Good point. The current tools don't automatically support 2010; an extra
command is required to perform the conversion. I'll be cautious and not
expose that functionality without some indication to the user of the
limitations.



___
Python-Dev mailing list
[email protected]
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] Script(s) for building Python on Windows

2012-01-16 Thread Jason R. Coombs
> From: "Martin v. Löwis" [mailto:[email protected]]
> Sent: Monday, 16 January, 2012 16:25
>
> I'd be hesitant to put too many specialized tools into the tree that will
> become unmaintained. Please take a look at the vs9to8 tool in PCbuild; if
you
> could adjust that to support VS 10, it would be better IMO.

Are you suggesting creating vs10to9, which would be congruent to vs9to8, or
vs9to10?

I'm unsure if the conversion from 9 to 10 or 10 to 9 can be as simple as the
vs9to8 suggests. When I run the upgrade using the Visual Studio tools, it
does upgrade the .sln file [as so]( http://a.libpa.st/kB19G). But as you can
see, it also converts all of the .vcproj to .vcxproj, which appears to be a
very different schema. According to
[this article](http://social.msdn.microsoft.com/Forums/en/vsprereleaseannouncements/thread/4345a151-d288-48d6-b7c7-a7c598d0f85e)
it should be trivial to downgrade by
only updating the .sln file (perhaps Visual Studio 2008 is forward
compatible with the .vcxproj format).

I'll look into this more when I have a better idea what you had in mind.

My goal in adding the upgrade code was to provide a one-step upgrade for
developers with only VS 10 installed. That's what vs-upgrade in
jaraco.develop does.

> As for completely automating the build: please take notice of
> Tools/buildbot/build.bat. It also fully automates the build, also doesn't
> require that the VS environment is already activated, and has the
additional
> advantage of not requiring Python to be installed.

That's interesting, but it still suffers from several shortcomings:

1) It still assumes Visual Studio 2008 and fails with an obscure error
otherwise.
2) You can't use it to build different targets (only the whole solution).
3) It automatically downloads the external dependencies (it'd be nice to
build without them on occasion).
4) It's still a batch file, so still gives the abominable "Terminate batch
job (Y/N)?" when cancelling any operation via Ctrl+C.
5) This functionality isn't in PCBuild/*. Why not?
6) There's no good way to select which type to build (64-bit versus 32-bit,
release versus debug). Adding these command-line options is clumsy in batch
files.
7) Since it's written in batch script, Python programmers might be hesitant
to work with it (improve it).

For a buildbot, the batch file is perfectly adequate. It should do the same
thing every time reliably.

For anyone but a robot or seasoned CPython Windows developer, however, the
build tools are not intuitive, and I find that I'm constantly tweaking the
batch scripts and asking myself, "why couldn't this be in Python, which is a
much more powerful language?" This is why I developed the scripts, and my
thought is they could be useful to others as well.

My hope is they might even supersede the existing scripts and become
canonical, in which case there would be no possibility of them becoming
unmaintained. If it turns out that they do become unused and unmaintained,
they can be removed, but my feeling is since they're concise, documented,
Python scripts, they'd be more likely to be maintained than their '.bat'
counterparts.



___
Python-Dev mailing list
[email protected]
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] Script(s) for building Python on Windows

2012-01-16 Thread Jason R. Coombs
> From: [email protected] [mailto:python-
> [email protected]] On Behalf Of Jason R. Coombs
> Sent: Monday, 16 January, 2012 19:01
>
> I'm unsure if the conversion from 9 to 10 or 10 to 9 can be as simple as
the
> vs9to8 suggests. When I run the upgrade using the Visual Studio tools, it
does
> upgrade the .sln file [as so]( http://a.libpa.st/kB19G). But as you can
see, it also
> converts all of the .vcproj to .vcxproj, which appears to be a very
different
> schema. According to [this article](
> http://social.msdn.microsoft.com/Forums/en/vsprereleaseannouncements/thre
> ad/
> 4345a151-d288-48d6-b7c7-a7c598d0f85e) it should be trivial to downgrade by
> only updating the .sln file (perhaps Visual Studio 2008 is forward
compatible
> with the .vcxproj format).

I upgraded the solution file using Visual Studio, then followed those
instructions suggested by the article, but the solution no longer builds
under Visual Studio 2008, so apparently that answer is incorrect.

Perhaps it's possible to upgrade the .sln in a less aggressive way than the
Visual Studio tools do by default, but my initial experience suggests it
won't be as easy to upgrade/downgrade the solution file as it was between
VS8/VS9.


___
Python-Dev mailing list
[email protected]
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


[Python-Dev] Virtualenv not portable from Python 2.7.2 to 2.7.3 (os.urandom missing)

2012-03-28 Thread Jason R. Coombs
I see this was reported as a debian bug.
http://bugs.debian.org/cgi-bin/bugreport.cgi?bug=665776

 

We encountered it as well.

 

To reproduce, using virtualenv 1.7+ on Python 2.7.2 on Ubuntu, create a
virtualenv. Move that virtualenv to a host with Python 2.7.3RC2 yields:

 

jaraco@vdm-dev:~$ /usr/bin/python2.7 -V

Python 2.7.3rc2

jaraco@vdm-dev:~$ env/bin/python -V

Python 2.7.2

jaraco@vdm-dev:~$ env/bin/python -c "import os; os.urandom()"

Traceback (most recent call last):

  File "", line 1, in 

AttributeError: 'module' object has no attribute 'urandom'

 

This bug causes Django to not start properly (under some circumstances).

 

I reviewed the changes between v2.7.2 and 2.7 (tip) and it seems there was
substantial refactoring of the os and posix modules for urandom.

 

I still don't fully understand why the urandom method is missing (because
the env includes the python 2.7.2 executable and stdlib).
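
For what it's worth, a quick way to see which pieces the env actually picks up 
(my own debugging snippet, nothing authoritative):

# run with env/bin/python
import os
import sys

print(sys.version)             # the interpreter copied into the env
print(sys.prefix)              # where it looks for its stdlib
print(os.__file__)             # which os.py actually got imported
print(hasattr(os, 'urandom'))  # the missing attribute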

 

I suspect this change is going to cause some significant backward
compatibility issues. Is there a recommended workaround? Should I file a
bug?

 

Regards,

Jason



___
Python-Dev mailing list
[email protected]
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] Virtualenv not portable from Python 2.7.2 to 2.7.3 (os.urandom missing)

2012-03-28 Thread Jason R. Coombs
> -Original Message-
> From: [email protected] [mailto:python-
> [email protected]] On Behalf Of Carl Meyer
> Sent: Wednesday, 28 March, 2012 14:48
> 
> The workaround is easy: just re-run virtualenv on that path with the new
> interpreter.
> 

Thanks for the quick response Carl. I appreciate all the work that's been
done.

I'm not sure the workaround is as simple as you say. Virtualenv doesn't
replace the 'python' exe if it already exists (because it may already exist
for a different minor version of Python (3.2, 2.6)). So the procedure is
probably something like this:

For each  version of Python the virtualenv wraps (ls
env/bin/python?.?):
1) Run env/bin/python -V. If the result starts with "Python ", remove
env/bin/python.
2) Determine if that Python version uses distribute or setuptools.
3) Run virtualenv --python=python env (with --distribute if
appropriate)

I haven't yet tested this procedure, but I believe it's closer to what will
need to be done. There are probably other factors. Unfortunately, to
reliably repair the virtualenv is very difficult, so we will probably opt
with re-deploying all of our virtualenvs.

Will the release notes include something about this change, since it will
likely have broad backward incompatibility for all existing virtualenvs? I
wouldn't expect someone in operations to read the virtualenv news to find
out what things a Python upgrade will break. Indeed, this update will
probably be pushed out as part of standard, unattended system updates.

I realize that the relationship between stdlib.os and posixmodule isn't a
guaranteed interface, and the fact that it breaks with virtualenv is a
weakness of virtualenv. Nevertheless, virtualenv has become the defacto
technique for Python environments. Putting my sysops cap on, I might
perceive this change as being unannounced (w.r.t. Python) and having
significant impact on operations. I would think this impact deserves at
least a note in the release notes.

Regards,
Jason


___
Python-Dev mailing list
[email protected]
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] Virtualenv not portable from Python 2.7.2 to 2.7.3 (os.urandom missing)

2012-03-29 Thread Jason R. Coombs
Carl,

I've drafted some notes: http://piratepad.net/PAZ3CEq9CZ

Please feel free to edit them. If you want to chat, I can often be reached
on freenode as 'jaraco' or XMPP at my e-mail address if you want to sprint
on this in real-time.

Does the issue only exist for Python 2.6 and 2.7?

I'm not familiar with the release process. What's the next step?


> -Original Message-
> From: R. David Murray [mailto:[email protected]]
> Sent: Wednesday, 28 March, 2012 17:46
> 
> I think it is reasonable to put something in the release notes.  This
change is
> much larger than changes we normally make in maintenance release,
> because it fixes a security bug.  But because it is larger than normal,
adding
> release notes like this about known breakage is, I think, a good idea.
> 
> Perhaps you and Carl could collaborate on a page explaining the issue in
> detail, and on a brief note to include in the release notes that points to
your
> more extensive discussion?



___
Python-Dev mailing list
[email protected]
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com

