Re: [Python-Dev] GeneratorExit inheriting from Exception

2006-03-25 Thread Nick Coghlan
Nick Coghlan wrote:
> Should GeneratorExit inherit from Exception or BaseException?
> 
> Currently, a generator that catches Exception and continues on to yield 
> another value can't be closed properly (you get a runtime error pointing out 
> that the generator ignored GeneratorExit).
> 
> The only decent reference I could find to it in the old PEP 348/352 
> discussions is Guido writing [1]:
> 
>> when GeneratorExit or StopIteration
>> reach the outer level of an app, it's a bug like all the others that
>> bare 'except:' WANTS to catch.
> 
> (at that point in the conversation, I believe bare except was considered the 
> equivalent of "except Exception:")
> 
> While I agree with what Guido says about GeneratorExit being a bug if it 
> reaches the outer level of an app, it seems like a bit of a trap that a 
> correctly written generator can't write "except Exception:" without preceding 
> it with an "except GeneratorExit:" that reraises the exception. Isn't that 
> exactly the idiom we're trying to get rid of for SystemExit and 
> KeyboardInterrupt?

The last comment I heard from Guido on this topic was that he was still 
thinking about it.

However, I now have an additional data point - if GeneratorExit inherits 
directly from BaseException, it makes it much easier to write exception 
handling code in generators that does the right thing on both Python 2.4 and 
2.5.

In 2.4, PEP 342 hasn't happened, so "except Exception:" can't misbehave in 
response to GeneratorExit (the latter doesn't exist, and nor does generator 
finalisation). If GeneratorExit inherits directly from BaseException, the code 
still does the right thing since the exception isn't caught.

OTOH, if GeneratorExit inherits from Exception (as in current SVN), then two 
things will be needed to make the generator work correctly:

1. add a preceding exception clause to fix Python 2.5 behaviour:

   except GeneratorExit:
       raise
   except Exception:
       # whatever

2. add header code to the module to make it work again on Python 2.4:

   try:
       GeneratorExit
   except NameError:
       class GeneratorExit(Exception): pass
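
Putting the two together, a rough sketch of what such a dual-version
generator would have to look like (get_next_value() is just a hypothetical
stand-in for whatever the generator actually does):

   # Sketch only - assumes GeneratorExit inherits from Exception, as in
   # current SVN, and that the module needs to run on both 2.4 and 2.5.
   try:
       GeneratorExit
   except NameError:                        # Python 2.4: name doesn't exist
       class GeneratorExit(Exception): pass

   def gen():
       while True:
           try:
               yield get_next_value()       # hypothetical helper
           except GeneratorExit:            # reraise so close() works on 2.5
               raise
           except Exception:
               pass                         # whatever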

IMO, that would be an ugly bit of backwards incompatibility (even though I 
wouldn't expect such broad exception handling in generators to be at all 
common).

Cheers,
Nick.

-- 
Nick Coghlan   |   [EMAIL PROTECTED]   |   Brisbane, Australia
---
 http://www.boredomandlaziness.org


Re: [Python-Dev] GeneratorExit inheriting from Exception

2006-03-25 Thread Guido van Rossum
On 3/25/06, Nick Coghlan <[EMAIL PROTECTED]> wrote:
> The last comment I heard from Guido on this topic was that he was still
> thinking about it.

Not exactly. I'm delegating the thinking mostly to others.

> However, I now have an additional data point - if GeneratorExit inherits
> directly from BaseException, it makes it much easier to write exception
> handling code in generators that does the right thing on both Python 2.4 and 
> 2.5.
>
> In 2.4, PEP 342 hasn't happened, so "except Exception:" can't misbehave in
> response to GeneratorExit (the latter doesn't exist, and nor does generator
> finalisation). If GeneratorExit inherits directly from BaseException, the code
> still does the right thing since the exception isn't caught.
>
> OTOH, if GeneratorExit inherits from Exception (as in current SVN), then two
> things will be needed to make the generator work correctly:
>
> 1. add a preceding exception clause to fix Python 2.5 behaviour:
>    except GeneratorExit:
>        raise
>    except Exception:
>        # whatever
>
> 2. add header code to the module to make it work again on Python 2.4:
>
>    try:
>        GeneratorExit
>    except NameError:
>        class GeneratorExit(Exception): pass
>
> IMO, that would be an ugly bit of backwards incompatibility (even though I
> wouldn't expect such broad exception handling in generators to be at all 
> common).

I can't see all that much use for GeneratorExit in code that needs to
be compatible with 2.4, since the rest of the machinery that makes
exception handling around yield feasible doesn't exist.

Rather than speaking of "data points" which are really just "ideas",
try to come up with a data point that represents an actual (not
made-up) use case to show the difference.

I'm saying this because, while I believe there *may* be something
here, I also believe that the decision to derive an exception from
BaseException instead of Exception should not be taken lightly -- lest
we set the wrong example and render the nice feature we're trying to
create (that "except Exception" does the right thing almost all of the
time) useless.

--
--Guido van Rossum (home page: http://www.python.org/~guido/)


Re: [Python-Dev] GeneratorExit inheriting from Exception

2006-03-25 Thread Nick Coghlan
Guido van Rossum wrote:
> On 3/25/06, Nick Coghlan <[EMAIL PROTECTED]> wrote:
>> OTOH, if GeneratorExit inherits from Exception (as in current SVN), then two
>> things will be needed to make the generator work correctly:
>>
>> 1. add a preceding exception clause to fix Python 2.5 behaviour:
>>    except GeneratorExit:
>>        raise
>>    except Exception:
>>        # whatever
>>
>> 2. add header code to the module to make it work again on Python 2.4:
>>
>>    try:
>>        GeneratorExit
>>    except NameError:
>>        class GeneratorExit(Exception): pass
>>
>> IMO, that would be an ugly bit of backwards incompatibility (even though I
>> wouldn't expect such broad exception handling in generators to be at all 
>> common).
> 
> I can't see all that much use for GeneratorExit in code that needs to
> be compatible with 2.4, since the rest of the machinery that makes
> exception handling around yield feasible doesn't exist.

I agree entirely - my goal is to make sure it stays that way.

The kind of code I'm talking about would be an *existing* Python 2.4 generator 
that happens to do something like:

   def gen(tasks):
       """yield the results of a bunch of task functions"""
       for task in tasks:
           try:
               yield (task, task())
           except Exception, ex:
               yield ExceptionOccurred(task, ex)


If you run such a generator on Python 2.5, but don't run it to completion 
before it is garbage collected, you will get an error message printed on 
stderr saying that an exception was ignored when this generator was cleaned 
up. If you use the new PEP 342 features to try to explicitly close it before 
it is garbage collected, you'll get the exception directly.

The culprit is the RuntimeError raised when the generator's close() method 
gets upset because the generator swallowed GeneratorExit.
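
To make that concrete, a rough sketch of the failure on 2.5 (with
GeneratorExit inheriting from Exception, as in current SVN; ExceptionOccurred
is just a placeholder wrapper class):

   class ExceptionOccurred(object):
       def __init__(self, task, ex):
           self.task, self.ex = task, ex

   def gen(tasks):
       """yield the results of a bunch of task functions"""
       for task in tasks:
           try:
               yield (task, task())
           except Exception, ex:
               yield ExceptionOccurred(task, ex)

   g = gen([lambda: 42])
   print g.next()    # (<function <lambda> ...>, 42)
   g.close()         # GeneratorExit is caught by the broad handler, the
                     # generator yields again, and close() complains with
                     # RuntimeError: generator ignored GeneratorExit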

If GeneratorExit inherits directly from BaseException, such unexpected 
behaviour won't happen - the only way for an existing generator to break is if 
it contained a bare except clause, and that code was *already* dubious (e.g. 
it probably swallowed KeyboardInterrupt).

I don't have any actual live examples of a generator with a broad exception 
clause like the one above, but toy generators like the one above are legal in 
2.4 and result in spurious errors with current SVN.

Regards,
Nick.

-- 
Nick Coghlan   |   [EMAIL PROTECTED]   |   Brisbane, Australia
---
 http://www.boredomandlaziness.org


Re: [Python-Dev] GeneratorExit inheriting from Exception

2006-03-25 Thread Raymond Hettinger
>> I can't see all that much use for GeneratorExit in code that needs to
>> be compatible with 2.4, since the rest of the machinery that makes
>> exception handling around yield feasible doesn't exist.
>
> I agree entirely - my goal is to make sure it stays that way.
>
> The kind of code I'm talking about would be an *existing* Python 2.4 generator
> that happens to do something like:
>
>    def gen(tasks):
>        """yield the results of a bunch of task functions"""
>        for task in tasks:
>            try:
>                yield (task, task())
>            except Exception, ex:
>                yield ExceptionOccurred(task, ex)
>
>
> If you run such a generator on Python 2.5, but don't run it to completion
> before it is garbage collected, you will get an error message printed on
> stderr saying that an exception was ignored when this generator was cleaned
> up. If you use the new PEP 342 features to try to explicitly close it before
> it is garbage collected, you'll get the exception directly.
>
> The culprit is the RuntimeError raised when the generator's close() method
> gets upset because the generator swallowed GeneratorExit.
>
> If GeneratorExit inherits directly from BaseException, such unexpected
> behaviour won't happen - the only way for an existing generator to break is if
> it contained a bare except clause, and that code was *already* dubious (e.g.
> it probably swallowed KeyboardInterrupt).
>
> I don't have any actual live examples of a generator with a broad exception
> clause like the one above, but toy generators like the one above are legal in
> 2.4 and result in spurious errors with current SVN.

I can't say that I care enough about this hypothetical inter-version flimflam
to warrant mucking-up the otherwise useful distinction between Exception and
BaseException.

special-cases-aren't-special-enough ...


Raymond 


Re: [Python-Dev] GeneratorExit inheriting from Exception

2006-03-25 Thread Guido van Rossum
On 3/25/06, Nick Coghlan <[EMAIL PROTECTED]> wrote:
> The kind of code I'm talking about would be an *existing* Python 2.4 generator
> that happens to do something like:
>
>    def gen(tasks):
>        """yield the results of a bunch of task functions"""
>        for task in tasks:
>            try:
>                yield (task, task())
>            except Exception, ex:
>                yield ExceptionOccurred(task, ex)

This is purely hypothetical. It doesn't look like good style at all.

> If you run such a generator on Python 2.5, but don't run it to completion
> before it is garbage collected, you will get an error message printed on
> stderr saying that an exception was ignored when this generator was cleaned
> up. If you use the new PEP 342 features to try to explicitly close it before
> it is garbage collected, you'll get the exception directly.

I think this is fine. The code breaks with the new yield semantics.
But that's because the except clause was overly broad. It's easy to
rewrite it like this, which is better style anyway because the scope
of the try/except is limited.

   try:
       value = (task, task())
   except Exception, ex:
       value = ExceptionOccurred(task, ex)
   yield value
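
(Spelled out as a complete generator -- same hypothetical ExceptionOccurred
wrapper as in your example -- that would be:)

   def gen(tasks):
       """yield the results of a bunch of task functions"""
       for task in tasks:
           try:
               value = (task, task())
           except Exception, ex:
               value = ExceptionOccurred(task, ex)
           yield value

Since the yield is now outside the try block, the GeneratorExit thrown in by
close() propagates out of the generator and cleanup works as intended.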

> The culprit is the RuntimeError raised when the generator's close() method
> gets upset because the generator swallowed GeneratorExit.
>
> If GeneratorExit inherits directly from BaseException, such unexpected
> behaviour won't happen - the only way for an existing generator to break is if
> it contained a bare except clause, and that code was *already* dubious (e.g.
> it probably swallowed KeyboardInterrupt).
>
> I don't have any actual live examples of a generator with a broad exception
> clause like the one above, but toy generators like the one above are legal in
> 2.4 and result in spurious errors with current SVN.

I don't want to cater to hypotheticals. Unless you find real code out
there doing this kind of thing I don't believe the problem is real.

I like to resolve corner cases so that *likely* situations are handled
reasonably.

Just in case you feel inclined to argue this further, let me argue
that there's also a *downside* to making GeneratorExit inherit from
BaseException: if it ever "leaks" out of some code that was supposed
to catch it but somehow didn't, and there's an outer "except
Exception:" trying to protect against buggy code, that except clause
is bypassed.

So perhaps we can turn this into a requirement for exceptions that
inherit from BaseException instead of Exception: the chance that they
get raised by buggy code should be nil. I think that SystemExit and
KeyboardInterrupt both qualify -- the former is raised by *non-buggy* code
with the intention of falling all the way through; the latter is not
raised by code at all but by the end user.

I don't think GeneratorExit qualifies.

--
--Guido van Rossum (home page: http://www.python.org/~guido/)


Re: [Python-Dev] PySet API

2006-03-25 Thread Barry Warsaw
On Tue, 2006-03-21 at 21:31 -0500, Raymond Hettinger wrote: 
> [Barry]
> > Is it your intent to push for more use of the abstract API instead of
> > the concrete APIs for all of Python's C data structures?  Current API
> > aside, are you advocating this approach for all new built-in types?
> > Would you argue that Python 3.0's C API be stripped of everything but
> > the abstract API and the bare essentials of the concrete API?
> 
> It's not up to me.  Perhaps someone else can chime in about the philosophy
> of how the C API is supposed to balance abstract and concrete APIs.

I think it's an important point to discuss, both for trying to resolve
this impasse and for helping to direct future API designs, especially as
we get into Python 3.0.

Maybe it will help you to understand why I want a richer concrete API.
I work on an app that is deeply integrated with Python.  It's hard to
say whether we embed or extend -- it's a lot of both.  We use Python
data structures such as lists, dicts, and sets in many places as our
fundamental tracking objects.  So we know what we have, i.e. it's
definitely a set here and another set there, and we want to merge one
into the other.  Or we know we have a set of foo's here and we need to
iterate over them quickly (yes, in a tight loop) to count things or
whatever.  

So there's no question that a concrete API is very useful to us.  And
there's no question that snaking through the abstract API causes us
real debugging pain (a point which you mostly glossed over).  We
understand the gotchas about reference counting and the possibilities
and implications about calling back into Python.  Remember, we're all
consenting adults here.  I don't think we're unique in this, as the rich
concrete APIs of other fundamental Python objects attest.

Your comments lead me to think that you aren't taking this important use
case into account.  You talk about duck typing, but I don't care about
that here.  I absolutely know I have a PySet, so why cause me pain to
use it?

> I know that the more one uses the abstract API, the more likely the code
> is going to be able to accept duck typed inputs.  Also, most things that
> have tp_slots have a corresponding abstract method instead of tons
> of concrete access points; hence, I would be supportive if you proposed a
> PyObject_Clear(o) function (for calling tp_clear slots when they exist and
> returning an error code when they don't).

I wouldn't object to that, but it wouldn't change my mind about
PySet_Clear().  I'm not arguing against a rich abstract API, I'm arguing
for having a richer concrete API too.  And in this case, only slightly
richer.  

> For setobject.c, if I still have a say in the matter, my strong preference
> is to keep the API minimal, expose fine-grained functions for efficiency,
> use PyNumber methods for direct access to operator style set operations,
> and use the abstract API for everything else.

I think this is a silly stance.  You agree that PySet_Next() is easier
to use than the iterator API.  We will definitely not use the latter,
and if your position stands, then we'll just have to hack Python to add
it (or implement it in an auxiliary module).  But I don't want to have
to do that, so I really don't understand your reluctance to add three
obviously useful functions.

Another point: these don't expose internal bits of the set
implementation.  Well, except for the opaque position pointer, but
that's still enough data hiding for me because you're never supposed
to /do/ anything with that variable except pass it right back to
PySet_Next().  PySet_Clear() and PySet_Update() don't expose any
implementation details -- that's the whole point!
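
For the record, the kind of loop being proposed is modeled on PyDict_Next()
-- roughly the following (the exact signature is whatever the patch below
actually implements, so treat this as an approximation):

   #include "Python.h"

   /* Sketch only: PySet_Next() is the *proposed* function from the patch,
      approximated here with a PyDict_Next()-style calling convention;
      error handling is elided. */
   static int
   count_true(PyObject *myset)
   {
       Py_ssize_t pos = 0;     /* opaque position -- only ever handed
                                  straight back to PySet_Next() */
       PyObject *key;
       int count = 0;

       while (PySet_Next(myset, &pos, &key)) {
           if (PyObject_IsTrue(key) > 0)
               count++;
       }
       return count;
   }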

> P.S.  One other thought:  I don't want to crystallize the API in a way that
> precludes future development of the module.  One possibility for the future
> is for updates to take multiple arguments such as s.update(t,u,v), causing
> three updates to be folded-in at once.

I don't see any way that my proposals preclude that.  And besides, the
three API calls I'm proposing are useful /today/.

But just so we all know what we're talking about, I've uploaded the
patch to SourceForge:

http://sourceforge.net/tracker/index.php?func=detail&aid=1458476&group_id=5470&atid=305470

As with all good patches, there's (almost) more test code than
implementation.

Cheers,
-Barry





Re: [Python-Dev] PySet API

2006-03-25 Thread Barry Warsaw
On Tue, 2006-03-21 at 22:01 -0500, Raymond Hettinger wrote:
> [Me]
> >  There is a semantic difference between
> > code like s+=t and s.update(t).  The former only works when t is a set
> > and the latter works for any iterable.  When the C code corresponds to
> > the Python code, that knowledge is kept intact and there is no confusion 
> > between
> > PyNumber_InPlaceAdd(s,t) vs PyObject_CallMethod(s, "update", "(O)", t).
> 
> Of course, that should have been s|=t and PyNumber_InPlaceOr().

Heh, my point exactly.  You wouldn't have gotten confused about
PySet_Update(). :)

-Barry





[Python-Dev] Pickling problems are hard to debug

2006-03-25 Thread Greg Ewing
There seems to be a need for better diagnostics
when pickle encounters something that can't be
pickled.

Recently when attempting to pickle a rather
large and complicated data structure, I got
the following incomprehensible message:

   cPickle.PicklingError: args[0]
 from __newobj__ args has the wrong class

Trying again with protocol 1 instead of 2,
I get

   TypeError: can't pickle function objects

which I'm *guessing* is because somewhere I've
tried to pickle a nested function or a bound
method. But it still doesn't give me any idea
*which* function I tried to pickle or whereabouts
it turns up in the data structure.

Anyone have any ideas how the situation could
be improved? At the very least, it could
include some info about the type and identity
of the offending object.

Greg


Re: [Python-Dev] Pickling problems are hard to debug

2006-03-25 Thread Gary Poster

On Mar 25, 2006, at 8:13 PM, Greg Ewing wrote:

> There seems to be a need for better diagnostics
> when pickle encounters something that can't be
> pickled.
>
> Recently when attempting to pickle a rather
> large and complicated data structure, I got
> the following incomprehensible message:
>
>cPickle.PicklingError: args[0]
>  from __newobj__ args has the wrong class
>
> Trying again with protocol 1 instead of 2,
> I get
>
>TypeError: can't pickle function objects
>
> which I'm *guessing* is because somewhere I've
> tried to pickle a nested function or a bound
> method. But it still doesn't give me any idea
> *which* function I tried to pickle or whereabouts
> it turns up in the data structure.
>
> Anyone have any ideas how the situation could
> be improved? At the very least, it could
> include some info about the type and identity
> of the offending object.

You are asking for ideas on how to change the pickle story to help.   
However, just reading your issue, I thought I might have done a  
debugging hack like this, at least for the protocol 1 traceback.

We'll assume that the error is more mysterious than what I've  
manufactured here.

 >>> import cPickle
 >>> cPickle.dumps({'foo': lambda: 42})
Traceback (most recent call last):
   File "<stdin>", line 1, in ?
   File "/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/copy_reg.py", line 69, in _reduce_ex
 raise TypeError, "can't pickle %s objects" % base.__name__
TypeError: can't pickle function objects
 >>> import copy_reg
 >>> def debug(obj):
... import pdb; pdb.set_trace()
...
 >>> import types
 >>> copy_reg.pickle(types.FunctionType, debug)
 >>> cPickle.dumps({'foo': lambda: 42})
--Return--
 > <stdin>(2)debug()->None
(Pdb) p obj
<function <lambda> at 0x63230>
(Pdb) p obj.__module__
'__main__'

I also might have used pickle, rather than cPickle, to try and see  
what happened, if that ended up being necessary.
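
Something along these lines, just to get an error raised from inside
pickle.py rather than from the C pickler:

   import pickle
   try:
       pickle.dumps({'foo': lambda: 42})
   except Exception, ex:
       # The pure-Python pickler raises from inside pickle.py, so the
       # traceback shows which save_*() routine gave up, and (if memory
       # serves) the message includes the repr of the offending object.
       print ex.__class__.__name__, ':', ex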

I don't use protocol 2 much: that error message in particular looked  
a bit difficult, and my hack might not be any help there.  I agree  
that it would be nice to have a better message there, in particular.

back to lurking...

Gary


Re: [Python-Dev] GeneratorExit inheriting from Exception

2006-03-25 Thread Nick Coghlan
Guido van Rossum wrote:
> On 3/25/06, Nick Coghlan <[EMAIL PROTECTED]> wrote:
>> The kind of code I'm talking about would be an *existing* Python 2.4
>> generator that happens to do something like:
>>
>>    def gen(tasks):
>>        """yield the results of a bunch of task functions"""
>>        for task in tasks:
>>            try:
>>                yield (task, task())
>>            except Exception, ex:
>>                yield ExceptionOccurred(task, ex)
> 
> This is purely hypothetical. It doesn't look like good style at all.
> 
>> If you run such a generator on Python 2.5, but don't run it to completion
>> before it is garbage collected, you will get an error message printed on
>> stderr saying that an exception was ignored when this generator was cleaned
>> up. If you use the new PEP 342 features to try to explicitly close it before
>> it is garbage collected, you'll get the exception directly.
> 
> I think this is fine. The code breaks with the new yield semantics.
> But that's because the except clause was overly broad. It's easy to
> rewrite it like this, which is better style anyway because the scope
> of the try/except is limited.
> 
>    try:
>        value = (task, task())
>    except Exception, ex:
>        value = ExceptionOccurred(task, ex)
>    yield value

Works for me. Consider the issue dropped :)

Cheers,
Nick.

-- 
Nick Coghlan   |   [EMAIL PROTECTED]   |   Brisbane, Australia
---
 http://www.boredomandlaziness.org


Re: [Python-Dev] PySet API

2006-03-25 Thread Raymond Hettinger
[Barry]
> Maybe it will help you to understand why I want a richer concrete API.
> I work on an app that is deeply integrated with Python.  It's hard to
> say whether we embed or extend -- it's a lot of both.  We use Python
> data structures such as lists, dicts, and sets in many places as our
> fundamental tracking objects.

In such an app, it would be trivial to write a header:
#define BarrySet_Clear(s)  PyObject_CallMethod(s, "clear", NULL)

Still, PyObject_Clear(s) would be better.  Better still would be to examine the 
actual uses in the app.  I suspect that most code that clears a set and then 
rebuilds it would be better-off starting with a new empty set (and because of 
freelisting, that is a very fast operation).

Likewise, it only takes a one-line header to define BarrySet_Update(s).  I do 
not want that part of the C API exposed yet.  It is still under development and 
may eventually become a function with a variable length argument list.

It's bogus to say there is some app-critical need.  After all, these are both 
one-line defines if you personally crave them.  There's no speed argument here 
either -- saving an O(1) dispatch step in an O(n) operation.
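
Spelled out, both wrappers need nothing beyond calls that already exist in
the abstract API (BarrySet_* are illustrative names only):

   #include "Python.h"

   /* Sketch: one-line wrappers using only the existing abstract API. */
   #define BarrySet_Clear(s)      PyObject_CallMethod((s), "clear", NULL)
   #define BarrySet_Update(s, t)  PyObject_CallMethod((s), "update", "(O)", (t))

   static int
   merge_into(PyObject *dest, PyObject *src)
   {
       PyObject *result = BarrySet_Update(dest, src);   /* src: any iterable */
       if (result == NULL)
           return -1;
       Py_DECREF(result);      /* the update() method returns None */
       return 0;
   }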



> there's no questions that snaking through the abstract API causes us
> real debugging pain

I honestly don't follow you here.  Doesn't your debugger have options for 
step-over and step-into?  Are you debugging the set module or your client code? 
Besides, these are all high volume functions -- do you really want to trace 
through the internal mechanics of set_clear?  Internally, this code has special 
cases for small and large table sizes, it does a pointer swap with an empty 
table to avoid mid-stream resize issues, it treats dummy entries and active 
entries as being the same, and it's not at all beautiful.  Ergo, it is not 
something you want to be tracing through.  The debugging argument is bogus.



> You agree that PySet_Next() is easier to use than the iterator API.
> We will definitely not use the latter, and if your position stands, then
> we'll just have to hack Python to add it (or implement it in an auxiliary 
> module).

If you're dead-set against using the iterator API, then maybe there is
something wrong with the API.  You should probably start a new thread on why
you detest the iterator API and see if there are ways to improve it.

Avoidance of the iterator protocol is no reason to proliferate the _Next() api 
across other collections.  That would be a mistake.  It is a bug-factory.  Any 
operation which could potentially call back arbitrary Python  code can also 
potentially trigger a resize or table update,  leaving an invalid pointer. 
Something as simple as PyObject_Hash(k) can trigger a callback.  Usually with 
code like this, it would take Armin less than five minutes to write a pure 
Python crasher.
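
For comparison, here's roughly what the iterator-protocol loop looks like;
the iterator object notices if the set changes size underneath it and raises,
rather than chasing a stale pointer (do_something_with() is a placeholder for
the caller's per-item work):

   #include "Python.h"

   static int
   for_each_item(PyObject *myset)
   {
       PyObject *it = PyObject_GetIter(myset);
       PyObject *item;

       if (it == NULL)
           return -1;
       while ((item = PyIter_Next(it)) != NULL) {
           do_something_with(item);     /* item is a new (strong) reference */
           Py_DECREF(item);
       }
       Py_DECREF(it);
       return PyErr_Occurred() ? -1 : 0;
   }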

If you absolutely must go against my recommendation, can we compromise with a 
semi-private _PySet_Next() so that you have a hook but without mucking-up the 
public API for the rest of the world?



> You talk about duck typing, but I don't care about that here.

It's one of the virtues of Python that gets reflected in the abstract API.
IMO, it's nice that PyObject_Dir(o) corresponds to "dir(o)" and the same for
hash(o), repr(o), etc.  I just hope that by hardwiring data types in stone,
your app doesn't become rigid and impossible to change.  I certainly do not
recommend that other people adopt this coding style (avoidance of iterators,
duplication of abstract API functions in concrete form, etc.)  If you're
experiencing debugging pain, it may be that avoidance of abstraction is the
root cause.



>> I would be supportive if you proposed a PyObject_Clear(o) function
>> (for calling tp_clear slots when they exist and
>> returning an error code when they don't).
>
> I wouldn't object to that, but it wouldn't change my mind about
> PySet_Clear().

This is plain evidence that something is wrong with your approach.  While 
possibly necessary in your environment, the rest of mankind should not have to 
stomach this kind of API clutter. 



[Python-Dev] Prevalence of low-level memory abuse?

2006-03-25 Thread Tim Peters
When Python's small-object memory allocator was introduced, some
horrid hacks came with it  to map PyMem_{Del, DEL} and PyMem_{Free,
FREE} to PyObject_{Free, FREE}.  This was to cater to less than a
handful of extension modules found at the time that obtained memory
for an object via PyObject_{New, NEW}, but released that memory via
the insanely mismatched PyMem_{Del, DEL} or PyMem_{Free, FREE}.

Since such combinations were rarely found in real life and have been
officially forbidden for years, and since the hacks are ugly, hard to
understand, and needlessly slow PyMem_{Del, DEL, Free, FREE}, I'm trying
to get rid of them now.  Alas, in a release(*)
build, Python's test suite segfaulted all over the place.

So far I found one smoking gun:  in _subprocess.c, sp_handle_new()
gets memory via PyObject_NEW(), but sp_handle_dealloc() releases that
memory via PyMem_DEL().  That's nuts: with the now-ancient hacks removed,
it obtains the memory from obmalloc but releases it directly to the system
free().  That ends up corrupting both
obmalloc's _and_ the platform C library's ultra-low-level memory
bookkeeping bytes.
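
For anyone auditing their own extensions, the mismatch boils down to
something like this (MyObject and MyObject_Type are stand-ins for the
extension's own type, with most of the type definition omitted):

   #include "Python.h"

   typedef struct { PyObject_HEAD } MyObject;
   static PyTypeObject MyObject_Type;   /* assume initialised elsewhere */

   static PyObject *
   make_one(void)
   {
       return (PyObject *)PyObject_New(MyObject, &MyObject_Type);
   }

   static void
   broken_dealloc(MyObject *self)
   {
       PyMem_DEL(self);         /* mismatched: memory came from PyObject_New */
   }

   static void
   correct_dealloc(MyObject *self)
   {
       PyObject_Del(self);      /* matches PyObject_{New, NEW} */
   }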

Since this wasn't common before, has it become common since then :-)? 
I checked Zope and ZODB a long time ago, and there were no
PyMem/PyObject mismatches there.  See any in your code?

(*) Nothing failed in a debug build, since all PyObject_ and PyMem_
calls go thru
obmalloc in a debug build.  Note that, because of this, the buildbot test
runs wouldn't have detected anything wrong.


Re: [Python-Dev] PySet API

2006-03-25 Thread Aahz
I'd really like to see someone else who understands the issues (i.e.
using the Python C-API) weigh in.  Both Barry and Raymond are clever
programmers who generally understand what's Pythonic, and I find myself
agreeing with whoever posted last.  ;-)  Having another perspective
would probably shed some light here.
-- 
Aahz ([EMAIL PROTECTED])   <*> http://www.pythoncraft.com/

"Look, it's your affair if you want to play with five people, but don't
go calling it doubles."  --John Cleese anticipates Usenet


Re: [Python-Dev] PySet API

2006-03-25 Thread Alex Martelli

On Mar 25, 2006, at 9:57 PM, Aahz wrote:

> I'd really like to see someone else who understands the issues (i.e.
> using the Python C-API) weigh in.  Both Barry and Raymond are clever
> programmers who generally understand what's Pythonic, and I find  
> myself
> agreeing with whoever posted last.  ;-)  Having another perspective
> would probably shed some light here.

My general preference is rather well-known, and I quote the advice I  
gave in "Python in a Nutshell"...:
"""
Some of the functions callable on specifically-typed objects [...]  
duplicate functionality that is also available from PyObject_  
functions; in these cases, you should almost invariably use the more  
general PyObject_ function instead. I don’t cover such almost- 
redundant functions in this book.
"""

However, I don't go as far as suggesting PyObject_CallMethod and the  
like... I'd much rather have abstract-layer PyObject_... functions,  
as long as they're applicable to two or more concrete built-in types  
(for example, IMHO adding PyObject_Clear is a no-brainer -- it's  
obviously right).  And I'm on the fence regarding the specific issue  
of PySet_Next.

So, having carefully staked out a position smack in the middle, I  
cheerfully now expect to be fired upon from both sides!-)


Alex
