On Jun 16, 5:04 am, "Diez B. Roggisch" <[EMAIL PROTECTED]> wrote:
> Diez B. Roggisch wrote:
> > George Sakkis schrieb:
> >> I have a situation where one class can be customized with several
> >> orthogonal options. Currently this is implemented with (multiple)
> >> inheritance, but this leads to a combinatorial explosion of subclasses
> >> as more orthogonal features are added. Naturally, the decorator pattern
> >> [1] comes to mind (not to be confused with the Python meaning of
> >> the term "decorator").
> >>
> >> However, there is a twist. In the standard decorator pattern, the
> >> decorator accepts the object to be decorated and adds extra
> >> functionality or modifies the object's behavior by overriding one or
> >> more methods. It does not affect how the object is created; it takes
> >> it as is. My multiple inheritance classes, though, play a double role:
> >> not only do they override one or more regular methods, but they may
> >> override __init__ as well. Here's a toy example:
> >>
> >> class Joinable(object):
> >>     def __init__(self, words):
> >>         self.__words = list(words)
> >>     def join(self, delim=','):
> >>         return delim.join(self.__words)
> >>
> >> class Sorted(Joinable):
> >>     def __init__(self, words):
> >>         super(Sorted, self).__init__(sorted(words))
> >>     def join(self, delim=','):
> >>         return '[Sorted] %s' % super(Sorted, self).join(delim)
> >>
> >> class Reversed(Joinable):
> >>     def __init__(self, words):
> >>         super(Reversed, self).__init__(reversed(words))
> >>     def join(self, delim=','):
> >>         return '[Reversed] %s' % super(Reversed, self).join(delim)
> >>
> >> class SortedReversed(Sorted, Reversed):
> >>     pass
> >>
> >> class ReversedSorted(Reversed, Sorted):
> >>     pass
> >>
> >> if __name__ == '__main__':
> >>     words = 'this is a test'.split()
> >>     print SortedReversed(words).join()
> >>     print ReversedSorted(words).join()
> >>
> >> So I'm wondering, is the decorator pattern applicable here? If yes,
> >> how? If not, is there another way to convert inheritance to
> >> delegation?
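[Editorial sketch: the quoted toy example, recast with delegation instead of multiple inheritance, might look like the following (modernized to Python 3 syntax). The `JoinableDecorator` base class and the `words`/`tag` properties are names invented here; the key difference from the original is that the constructor-time transformations become access-time properties, so wrappers compose without touching `__init__` at all -- which is exactly the twist the question raises.]

```python
class Joinable:
    def __init__(self, words):
        self._words = list(words)

    @property
    def words(self):
        return self._words

    @property
    def tag(self):
        return ''

    def join(self, delim=','):
        return self.tag + delim.join(self.words)


class JoinableDecorator:
    """Shares Joinable's interface and delegates to a wrapped instance."""
    def __init__(self, inner):
        self._inner = inner

    @property
    def words(self):
        return self._inner.words

    @property
    def tag(self):
        return self._inner.tag

    def join(self, delim=','):
        # Uses self.words/self.tag, so subclass overrides take effect.
        return self.tag + delim.join(self.words)


class Sorted(JoinableDecorator):
    @property
    def words(self):
        return sorted(self._inner.words)

    @property
    def tag(self):
        return '[Sorted] ' + self._inner.tag


class Reversed(JoinableDecorator):
    @property
    def words(self):
        return list(reversed(self._inner.words))

    @property
    def tag(self):
        return '[Reversed] ' + self._inner.tag


if __name__ == '__main__':
    words = 'this is a test'.split()
    print(Sorted(Reversed(Joinable(words))).join())
    print(Reversed(Sorted(Joinable(words))).join())
```

Note that the combination semantics are not identical to the `super()` chain: here the outermost wrapper's transformation is applied last when `words` is read, whereas in the inheritance version the outermost class's `__init__` transformation runs first.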
> > Factory - and dynamic subclassing, as shown here:
> >
> > import random
> >
> > class A(object):
> >     pass
> >
> > class B(object):
> >     pass
> >
> > def create_instance():
> >     superclasses = tuple(random.sample([A, B], random.randint(1, 2)))
> >     class BaseCombiner(type):
> >         def __new__(mcs, name, bases, d):
> >             bases = superclasses + bases
> >             return type(name, bases, d)
> >     class Foo(object):
> >         __metaclass__ = BaseCombiner
> >     return Foo()
> >
> > for _ in xrange(10):
> >     f = create_instance()
> >     print f.__class__.__bases__
>
> Right now I see of course that I could have spared myself the whole
> __metaclass__ business and directly used type()... Oh well, but at least
> it worked :)
>
> Diez

Ok, I see how this would work (and it's trivial to make it cache the
generated classes for future use), but I guess I was looking for a more
"mainstream" approach, something that even a primitive statically typed
language could handle :) Even in Python, though, I think of runtime type
generation (RTG) the way I think of eval(): it's good that it exists, but
it should be used as a last resort. Also, RTG doesn't play well with
pickling.

Since I don't have many useful subclasses so far, I'll stick with explicit
inheritance for now, but I'll consider RTG if the number of combinations
becomes a real issue.

George
--
http://mail.python.org/mailman/listinfo/python-list
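[Editorial sketch: the "directly use type() and cache the generated classes" idea from the exchange above might look like this (modernized to Python 3 syntax; the `compose` function and `_cache` dict are names invented here). Caching guarantees that each distinct base-class combination maps to exactly one class object, which addresses the combinatorial-explosion concern -- though pickling such instances still requires the generated class to be reachable by its qualified name, e.g. by binding it to a module attribute.]

```python
_cache = {}

def compose(*bases):
    """Return a class inheriting from the given bases, one per combination.

    The order of bases matters, just as SortedReversed and ReversedSorted
    differ in the inheritance version: (A, B) and (B, A) get distinct
    classes with distinct MROs.
    """
    try:
        return _cache[bases]
    except KeyError:
        name = ''.join(b.__name__ for b in bases)
        cls = type(name, bases, {})   # dynamic subclassing without a metaclass
        _cache[bases] = cls
        return cls


class A: pass
class B: pass

AB = compose(A, B)
print(AB.__name__, AB.__mro__)
```

A repeated call with the same combination returns the identical class object, so `isinstance` checks and class identity behave as they would with statically written subclasses.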