Once again I'm working on code that has several 'providers' of different features, and I thought I'd ask the list whether it has better solutions to this than the one I've come up with.
Currently, the cleanest way I can find is to say:

    import foo
    foo.config(bar='baz')
    # from this point on, foo.bar is provided by foo.bars.baz

This could be widget sets, database connectors, persistence layers, RPC, you name it. It is very easy to do:

    # foo.py
    def config(bar_name=DEFAULT_BAR_NAME):
        global bar
        bar = __import__('foo.' + bar_name, globals(), locals(), [bar_name])

This would work fine if people wanted to use it (once it's configured) as 'import foo; foo.bar.funkyFunction()', but no! They want to be able to do 'from foo.bar import funkyFunction', which is reasonable really, because I'm just as lazy as they are. So what I've come up with is:

    # foo.py
    import new

    def config(bar_name=DEFAULT_BAR_NAME):
        global bar
        barModule = __import__('foo.' + bar_name, globals(), locals(), [bar_name])
        bar.faultFrom(barModule)

    class proxyModule(new.module):
        def faultFrom(self, other):
            # remember the real module and take over its identity
            self.__other = other
            self.__name__ = other.__name__
            self.__file__ = other.__file__ + ' [proxied]'

        def __getattr__(self, attr):
            # forward anything not found here to the real module
            return getattr(self.__other, attr)

    bar = proxyModule('bar proxy')

and this is the bit that feels ugly: is there a better way to have an object that can be imported from a file and that behaves like a module, but that isn't filled in until some point down the road?
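For what it's worth, here is a minimal, self-contained sketch of one variation on the same idea: build the proxy up front, register it in sys.modules under the dotted name 'foo.bar', and let config() fill it in later. The names foo, bars.baz and funkyFunction are just the placeholders from above (assuming a hypothetical layout where foo/bars/baz.py defines funkyFunction); this is a sketch under those assumptions, not a drop-in replacement.

    # foo/__init__.py  (hypothetical package: foo/bars/baz.py defines funkyFunction)
    import sys
    import types
    import importlib

    DEFAULT_BAR_NAME = 'bars.baz'   # placeholder default, as above

    class ProxyModule(types.ModuleType):
        """Module-shaped shell that forwards attribute access to a real
        module chosen later by config()."""

        _target = None   # class-level default, so __getattr__ never recurses

        def _fault_from(self, other):
            # remember the real module and borrow its identity
            self._target = other
            self.__file__ = getattr(other, '__file__', '<unknown>') + ' [proxied]'

        def __getattr__(self, attr):
            # only reached for names not already set on the proxy itself
            if self._target is None:
                raise AttributeError('foo.config() has not been called yet')
            return getattr(self._target, attr)

    # Create the proxy immediately and register it under its dotted name,
    # so the import machinery can find 'foo.bar' without a real bar.py.
    bar = ProxyModule('foo.bar')
    sys.modules['foo.bar'] = bar

    def config(bar_name=DEFAULT_BAR_NAME):
        real = importlib.import_module('foo.' + bar_name)
        bar._fault_from(real)

With that in place both spellings work, provided foo.config() runs before the attribute is actually used:

    import foo
    foo.config()                          # or foo.config('bars.baz')
    from foo.bar import funkyFunction     # resolved through the proxy

The sys.modules registration is what lets the 'from foo.bar import ...' form find a module object for 'foo.bar' even though no bar.py exists on disk; whether mutating sys.modules is any less ugly than the proxy itself is a matter of taste.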