Marco Nawijn wrote:
> Hello,
>
> I have a hard time figuring out an elegant and efficient design for
> the following problem.
>
> I am working on automation of structural design problems. In the
> majority of cases, this boils down to executing programs in batch in
> one or more loops. The scripts to control the execution range from
> Fortran to bash to Python and so on. Most of them are ad hoc and what
> I call 'throw away scripts'. In order to improve the situation I would
> like to develop a Python module that supports the execution of
> external programs. Ideally I would like to make running locally or
> remotely trivial for the users of the module. As an example, I would
> like the following (pseudo-)code to work:
>
>     app = Application('patran')                 # Run on local machine
>     app.start(args)
>
>     app = Application('patran', host='myhost')  # Run on remote machine
>     app.start(args)
>
> The problem I face is that the implementation of the Application class
> is completely different for the local and remote case. The local case
> is a straightforward implementation using the subprocess module, the
> remote case is a CORBA implementation. Somehow I would like to switch
> implementation class at runtime depending on whether or not the host
> parameter is specified.
The solution is quite straightforward, and is known as the "factory"
design pattern.

> The Application, local implementation and remote implementation all
> have the same interface, so a possibility might be something like the
> following:
>
> class Interface(object):
>     .....
>     def start(self): pass
>     def stop(self): pass

What's the use of this class? In Python, inheritance is for
implementation only.

> class LocalImplementation(Interface):
>     .....
>
> class GlobalImplementation(CorbaGlobalImplementation, Interface):
>     .....
>
>
> class Application(Interface):
>
>     def __init__(self, program, host=None):
>         ....
>         if host is None:
>             self.__impl = LocalImplementation(program)
>         else:
>             self.__impl = GlobalImplementation(program, host)
>
>     # Forward all methods to the implementation class
>     def start(self):
>         self.__impl.start()
>
>     def stop(self):
>         self.__impl.stop()

My my my... How to uselessly overcomplexify things...

class LocalApp(object):
    def __init__(self, program):
        # code here
        pass
    def start(self):
        # code here
        pass
    def stop(self):
        # code here
        pass

class RemoteApp(object):
    def __init__(self, program, host):
        # code here
        pass
    def start(self):
        # code here
        pass
    def stop(self):
        # code here
        pass

def Application(program, host=None):
    if host is None:
        return LocalApp(program)
    else:
        return RemoteApp(program, host)

> To me forwarding each call in the Application class looks a little bit
> redundant

Indeed !-)

> and I would like to get rid of it.

cf above. But in case you need to do proper delegation in Python, the
magic words are "__getattr__" and "__setattr__". Here's a very basic
example of using __getattr__ - using __setattr__ is a bit more tricky,
but you'll find all relevant documentation in the FineManual(tm):

class Wrapper(object):
    def __init__(self, obj):
        self.__obj = obj
    def __getattr__(self, name):
        return getattr(self.__obj, name)

> Does anyone have any comments or suggestions? Can metaclass
> programming come to the rescue?

May I suggest that you first learn the Python basics before going into
complex things? And FWIW, googling for "KISS" might help too !-)

HTH
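
PS : since the __setattr__ half is the part that usually trips people up,
here is a rough, untested sketch of what a read/write delegation wrapper
might look like - same Wrapper name as above, nothing beyond the stdlib,
and only meant to illustrate the recursion trap, not to be the one true
implementation :

class Wrapper(object):
    def __init__(self, obj):
        # store the wrapped object while bypassing our own __setattr__,
        # which would otherwise delegate (and recurse) before the
        # attribute even exists; '_Wrapper__obj' is the name-mangled
        # form of self.__obj
        object.__setattr__(self, '_Wrapper__obj', obj)

    def __getattr__(self, name):
        # only called when normal lookup fails : delegate reads
        return getattr(self.__obj, name)

    def __setattr__(self, name, value):
        # delegate all writes to the wrapped object
        setattr(self.__obj, name, value)

Used as wrapped = Wrapper(some_obj), attribute reads and writes on
wrapped then go straight through to some_obj.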