On May 4, 1:36 am, Michael <[EMAIL PROTECTED]> wrote:
> On May 2, 6:08 am, Carsten Haese <[EMAIL PROTECTED]> wrote:
> > On Tue, 2007-05-01 at 22:21 -0700, Michael wrote:
> > > Is there a reason for using the closure here? Using function defaults
> > > seems to give better performance: [...]
> >
> > It does? Not as far as I can measure it to any significant degree on my
> > computer.
>
> I agree the performance gains are minimal.  Using function defaults
> rather than closures, however, seemed much cleaner and more explicit to
> me.  For example, I have been bitten by the following before:
>
> >>> def f(x):
> ...     def g():
> ...         x = x + 1
> ...         return x
> ...     return g
> >>> g = f(3)
> >>> g()
> Traceback (most recent call last):
>   File "<stdin>", line 1, in <module>
>   File "<stdin>", line 3, in g
> UnboundLocalError: local variable 'x' referenced before assignment
>
> If you use default arguments, this works as expected:
>
> >>> def f(x):
> ...     def g(x=x):
> ...         x = x + 1
> ...         return x
> ...     return g
> >>> g = f(3)
> >>> g()
> 4

>>> g()
4
>>> g()
4
>>> g()  # what is going on here????
4
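Presumably what is going on is that the default x=x is evaluated just
once, when f(3) executes the def statement for g, and each call then
rebinds a fresh local x, so no state accumulates and every call
returns 4.  If the point is a closure that really keeps state, one way
in Python 2.x (which has no nonlocal) is to close over a mutable cell.
A rough, untested sketch:

def f(x):
    cell = [x]            # mutable cell shared with the closure
    def g():
        cell[0] += 1      # updates the list element, not a local name,
        return cell[0]    # so there is no UnboundLocalError
    return g

g = f(3)
print g(), g(), g()       # prints: 4 5 6

Here the counter really advances between calls because the state lives
in the list rather than in a local name or a default value.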
> The fact that there also seems to be a performance gain (granted, it
> is extremely slight here) led me to ask if there was any advantage to
> using closures.  It seems not.
>
> > An overriding theme in this thread is that you are greatly concerned
> > with the speed of your solution rather than the structure and
> > readability of your code.
>
> Yes, it probably does seem that way, because I am burying this code
> deeply and do not want to revisit it when profiling later, but my
> overriding concern is reliability and ease of use.  Using function
> attributes seemed the best way to achieve both goals until I found out
> that the pythonic way of copying functions failed.  Here was how I
> wanted my code to work:
>
> @define_options(first_option='abs_tol')
> def step(f, x, J, abs_tol=1e-12, rel_tol=1e-8, **kwargs):
>     """Take a step to minimize f(x) using the jacobian J.
>     Return (new_x, converged) where converged is true if the tolerance
>     has been met.
>     """
>     <compute dx and check convergence>
>     return (x + dx, converged)
>
> @define_options(first_option='min_h')
> def jacobian(f, x, min_h=1e-6, max_h=0.1):
>     """Compute the jacobian using a step min_h < h < max_h."""
>     <compute J>
>     return J
>
> class Minimizer(object):
>     """Object to minimize a function."""
>     def __init__(self, step, jacobian, **kwargs):
>         self._options = step.options + jacobian.options
>         self.step = step
>         self.jacobian = jacobian
>
>     def minimize(self, f, x0, **kwargs):
>         """Minimize the function f(x) starting at x0."""
>         step = self.step
>         jacobian = self.jacobian
>
>         step.set_options(**kwargs)
>         jacobian.set_options(**kwargs)
>
>         x = x0
>         converged = False
>         while not converged:
>             J = jacobian(f, x)
>             (x, converged) = step(f, x, J)
>
>         return x
>
>     @property
>     def options(self):
>         """List of supported options."""
>         return self._options
>
> The idea is that one can define different functions for computing the
> jacobian, the step, etc. that take various parameters, and then make a
> custom minimizer class that can provide the user with information
> about the supported options etc.
>
> The question is how to define the decorator define_options?
>
> 1) I thought the cleanest solution was to add a method f.set_options()
> which would set f.func_defaults, and a list f.options for
> documentation purposes.  The docstring remains unmodified without any
> special "wrapping", step and jacobian are still "functions", and
> performance is optimal.
>
> 2) One could return an instance f of a class with f.__call__,
> f.options and f.set_options defined.  This would probably be the most
> appropriate OO solution, but it makes the decorator much more messy,
> or requires the user to define classes rather than simply define the
> functions as above.  In addition, this is at least a factor of 2.5
> times slower on my machine than option 1) because of the class
> instance overhead.  (This is my only real performance concern, because
> this is quite a large factor.  Otherwise I would just use this
> method.)
>
> 3) I could pass generators to Minimizer and construct the functions
> dynamically.  This would have the same performance, but would require
> the user to define generators, or require the decorator to return a
> generator when the user appears to be defining a function.  This just
> seems much less elegant.
>
> ...
>
> @define_options_generator(first_option='min_h')
> def jacobian_gen(f, x, min_h=1e-6, max_h=0.1):
>     """Compute the jacobian using a step min_h < h < max_h."""
>     <compute J>
>     return J
>
> class Minimizer(object):
>     """Object to minimize a function."""
>     def __init__(self, step_gen, jacobian_gen, **kwargs):
>         self.options = step_gen.options + jacobian_gen.options
>         self.step_gen = step_gen
>         self.jacobian_gen = jacobian_gen
>
>     def minimize(self, f, x0, **kwargs):
>         """Minimize the function f(x) starting at x0."""
>         step = self.step_gen(**kwargs)
>         jacobian = self.jacobian_gen(**kwargs)
>
>         x = x0
>         converged = False
>         while not converged:
>             J = jacobian(f, x)
>             (x, converged) = step(f, x, J)
>
>         return x
>
> ...
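If I follow option 1), define_options could be something along these
lines (a rough, untested Python 2.x sketch).  I am guessing that
first_option marks where the user-settable options begin in the
argument list, and that set_options() should quietly skip keywords
meant for the other function, since minimize() passes the same
**kwargs to both:

def define_options(first_option):
    """Attach an .options list and a .set_options() method to a function.

    set_options(**kwargs) rebinds the function's default values in
    place, so the decorated object stays an ordinary function.
    """
    def decorator(f):
        # Positional argument names, e.g. ('f', 'x', 'J', 'abs_tol', 'rel_tol').
        names = list(f.func_code.co_varnames[:f.func_code.co_argcount])
        f.options = names[names.index(first_option):]
        def set_options(**kwargs):
            defaults = dict(zip(f.options, f.func_defaults))
            for key, value in kwargs.items():
                if key in defaults:   # silently ignore other functions' options
                    defaults[key] = value
            f.func_defaults = tuple(defaults[key] for key in f.options)
        f.set_options = set_options
        return f
    return decorator

Since the decorator returns f itself, step and jacobian keep their
docstrings and stay plain functions, so there is no wrapper overhead
in the inner loop, which seems to be the main attraction of option 1).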
> 4) Maybe there is a better, cleaner way to do this, but I thought that
> my option 1) was the most clear, readable and fast.  I would
> appreciate any suggestions.  The only problem is that it does use
> mutable functions, and so the user might be tempted to try:
>
> new_step = copy(step)
>
> which would fail (because modifying new_step would also modify step).
> I guess that this is a pretty big problem (I could provide a custom
> copy function so that
>
> new_step = step.copy()
>
> would work), and I wondered if there was a better solution (or if
> maybe copy.py should be fixed.  Checking for a defined __copy__ method
> *before* checking for the pre-defined types does not seem to break
> anything.)
>
> Thanks again everyone for your suggestions, it is really helping me
> learn about python idioms.
>
> Michael.
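On the copy.copy front: as far as I can tell, copy.py maps
types.FunctionType to a "return it unchanged" copier in its dispatch
table, and that table is consulted before any __copy__ hook, which is
why copy(step) simply hands back step itself.  Until or unless that
changes, something like the following could back a step.copy() method
(a rough, untested Python 2.x sketch; copy_function is just a name I
made up):

import types

def copy_function(f):
    """Return a new function object with f's code but its own defaults
    and attribute dict, so rebinding the copy's defaults leaves f alone.
    """
    g = types.FunctionType(f.func_code, f.func_globals, f.func_name,
                           f.func_defaults, f.func_closure)
    g.__dict__.update(f.__dict__)
    # Note: attributes that close over f itself (such as a set_options
    # attached by a define_options decorator) still refer to f; one way
    # to give the copy its own set_options is to re-apply the decorator.
    return g

define_options could then attach this as f.copy, so new_step =
step.copy() would work without touching copy.py at all.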