Why are functions atomic?

Michael michael.forbes at gmail.com
Wed May 2 01:21:58 EDT 2007


> Your use case appears to be that you
> want to make multiple copies of the same function, and those copies
> should be almost, but not quite, the same.
>
> The Pythonic solution is to produce the copies by a factory function...
>
> >>> def powerfactory(exponent):
> ...    def inner(x):
> ...       return x**exponent
> ...    return inner

Is there a reason for using the closure here?  Using function defaults
seems to give better performance:
>>> def powerfactory(exponent):
...    def inner(x,exponent=exponent):
...       return x**exponent
...    return inner
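
For what it's worth, here is one way to check the difference with
timeit (the closure_pow/default_pow names are just for this
comparison, and the actual numbers will depend on the machine and
Python version):

>>> import timeit
>>> setup = '''
... def closure_factory(exponent):
...     def inner(x):
...         return x**exponent
...     return inner
... def default_factory(exponent):
...     def inner(x, exponent=exponent):
...         return x**exponent
...     return inner
... closure_pow = closure_factory(2)
... default_pow = default_factory(2)
... '''
>>> timeit.Timer('closure_pow(3)', setup).timeit()  # reads exponent from the enclosing cell
>>> timeit.Timer('default_pow(3)', setup).timeit()  # exponent is an ordinary local, filled from the default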

This is definitely a viable solution and is essentially what I had
in mind, but I did not want to carry the factory around with me:
instead, I wanted to use it once as a decorator and then carry only
the function around.

>>> @declare_options(first_option='opt1')
... def f(x,opt1,opt2,opt3):
...     return x*(opt1+opt2*opt3)
>>> f.set_options(opt1=1,opt2=2,opt3=3)
>>> f(1)
7
>>> from copy import copy
>>> g = copy(f)
>>> g.set_options(opt1=4,opt2=5,opt3=6)
>>> f(1)
7
>>> g(1)
34

The declare_options decorator behaves like the factory above, but
adds methods such as set_options() that let me manipulate the
options without generating a new function each time.
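
For concreteness, here is a stripped-down sketch of the idea (not
the real decorator, which does more bookkeeping and error checking,
but it shows how set_options stores the values in func_defaults):

>>> def declare_options(first_option):
...     def decorator(f):
...         names = f.func_code.co_varnames[:f.func_code.co_argcount]
...         option_names = names[list(names).index(first_option):]
...         def set_options(**kw):
...             # Store the option values as the function's defaults
...             # so that f(x) needs no extra arguments in inner loops.
...             # (This simplified version expects all options at once.)
...             f.func_defaults = tuple(kw[n] for n in option_names)
...         f.set_options = set_options
...         return f
...     return decorator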

I have functions with many options that may be called inside tight
loops, and I found that the most efficient solution was to provide
all of the options through func_defaults.

>>> def f(x,opt1,opt2,opt3):
...     return x*(opt1 + opt2*opt3)

The cleanest (and fastest) solution I found was to set the options in
the defaults:
>>> f.func_defaults = (1,2,3)

Then f can be passed to the inner loops and f(x) is very quick.
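
For example, with the defaults above in place:
>>> f(1)
7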

Other options include unpacking tuples and dicts:
>>> opt = (1,2,3)
>>> f(1,*opt)
7

but then I have to pass both f and opt around, and this also appears
to be somewhat slower than the defaults method.  Dictionaries have
the advantage of associating the names with the values:

>>> opt = {'opt1':1, 'opt2':2, 'opt3':3}
>>> f(1,**opt)
7

but this is much slower.  Wrapping the function in a closure as you
suggest also works and packages everything together, but again it
suffers in performance and complicates my code.

The result of my declare_options decorator is a regular function,
complete with docstring etc., but with added annotations that allow
the options to be set.  In addition, the performance is optimal.  I
thought this was a very clean solution until I realized that I could
not make copies of the function (to allow for different option
values) with the usual Python copy semantics: a __copy__ method, for
example, is ignored.  I can easily get around this by adding a
custom copy() method, but I wondered whether there is anything
inherently dangerous about this approach that would justify the
added complexity of heavier wrappings and the performance hit.
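
The custom copy I have in mind is along these lines (only a sketch,
using a hand-rolled copy_function helper; the real thing needs more
care with the attached attributes):

>>> import types
>>> def copy_function(f):
...     # copy.copy() would just hand back f itself, because functions
...     # are treated as atomic, so rebuild the function object by hand.
...     g = types.FunctionType(f.func_code, f.func_globals, f.func_name,
...                            f.func_defaults, f.func_closure)
...     g.__dict__.update(f.__dict__)
...     # Attributes that close over the original f (like set_options
...     # in the sketch above) still refer to f and must be re-attached.
...     return g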

Pickling is an obvious issue, but it seems to me that there is
nothing wrong with copy semantics for functions and that the current
limitation is artificial and out of place.  (It is also easily
fixed: if the object has a __copy__ method, use it.  Truly immutable
objects will never have one.  There may be subtle issues here, but I
don't know what they are.)

Thanks for all of the suggestions,
Michael.



