[SciPy-Dev] Proposal - optimize callback given a state keyword

Scott Sievert sievert.scott at gmail.com
Thu Feb 8 12:19:56 EST 2018


> we exposed some form of Optimizer object, that would know how to perform
> a single optimization step


I think that's a wonderful idea, and one that has a lot of historical
precedent

Another +1, and I’d be happy to help implement this. More precedent is that
popular deep learning libraries implement their optimizers as classes
(PyTorch source <http://pytorch.org/docs/0.3.0/optim.html>, TensorFlow source
<https://www.tensorflow.org/versions/r0.12/api_docs/python/train/optimizers>).
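
For concreteness, the pattern those libraries expose looks roughly like the
following (a minimal PyTorch-style sketch; the model and data are just
placeholders):

import torch

model = torch.nn.Linear(2, 1)
opt = torch.optim.SGD(model.parameters(), lr=0.1)

for step in range(100):
    xb = torch.randn(8, 2)            # placeholder batch
    yb = torch.randn(8, 1)
    opt.zero_grad()
    loss = torch.nn.functional.mse_loss(model(xb), yb)
    loss.backward()
    opt.step()                        # exactly one optimization step
    # because the user owns the loop, logging, learning-rate changes,
    # or early stopping can all happen right here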

From where I sit this is hugely beneficial because it allows interacting
with an optimization as an object. I’m involved in optimization research,
and I often want to compute some value after an optimization step, or
modify the step size at every iteration, or something similar.
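
As a hypothetical sketch of what I mean, assuming the proposed `state`
keyword existed (it does not today) and carried the usual OptimizeResult
fields:

from scipy.optimize import minimize, rosen

history = []

def callback(x, state):
    # record the iteration count and objective value after every step;
    # per-iteration tweaks (step size, plotting, ...) could also go here
    history.append((state.nit, state.fun))

minimize(rosen, x0=[1.3, 0.7], method='BFGS', callback=callback)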

I would also like this in the scipy.optimize interface; its absence is part
of why I have moved away from scipy.optimize. I started my own
implementation, then moved to PyTorch.

Scott

On February 8, 2018 at 8:13:58 AM, Joshua Wilson (
josh.craig.wilson at gmail.com) wrote:

> we exposed some form of Optimizer object, that would know how to perform
> a single optimization step

I think that's a wonderful idea, and one that has a lot of historical
precedent since it's generally how Fortran optimizers worked because
there were no function pointers. (Which conveniently means a lot of
things in SciPy already work that way under the hood.)

Just a random comment from the peanut gallery,
Josh

On Thu, Feb 8, 2018 at 6:37 AM, Jaime Fernández del Río
<jaime.frio at gmail.com> wrote:
> On Thu, Feb 8, 2018 at 3:06 AM Andrew Nelson <andyfaff at gmail.com> wrote:
>>
>> Hi all,
>> I propose that the optimize callbacks be asked to accept a new state
>> keyword in the future.
>> The state keyword will be an `OptimizeResult` giving the state of the
>> optimizer at that point.
>> i.e. n_iterations, n_fun, (n_jac?), x, fun, etc.
>>
>> To aid the transition I propose that the user callbacks be wrapped with
>> something along the lines of:
>>
>> import warnings
>>
>> class OptimizerCallback(object):
>>     def __init__(self, callback):
>>         self.callback = callback
>>         # _getargspec: assumed to be scipy's internal argspec helper
>>         argspec = _getargspec(callback)
>>         self.accepts_state = True
>>         if 'state' not in argspec.args and argspec.varkw is None:
>>             warnings.warn("In future versions of scipy the optimizer"
>>                           " callback should accept a state kwd",
>>                           FutureWarning)
>>             self.accepts_state = False
>>
>>     def __call__(self, x, state):
>>         if self.accepts_state:
>>             self.callback(x, state)
>>         else:
>>             self.callback(x)
>>
>> and the optimizers be modified appropriately.
>>
>> Thoughts, comments?
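
(To make this concrete, here is a toy sketch of how an optimizer's inner
loop could use that wrapper. Everything below — `_toy_descent` and the loop
itself — is made up for illustration and is not actual scipy code.)

import numpy as np
from scipy.optimize import OptimizeResult

def _toy_descent(fun, grad, x0, callback, maxiter=100, lr=0.1):
    # made-up gradient-descent loop showing where the wrapped callback
    # would be invoked with a `state` OptimizeResult
    x = np.asarray(x0, dtype=float)
    wrapped = OptimizerCallback(callback)   # the wrapper proposed above
    nfev = 0
    for k in range(maxiter):
        x = x - lr * grad(x)
        fval = fun(x)
        nfev += 1
        state = OptimizeResult(x=x, fun=fval, nit=k + 1, nfev=nfev)
        wrapped(x, state)                   # old- and new-style callbacks both work
    return OptimizeResult(x=x, fun=fun(x), nit=maxiter, nfev=nfev + 1)
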
>
> If we do implement callback on steroids, I'd also suggest:
>
> - Letting that class wrap a None (which would make calling it a no-op)
>   would allow us to get rid of all those `if callback is not None:` in
>   the optimizer loops.
> - Another interesting addition for such an OptimizeResult would be to
>   make the typical stopping conditions (gradient norm, relative
>   function change...) available. This would be especially useful if we
>   allowed the callback to terminate the optimization early, e.g. by
>   returning False instead of None.
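
(A rough, purely hypothetical sketch of those two ideas; `NoOpCallback`
and `_run_loop` are made-up names, not scipy API.)

class NoOpCallback(object):
    # made-up variant of the wrapper: wrapping None turns the call into a
    # no-op, and the user callback's return value is passed through so an
    # optimizer loop can stop early when it returns False
    def __init__(self, callback):
        self.callback = callback

    def __call__(self, x, state):
        if self.callback is None:
            return None
        return self.callback(x, state)

def _run_loop(step, x, callback=None, maxiter=100):
    wrapped = NoOpCallback(callback)
    for k in range(maxiter):
        x, state = step(x)                # one step of some optimizer
        if wrapped(x, state) is False:
            break                         # callback requested early termination
    return x
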
>
> While I think all of this would be a step in the right direction, I
> wonder if cramming more functionality into the callback is the way to
> go. After a few months of using optimize A LOT at work, I am pretty
> much convinced that the architecture of the library is, let's call it,
> suboptimal... I haven't fully fleshed it out, but I think it would be
> much better if, instead of black box functions that run through to
> convergence, with only the callback function to shed light on what may
> be going on, we exposed some form of Optimizer object, that would know
> how to perform a single optimization step. There should still be a way
> to run these to convergence automatically, but it would allow advanced
> users to do things that are now impossible, like e.g. running two
> different algorithms on the same problem with the exact same
> termination conditions.
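
(Again purely hypothetical, but the object-based interface being described
might look something like the sketch below; none of these names exist in
scipy.optimize today.)

import numpy as np

class GradientDescent(object):
    # made-up stepwise optimizer exposing a single-step interface
    def __init__(self, fun, grad, x0, lr=0.1):
        self.fun = fun
        self.grad = grad
        self.lr = lr
        self.x = np.asarray(x0, dtype=float)
        self.nit = 0

    def step(self):
        # advance the optimizer by exactly one iteration
        self.x = self.x - self.lr * self.grad(self.x)
        self.nit += 1
        return self.x

# The caller owns the termination test, so two different algorithms could
# be driven with exactly the same stopping condition, e.g.:
#
#     opt = GradientDescent(fun, grad, x0)
#     while np.linalg.norm(grad(opt.x)) > 1e-6 and opt.nit < 1000:
#         opt.step()
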
>
> Jaime
>
_______________________________________________
SciPy-Dev mailing list
SciPy-Dev at python.org
https://mail.python.org/mailman/listinfo/scipy-dev