[SciPy-Dev] GSoC2017: Constrained Optimisation in Scipy

Benny Malengier benny.malengier at gmail.com
Sat Mar 11 13:55:33 EST 2017


Just some thoughts,

I have seen quite a few methods leveraging CVODES from the SUNDIALS suite to
obtain the derivatives, e.g. CasADi and DOTcvpSB.

The odes scikit I maintain has a CVODE interface, but not yet CVODES, though
that should not be too hard to add. Other Python interfaces to SUNDIALS
might have it.

Independent of the interface, this gives you AD, but the work is in
generating the correct equations so as to leverage CVODES.

Benny

2017-03-10 20:40 GMT+01:00 Nikolay Mayorov <nikolay.mayorov at zoho.com>:

> Hi, Antonio!
>
> I too think that moving towards more modern algorithms and their
> implementations is good for scipy. I would be happy to mentor this project,
> and most likely I will be able to.
>
> Some current thoughts and questions:
>
> 1. Have you figured out what is done in SLSQP (especially the "least
> squares" part)? Do you plan to use a similar approach or a different
> approach to SQP? (I found there are several somewhat different
> approaches.) Settling on a literature reference (or, most likely, several
> of them) is essential.
>
> 2. I think it is not wrong to focus on a single solver if you feel that it
> will likely take the whole time. Or maybe you can prioritize: do one thing
> first for sure and then have alternatives, plan (a) to switch to another
> solver or plan (b) to improve/add something more minor.
>
> 3. Consider whether to fit a new solver into minimize or to make it a
> separate solver. The latter approach gives you the freedom to implement
> things exactly as you want (and not depend on old suboptimal choices), but
> I suspect some people would consider it impractical/inconsistent.
> Maybe it can be decided along the way.
>
> 4. I think it is important to start thinking about benchmark problems
> early, maybe even to start with them. It's hard to develop a complicated
> optimization algorithm without the ability to see right away how
> efficiently it works.
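To illustrate the point above, a micro-benchmark for an existing solver could be as simple as the sketch below. The test problem (Rosenbrock restricted to the unit ball) and the timing harness are purely illustrative choices, not anything prescribed in this thread:

```python
# Hypothetical micro-benchmark: time an existing constrained solver on one
# nonlinearly constrained test problem and record the objective reached.
import time

import numpy as np
from scipy.optimize import minimize, rosen


def benchmark(method, n=5):
    # Rosenbrock objective restricted to the unit ball ||x||^2 <= 1;
    # the unconstrained minimum (1, ..., 1) lies outside the ball,
    # so the nonlinear constraint is active at the solution.
    cons = [{"type": "ineq", "fun": lambda x: 1.0 - np.dot(x, x)}]
    start = time.perf_counter()
    res = minimize(rosen, np.zeros(n), method=method, constraints=cons)
    return res, time.perf_counter() - start


res, elapsed = benchmark("SLSQP")
print(f"SLSQP: f = {res.fun:.4g}, {elapsed * 1e3:.1f} ms")
```

A real benchmark suite would of course sweep problem sizes and solvers and compare against known optima, but even this shape gives immediate feedback while developing a new method.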
>
>
>
> ---- On Fri, 10 Mar 2017 23:01:32 +0500 *Matt Haberland
> <haberland at ucla.edu>* wrote ----
>
> The choice of nonlinear optimization algorithm can have a dramatic impact
> on the speed and quality of the solution, and the best choice for a
> particular problem can be difficult to determine a priori, so it is
> important to have multiple options available.
>
> My work in optimal control leads to problems with (almost entirely)
> nonlinear constraints, and the use of derivative information is essential
> for reasonable performance, leaving SLSQP as the only option in SciPy
> right now. However, the problems are also huge and very sparse with a
> specific structure, so SLSQP is not very effective, and not nearly as
> effective as a nonlinear optimization routine could be. So despite SciPy
> boasting 14 options for minimization of a nonlinear objective, it wasn't
> suitable for this work (without the use of an external solver).
>
> I think SciPy is in need of at least one solver designed to handle large,
> fully nonlinear problems, and having two would be much better. Interior
> point and SQP are good, complementary options.
>
> On Thu, Mar 9, 2017 at 2:38 PM, Antonio Ribeiro <antonior92 at gmail.com>
> wrote:
>
>
>
>
> --
> Matt Haberland
> Assistant Adjunct Professor in the Program in Computing
> Department of Mathematics
> 7620E Math Sciences Building, UCLA
> _______________________________________________
> SciPy-Dev mailing list
> SciPy-Dev at scipy.org
> https://mail.scipy.org/mailman/listinfo/scipy-dev
>
> Hello, my name is Antonio and I am a Brazilian electrical engineer
> currently pursuing my master's degree. I have contributed to scipy.optimize
> and scipy.signal, implementing the functions "iirnotch" and "iirpeak"
> <https://github.com/scipy/scipy/pull/6404> and the method
> "trust-region-exact" <https://github.com/scipy/scipy/pull/6919> (under
> review). I am interested in applying for the Google Summer of Code 2017
> to work on the SciPy optimisation package.
>
> My proposal is to improve scipy.optimize by adding optimisation methods
> that are able to deal with non-linear constraints. Currently, the only
> implemented methods able to deal with non-linear constraints are the
> FORTRAN wrappers SLSQP and COBYLA.
>
> SLSQP is a sequential quadratic programming method and COBYLA is a
> derivative-free optimisation method; both have their limitations.
> SLSQP cannot handle sparse Hessians and Jacobians and is unfit for
> large-scale problems. COBYLA, like other derivative-free methods, is a
> good choice for optimising noisy objective functions, but it usually
> performs worse than derivative-based methods when the derivatives are
> available (or even when they are computed by automatic differentiation
> or finite differences).
>
> My proposal is to implement in SciPy one or more state-of-the-art solvers
> (interior point and SQP methods) for constrained optimisation problems. I
> would like to get some feedback on this, discuss its relevance for
> SciPy, and get suggestions for possible mentors.
>