[SciPy-Dev] Stochastic branch and bound optimization algorithm

ROSA Benoit b.rosa at unistra.fr
Fri Jul 21 09:41:30 EDT 2017


Hi,

Your reply comes just at the right time, since I had a bit of time to work on this today :)

I tried to set up the benchmarks for this too, but I ran into some problems when running them.

Long story short, the benchmark fails to run, throwing an error from the ASV package (see https://pastebin.com/0tnN2hQE for the exact details). I don't have any experience with this, so it's quite hard for me to debug. More importantly, it also fails on a fresh clone of the main scipy repository.

What I tried: 

python runtests.py --bench optimize.BenchGlobal --> fails on both my modified version (which adds stochasticBB testing) and on the original scipy repository (errors described in the pastebin above)

python runtests.py --bench optimize.BenchLeastSquares --> works flawlessly 

python runtests.py --bench optimize.BenchSmoothUnbounded --> works flawlessly

Is there something I missed, or is the benchmarking pipeline for the global optimization algorithms broken at the moment?

Best,
Benoit

On Friday, July 21, 2017 06:16 CEST, Ralf Gommers <ralf.gommers at gmail.com> wrote: 
 
> Hi Benoit,
> 
> Your email fell through the cracks, sorry for the slow reply.
> 
> On Sat, Jun 10, 2017 at 2:46 AM, Benoit Rosa <b.rosa at unistra.fr> wrote:
> 
> > Hi all,
> >
> > I recently implemented a stochastic branch and bound algorithm in python
> > for one of my projects (namely, 3D registration without an initial guess,
> > requiring global optimization over the SE(3) group).
> >
> > After trying basinhopping without much success, I went for a modified
> > stochastic branch and bound algorithm, described in these two papers
> > (disclaimer: I'm an author of one of them):
> > [1] C. Papazov and D. Burschka, Stochastic global optimization for robust
> > point set registration, Comput. Vis. Image Understand. 115 (12) (2011)
> > 1598-1609
> > [2] C. Gruijthuijsen, B. Rosa, P.T. Tran, J. Vander Sloten, E. Vander
> > Poorten, D. Reynaerts, An automatic registration method for radiation-free
> > catheter navigation guidance, J. Med. Robot Res. 1 (03) (2016), 1640009
> >
> > Since I wanted to compare with other optimization algorithms, I
> > implemented things by modifying the basinhopping file, and as such my
> > implementation (git commit here: https://github.com/benoitrosa/
> > scipy/commit/49a2c23b74b69dc4250e20e21db75bd071dfd92d ) is fully
> > compatible with Scipy already. I have added a bit of documentation in the
> > file too. To give you an idea, on my project (for which I can't share the
> > code now), this algorithm finds the global optimum over the SO(3) space (i.e.
> > rotations) in 4.5 seconds on average, whereas basinhopping takes more than 15
> > seconds and doesn't necessarily converge to the correct solution.
> >
> 
> That sounds promising. It also sounds quite specific though, so my first
> question would be how performance looks on a wider range of problems.
> For scipy.optimize there's an extensive set of benchmarks (in the repo
> under benchmarks/), new optimizers should be added to that and should
> perform better than what's already present in scipy.
> 
> 
> >
> > More about the algorithm itself: it's a common branch and bound algorithm
> > (i.e. recursive traversal of a tree and bisecting of leaves to find an
> > optimum), with two additions:
> >
> >     1 - The tree is traversed stochastically. This is governed by a
> > temperature parameter, much like in simulated annealing algorithms. At each
> > node, there is a probability of going towards a "less promising" branch
> > (i.e. one with a higher cost function value), this probability being
> > governed by the temperature parameter. In the beginning, "bad" branches
> > will be selected, ensuring exploration of large portions of the space. As
> > the number of iterations increases the temperature decreases, and the
> > algorithm moves more towards the most promising branches/leaves, achieving
> > higher resolution in that part of the space.
> >
> >     2 - When creating a new leaf and evaluating the cost function there,
> > the value is propagated back up the tree, until reaching the root or a node
> > with a lower cost. This ensures a smart traversal of the tree. Moreover, it
> > also guarantees that at any time the root node holds the lowest cost, making
> > it easy to query for the best solution if the algorithm is interrupted.
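
The two steps above can be sketched in Python. This is a minimal 1-D toy written from the description in this email, not the implementation in the linked commit: nodes are intervals, the traversal picks the "worse" child with a Boltzmann-like probability exp(-gap/T), and new leaf costs are propagated back towards the root.

```python
import math
import random

def stochastic_bb(func, bounds, n_iter=200, t_start=1.0, t_stop=1e-3, seed=0):
    """Toy 1-D stochastic branch and bound sketch.

    Step 1: stochastic descent, sometimes taking the less promising child
    with a temperature-dependent probability.
    Step 2: propagate new leaf costs back up, so the root always holds
    the best (cost, x) pair found so far.
    """
    rng = random.Random(seed)
    lo, hi = bounds
    x0 = rng.uniform(lo, hi)
    # node layout: [lo, hi, best_cost, best_x, left_child, right_child]
    root = [lo, hi, func(x0), x0, None, None]

    for k in range(n_iter):
        # exponential cooling from t_start down to t_stop
        temp = t_start * (t_stop / t_start) ** (k / max(n_iter - 1, 1))
        node, path = root, [root]
        while node[4] is not None:  # descend to a leaf
            left, right = node[4], node[5]
            better, worse = (left, right) if left[2] <= right[2] else (right, left)
            gap = worse[2] - better[2]
            # take the "less promising" branch with probability exp(-gap/T)
            node = worse if rng.random() < math.exp(-gap / temp) else better
            path.append(node)
        # bisect the leaf, sampling one point in each half
        mid = 0.5 * (node[0] + node[1])
        xl = rng.uniform(node[0], mid)
        xr = rng.uniform(mid, node[1])
        node[4] = [node[0], mid, func(xl), xl, None, None]
        node[5] = [mid, node[1], func(xr), xr, None, None]
        best = node[4] if node[4][2] <= node[5][2] else node[5]
        # propagate towards the root, stopping at a node with a lower cost
        for anc in reversed(path):
            if best[2] < anc[2]:
                anc[2], anc[3] = best[2], best[3]
            else:
                break
    return root[2], root[3]
```

Because of step 2, querying the root at any time returns the best solution found so far, which is what makes the algorithm easy to interrupt.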
> >
> > Another advantage of this algorithm is that parameter tuning is pretty
> > easy. It only needs the bounds of the search space (no initial guess), a
> > sampling function (default is uniform) and a cost function (obviously).
> > Additional parameters are the number of iterations, and the start and stop
> > temperatures (the start temperature should be high to accept many "bad"
> > solutions early on, and the stop temperature should tend towards zero).
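
The start/stop temperature pair is typically realized as a cooling schedule. As an illustration only (the papers and the linked commit may use a different law), a geometric schedule decays from the start to the stop temperature over the given number of iterations:

```python
def cooling_schedule(t_start, t_stop, n_iter):
    """Geometric cooling: T_k = t_start * (t_stop / t_start)**(k / (n_iter - 1)).

    T_0 equals t_start and T_{n_iter-1} equals t_stop; requires n_iter >= 2.
    """
    return [t_start * (t_stop / t_start) ** (k / (n_iter - 1))
            for k in range(n_iter)]

temps = cooling_schedule(10.0, 1e-3, 100)
```

A high start temperature makes nearly every branch acceptable early on (broad exploration), while the near-zero stop temperature makes the traversal effectively greedy at the end.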
> >
> > If that algorithm is of interest to the SciPy community, I'd be happy to
> > clean up the code and make a pull request!
> >
> 
> In principle there's interest, if it brings either better performance than
> the optimizers we have now, or if it solves a class of problems that are
> not handled well by existing solvers.
> 
> Cheers,
> Ralf

More information about the SciPy-Dev mailing list