[Numpy-discussion] N dimensional dichotomy optimization

Sebastian Walter sebastian.walter at gmail.com
Tue Nov 23 05:13:23 EST 2010


Hello Gael,

On Tue, Nov 23, 2010 at 10:27 AM, Gael Varoquaux
<gael.varoquaux at normalesup.org> wrote:
> On Tue, Nov 23, 2010 at 10:18:50AM +0100, Matthieu Brucher wrote:
>> > The problem is that I can't tell the Nelder-Mead that the smallest jump
>> > it should attempt is .5. I can set xtol to .5, but it still attempts jumps
>> > of .001 in its initial steps.
>
>> This is strange. It should not if the initial points are set
>> adequately. You may want to check whether the initial conditions make the
>> optimization start at correct locations.
>
> Yes, that's exactly the problem. And it is easy to see why: in
> scipy.optimize.fmin, around line 186, the initial points are chosen with
> a relative distance of 0.00025 to the initial guess that is given. That's
> not what I want in the case of integers :).
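
For reference, the fixed relative step described above can be avoided in
newer SciPy releases by supplying an explicit initial simplex spaced 0.5
apart. A minimal sketch, assuming scipy.optimize.minimize's Nelder-Mead
'initial_simplex' option; the objective f below is purely illustrative:

    import numpy as np
    from scipy.optimize import minimize

    def f(x):
        # illustrative bell-shaped objective with an integer optimum
        return (x[0] - 3) ** 2 + (x[1] + 1) ** 2

    x0 = np.array([0.0, 0.0])
    step = 0.5
    # N+1 simplex vertices: the guess itself plus one point offset by
    # `step` along each coordinate axis
    simplex = np.vstack([x0] + [x0 + step * e for e in np.eye(len(x0))])

    res = minimize(f, x0, method='Nelder-Mead',
                   options={'initial_simplex': simplex,
                            'xatol': 0.5, 'fatol': 0.5})
    print(res.x)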

I'm not familiar with dichotomy optimization.
Several techniques have been proposed to solve the problem: genetic
algorithms, simulated annealing, Nelder-Mead and Powell.
To be honest, I find it quite confusing that these algorithms are
mentioned in the same breath.
Do you have a continuous or a discrete problem?

Is your problem of the following form?

min_x  f(x)
s.t.   lo <= Ax + b <= up
        0  = g(x)
        0 <= h(x)

And if yes, in which space does x live?
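
If it is of that form, the constraints can be written down directly with
SciPy's constraint objects. A minimal sketch, assuming the
LinearConstraint/NonlinearConstraint API and the 'trust-constr' method
available in newer SciPy releases; f, A, b, g, h below are placeholders,
not taken from this thread:

    import numpy as np
    from scipy.optimize import minimize, LinearConstraint, NonlinearConstraint

    def f(x):                                   # objective:  min_x f(x)
        return (x[0] - 1.0) ** 2 + (x[1] - 2.0) ** 2

    A = np.array([[1.0, 1.0]])                  # lo <= A x + b <= up,
    b = np.array([0.0])                         # i.e. lo - b <= A x <= up - b
    lo, up = np.array([-1.0]), np.array([4.0])

    g = lambda x: x[0] - x[1]                   # equality:    0 = g(x)
    h = lambda x: x[0]                          # inequality:  0 <= h(x)

    constraints = [
        LinearConstraint(A, lo - b, up - b),
        NonlinearConstraint(g, 0.0, 0.0),       # g(x) == 0
        NonlinearConstraint(h, 0.0, np.inf),    # h(x) >= 0
    ]

    res = minimize(f, x0=np.array([0.0, 0.0]), method='trust-constr',
                   constraints=constraints)
    print(res.x)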

cheers,
Sebastian

>
>> > Of course optimization on integers is
>> > fairly ill-posed, so I am asking for trouble.
>
>> Indeed :D That's why GA can be a good solution as well.
>
> It's suboptimal if I know that my function is bell-shaped.
>
> Gael
> _______________________________________________
> NumPy-Discussion mailing list
> NumPy-Discussion at scipy.org
> http://mail.scipy.org/mailman/listinfo/numpy-discussion
>
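
On the bell-shaped (unimodal) integer case mentioned above, a plain
dichotomy over the integers already does well. A minimal 1-D sketch of an
integer ternary search; the function and names are illustrative only, not
code from this thread:

    def int_ternary_search_min(f, lo, hi):
        """Return an integer x in [lo, hi] minimizing a unimodal f."""
        while hi - lo > 2:
            m1 = lo + (hi - lo) // 3
            m2 = hi - (hi - lo) // 3
            if f(m1) <= f(m2):
                hi = m2       # minimum cannot lie to the right of m2
            else:
                lo = m1       # minimum cannot lie to the left of m1
        return min(range(lo, hi + 1), key=f)

    # example: discrete parabola, minimum at x = 7
    print(int_ternary_search_min(lambda x: (x - 7) ** 2, 0, 100))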


