[Numpy-discussion] N dimensional dichotomy optimization

josef.pktd at gmail.com josef.pktd at gmail.com
Tue Nov 23 10:19:06 EST 2010


On Tue, Nov 23, 2010 at 8:50 AM, Gael Varoquaux
<gael.varoquaux at normalesup.org> wrote:
> On Tue, Nov 23, 2010 at 02:47:10PM +0100, Sebastian Walter wrote:
>> Well, I don't know what the best method is to solve your problem, so
>> take the following with a grain of salt:
>> Wouldn't it be better to change the model than modifying the
>> optimization algorithm?
>
> In this case, that's not possible. You can think of this parameter as the
> number of components in a PCA (it's actually a more complex dictionary
> learning framework), so it's a parameter that is discrete, and I can't do
> anything about it :).
>
>> It sounds as if the resulting objective function is piecewise
>> constant.
>
>> AFAIK most optimization algorithms for continuous problems require at
>> least Lipschitz continuous functions to work ''acceptably well''. Not
>> sure if this is also true for Nelder-Mead.
>
> Yes, correct. We do have a problem.
>
> I have a Nelder-Mead that seems to be working quite well on a few toy
> problems.

Assuming your function is well behaved, one possible idea is to
replace the integer objective function with a continuous
interpolation, or to fit a bell-shaped curve to a few grid points.
That might get you into the right neighborhood faster, where you can
then do an exact search.
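
To make the idea concrete, here is a minimal sketch (not your actual
model; the `objective` below is a hypothetical stand-in for an
expensive integer-valued criterion): fit a quadratic surrogate to a
few grid evaluations, minimize it in the continuous domain, then do an
exact search in a small integer neighborhood of the rounded optimum.

```python
import numpy as np

def objective(k):
    # Hypothetical stand-in for an expensive objective defined only
    # at integer k (e.g. number of components).
    return (k - 7) ** 2 + 3.0

# Evaluate on a coarse integer grid.
grid = np.array([2, 5, 8, 11, 14])
values = np.array([objective(k) for k in grid])

# Fit a quadratic as the continuous surrogate and take its vertex
# as the continuous minimizer.
a, b, c = np.polyfit(grid, values, 2)
k_cont = -b / (2 * a)

# Exact search in a small neighborhood of the rounded continuous optimum.
k0 = int(round(k_cont))
candidates = [k for k in range(k0 - 2, k0 + 3) if k >= 1]
k_best = min(candidates, key=objective)
print(k_best)  # 7 for this toy objective
```

A smoothing spline (scipy.interpolate) would serve the same purpose
when the objective is not close to quadratic; the point is only that
the surrogate is cheap to minimize and narrows down the exact search.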

(There are similar methods that use a surrogate objective function
when the true objective is very expensive or impossible to calculate,
but I have never looked closely at those cases.)

Josef

>
> Gaël
> _______________________________________________
> NumPy-Discussion mailing list
> NumPy-Discussion at scipy.org
> http://mail.scipy.org/mailman/listinfo/numpy-discussion
>
