[SciPy-Dev] GSoC SciPy Optimization Ideas

Mazen Sayed sayedmazen70 at gmail.com
Sun Apr 11 10:32:16 EDT 2021


Dear all,

For the first point, 50 was just an example; the number of points would be a
hyperparameter that the user defines first. But I agree with you that it may
be of little use, especially in higher dimensions.
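Just to make the idea concrete, below is roughly what I have in mind (only a
sketch, not proposed code: the objective and bounds are made up, n_points is
the user-chosen hyperparameter, and the sampler could just as well be the
Latin hypercube one Daniel mentions below instead of plain uniform sampling):

    import numpy as np
    from scipy.optimize import minimize, rosen

    def smart_x0(fun, bounds, n_points=50, seed=None):
        # Evaluate the objective at n_points uniform samples inside the
        # bounds and return the best sample as the starting point.
        rng = np.random.default_rng(seed)
        lo, hi = np.array(bounds, dtype=float).T
        samples = rng.uniform(lo, hi, size=(n_points, len(bounds)))
        return min(samples, key=fun)

    bounds = [(-5.0, 5.0)] * 4                    # made-up bounds
    x0 = smart_x0(rosen, bounds, n_points=50, seed=1)
    res = minimize(rosen, x0, method="L-BFGS-B", bounds=bounds)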
For the second point, many thanks for the resources; I will read them
carefully.
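Just to check my own understanding of the reformulation idea (and of the
distinction Andrea makes below between points where the objective *cannot* be
evaluated and points we merely do not want to visit), this is the kind of
wrapper I am thinking of; only a sketch, with a made-up objective and a
made-up constraint g(x) <= 0, not a proposed API:

    import numpy as np
    from scipy.optimize import minimize

    def fun(x):                      # made-up objective
        return (x[0] - 1.0)**2 + (x[1] - 2.5)**2

    def g(x):                        # made-up constraint, feasible when g(x) <= 0
        return x[0] + x[1] - 2.0

    def penalized(x, rho=1e3):
        # Soft penalty: the objective is still evaluated at infeasible
        # points, they are just made increasingly unattractive.  An
        # augmented-Lagrangian scheme would re-solve with updated
        # multipliers / a growing rho instead of one fixed value.
        return fun(x) + rho * max(0.0, g(x))**2

    def rejecting(x):
        # Hard rejection: for problems where the objective cannot be
        # evaluated at all outside the feasible region.
        return np.inf if g(x) > 0 else fun(x)

    res = minimize(penalized, x0=[0.0, 0.0], method="Nelder-Mead")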
For the third point, I agree with you that anyone who has studied
optimization should normalize first; I was suggesting that we could add
normalization as a feature, so that the user can pass a normalize parameter
and have this step done for them.
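Concretely, this is the kind of thing I imagine a normalize option doing
(again only a sketch with a made-up objective and bounds; the real feature
would have to decide what to do when no bounds are supplied):

    import numpy as np
    from scipy.optimize import minimize

    def normalized_minimize(fun, x0, bounds, **kwargs):
        # Optimize in unit-scaled variables z in [0, 1]^n and map back to
        # the original units only when calling the user's objective.
        lo, hi = np.array(bounds, dtype=float).T
        to_x = lambda z: lo + z * (hi - lo)
        z0 = (np.asarray(x0, dtype=float) - lo) / (hi - lo)
        res = minimize(lambda z: fun(to_x(z)), z0,
                       bounds=[(0.0, 1.0)] * len(bounds), **kwargs)
        res.x = to_x(res.x)          # report the solution in the original units
        return res

    # variables spanning very different orders of magnitude
    fun = lambda x: (x[0] - 2e-6)**2 + (x[1] - 3e6)**2
    res = normalized_minimize(fun, x0=[0.0, 0.0],
                              bounds=[(-1e-5, 1e-5), (-1e7, 1e7)],
                              method="L-BFGS-B")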

Thanks,
Mazen


On Sun, Apr 11, 2021 at 4:20 PM Andrea Gavana <andrea.gavana at gmail.com>
wrote:

> Hi,
>
> On Sun, 11 Apr 2021 at 15.09, Daniel Schmitz <
> danielschmitzsiegen at googlemail.com> wrote:
>
>> Hey Sayed,
>>
>> My two cents, not being a core developer but a Python developer interested
>> in optimization algorithms.
>>
>> The automatic reformulation of the constrained problem into an
>> unconstrained problem sounds similar to NLopt's augmented Lagrangian:
>> https://nlopt.readthedocs.io/en/latest/NLopt_Algorithms/#augmented-lagrangian-algorithm
>> I think this would be a great addition to scipy.optimize. I imagine that
>> you would then pass the reformulated objective to minimize and just reuse
>> the existing algorithms.
>>
>> One objection to your idea about "smart initialization": why exactly 50
>> points, and how exactly would they be sampled if no bounds are provided?
>> In theory, a grid search over samples generated by, for example, Latin
>> hypercube sampling within a bounded volume could be a better initialization
>> than a random guess, but I am not sure that this is a good idea in many
>> cases. If you have no idea how to initialize your optimizer, I would go for
>> one of the global optimizers.
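>>
>> E.g., purely as an illustration, with SciPy's own Rosenbrock test function
>> as the objective and made-up bounds:
>>
>>     from scipy.optimize import differential_evolution, rosen
>>
>>     # no initial guess needed: the initial population is sampled
>>     # inside the user-supplied bounds
>>     res = differential_evolution(rosen, bounds=[(-5, 5)] * 4, seed=1)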
>>
>
> My 2 cents too - I’m not a SciPy developer but I am passionate about
> optimization.
>
> I tend to agree with Daniel here: randomly choosing 50 points in a
> high-dimensional optimization space is not going to give any advantage. And
> why 50?
>
> The initialization part is one of the most important (and most difficult to
> get right) parts of any optimization algorithm, but this is mostly true for
> global ones: differential evolution, SHGO and dual annealing all have their
> own way of doing it. Some of these and many others (especially local
> algorithms) rely on the user to explicitly pass an initial guess and take it
> from there.
>
> As for the penalty approach, I do agree it would be a nice addition: you
> may want to take a look - whether for inspiration or out of curiosity - at
> the Mystic library (https://github.com/uqfoundation/mystic). I believe the
> author has covered most use cases in terms of penalty/barrier methods.
> Speaking of penalty/barrier approaches, you may also wish to consider
> whether a constraint violation results in a point at which the objective
> function *cannot* be evaluated, or simply a point you don't want your
> algorithm to go to (but where the objective function can be evaluated).
>
> As for the normalization/denormalization concept, I think it should
> definitely be a feature of all algorithms - and honestly I would expect
> anyone knowledgeable about optimization to always apply some sort of
> normalization when the parameter values span multiple orders of magnitude.
> There are already quite a few algorithms that apply normalization internally
> no matter what, and of course this process helps in the vast majority of
> optimization problems (why make your algorithm sweat handling variables that
> span 10 orders of magnitude?).
>
> Andrea.
>
>> Best,
>>
>> Daniel
>>
>> On Sun, 11 Apr 2021 at 14:41, Mazen Sayed <sayedmazen70 at gmail.com> wrote:
>>
>>> Dear all,
>>>
>>> I hope this email finds you well. This is my proposal for the
>>> scipy.optimize project; I'm really interested in working on it.
>>>
>>> Thanks
>>>
>>>
>>> https://drive.google.com/file/d/12Q6NnorN74VkuQw_HRx2kuY-FIoK90V0/view?usp=sharing
>>>
> _______________________________________________
> SciPy-Dev mailing list
> SciPy-Dev at python.org
> https://mail.python.org/mailman/listinfo/scipy-dev
>