[SciPy-Dev] Constrained Optimization with Scipy
Masoud Mansouryar
masoud.mansouryar at yahoo.com
Tue Sep 19 06:37:12 EDT 2017
Hi Friends,
I am working on an optimization problem in Python, which is defined like this:
import scipy as sci
import scipy.optimize  # needed so that sci.optimize is available

def objective(x, sign=1.0):
    P_max = x[0]
    P_min = x[1]
    s_max = x[2]
    s_min = x[3]
    s_data, P_f, P_g = PG(a, b, P_max, P_min, P_f_max, s_max, s_min)
    A_tot = AM(s_data, c, d)
    a_cost_opt = a_costFunc(A_tot)
    g_cost_opt = g_costFunc(P_f, P_g)
    res_max = -(a_cost_opt + g_cost_opt)  # negated: minimizing this maximizes the total
    print(a_cost_opt + g_cost_opt)
    return res_max

x0 = (2.5, 2.5, .8, .2)  # starting point
bd0 = (1, 5)
bd1 = (1, 5)
bd2 = (0.7, 0.9)
bd3 = (0.1, 0.3)
bnds = (bd0, bd1, bd2, bd3)  # bounds
minimizer_kwargs = dict(method="L-BFGS-B", bounds=bnds)
sol = sci.optimize.basinhopping(objective, x0,
                                minimizer_kwargs=minimizer_kwargs,
                                niter=2, niter_success=5,
                                T=0.5, stepsize=.1, disp=True)
As you can see in my code, due to the complexity of my objective function I am using the basinhopping algorithm, and so far it has given me good results. The negative sign on res_max is there because I actually want a maximum (maximum interest): minimizing the negated objective is equivalent to maximizing the original one.
The functions called inside my objective function involve non-linear calculations, and all of their parameters are NumPy arrays.
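For reference, the maximize-by-negating pattern above can be sketched with a toy concave function (the `gain` function below is a hypothetical stand-in for the real cost functions, not part of the original code):

```python
import numpy as np
from scipy.optimize import basinhopping

# Toy concave function with its maximum at x = (1, 2).
def gain(x):
    return -(x[0] - 1.0) ** 2 - (x[1] - 2.0) ** 2

# Negate the objective: minimizing -gain(x) maximizes gain(x).
def neg_gain(x):
    return -gain(x)

bnds = ((0, 5), (0, 5))  # box bounds, one (low, high) pair per variable
minimizer_kwargs = dict(method="L-BFGS-B", bounds=bnds)
sol = basinhopping(neg_gain, x0=(0.5, 0.5),
                   minimizer_kwargs=minimizer_kwargs,
                   niter=50, seed=0)

print(sol.x)     # location of the maximum, close to [1. 2.]
print(-sol.fun)  # maximum value of gain (negate back), close to 0.0
```

Here `sol.fun` is the minimum of the negated objective, so negating it again recovers the maximum of the original function.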
But a few points about this optimization are still unclear to me:
1. Is my objective function convex, differentiable, non-linear, or discrete?
2. How can I tell whether my result is a global minimum rather than a local one? (I want the global minimum.)
3. Which algorithm is actually minimizing my function: basinhopping or L-BFGS-B?
4. Is scipy.optimize.basinhopping the best framework for what I need?
Your help is really appreciated.
Thank you,
Masoud