Analytical test case # 1
In this example, we consider a simple optimization problem to illustrate the optimization algorithm interfaces and the MDOFunction class.
Imports
from __future__ import annotations
from numpy import cos
from numpy import exp
from numpy import ones
from numpy import sin
from scipy import optimize
from gemseo import configure_logger
from gemseo.core.mdo_functions.mdo_function import MDOFunction
configure_logger()
<RootLogger root (INFO)>
Define the objective function
We define the objective function \(f(x)=\sin(x)-\exp(x)\) using an MDOFunction defined as the difference of two MDOFunction objects.
f_1 = MDOFunction(sin, name="f_1", jac=cos, expr="sin(x)")
f_2 = MDOFunction(exp, name="f_2", jac=exp, expr="exp(x)")
objective = f_1 - f_2
See also
The following operators are implemented: \(+\), \(-\) and \(*\). The unary minus operator \(-f\) is also defined.
objective
[f_1-f_2] = sin(x)-exp(x)
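As a quick sanity check, the composed function can be queried through the same evaluate and jac entry points that are passed to the optimizer below; this is a minimal sketch comparing them with the closed-form value \(\sin(x)-\exp(x)\) and derivative \(\cos(x)-\exp(x)\):
from numpy import allclose
x_test = ones(1)
# The composed MDOFunction should reproduce the closed-form value and derivative.
assert allclose(objective.evaluate(x_test), sin(x_test) - exp(x_test))
assert allclose(objective.jac(x_test), cos(x_test) - exp(x_test))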
Minimize the objective function
We want to minimize this objective function over \([-0.2, 2]\), starting from \(x_0=-1\). We use scipy.optimize for illustration.
Note
MDOFunction objects are callable like a Python function.
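To illustrate this, the sketch below calls the objective directly and compares the result with evaluate; it assumes the call syntax simply forwards to evaluate:
from numpy import allclose
x_check = -ones(1)
# Calling the MDOFunction directly is assumed to forward to evaluate().
assert allclose(objective(x_check), objective.evaluate(x_check))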
x_0 = -ones(1)
# Bound-constrained minimization of the objective with its analytic Jacobian.
opt = optimize.fmin_l_bfgs_b(
    objective.evaluate, x_0, fprime=objective.jac, bounds=[(-0.2, 2.0)]
)
opt
(array([-0.2]), -1.017400083873043, {'grad': array([0.16133582]), 'task': 'CONVERGENCE: NORM_OF_PROJECTED_GRADIENT_<=_PGTOL', 'funcalls': 1, 'nit': 0, 'warnflag': 0})
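The solver stops after a single function call: the initial guess \(-1\) lies outside the bounds, so L-BFGS-B projects it onto the lower bound \(-0.2\), where the gradient \(\cos(-0.2)-\exp(-0.2)\approx 0.16\) is positive; any further decrease would require leaving the feasible interval, so the projected gradient is zero and the bound-constrained minimum is \(x^*=-0.2\) with \(f(x^*)\approx-1.0174\). As a further sketch, the analytic Jacobian used by the solver can be cross-checked against a central finite difference of objective.evaluate at the returned solution (the step size below is an arbitrary choice):
from numpy import allclose
x_star = opt[0]  # solution returned by fmin_l_bfgs_b
h = 1e-6  # finite-difference step, chosen here for illustration
# Central finite difference of the objective at the solution.
fd_jac = (objective.evaluate(x_star + h) - objective.evaluate(x_star - h)) / (2 * h)
assert allclose(fd_jac, objective.jac(x_star), atol=1e-6)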