Optimization and DOE framework¶
In this section we describe GEMSEO’s optimization and DOE framework.
MDO formulation and optimization problem developers should also understand this part of GEMSEO.
Setting up an optimization problem¶
First, a design space is created and a scalar design variable "x" is added, with its lower bound, upper bound and current value:

from gemseo.api import create_design_space
from numpy import ones

design_space = create_design_space()
design_space.add_variable("x", 1, l_b=-2., u_b=2., value=-0.5 * ones(1))
Then, the objective function is built from MDOFunction objects, which wrap a callable together with its Jacobian and an optional expression, and which support arithmetic operations such as subtraction:

import numpy as np
from gemseo.algos import MDOFunction

f_1 = MDOFunction(np.sin, name="f_1", jac=np.cos, expr="sin(x)")
f_2 = MDOFunction(np.exp, name="f_2", jac=np.exp, expr="exp(x)")
f_1_sub_f_2 = f_1 - f_2
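The subtraction above produces a new function whose value and Jacobian are combined pointwise. Conceptually, and independently of GEMSEO, the mechanism can be sketched with a minimal stand-in class (the name Func is hypothetical, introduced only for illustration):

```python
import numpy as np

# Minimal stand-in for a function-with-Jacobian pair (hypothetical class,
# not GEMSEO's MDOFunction)
class Func:
    def __init__(self, f, jac):
        self.f, self.jac = f, jac

    def __sub__(self, other):
        # Both the values and the Jacobians are subtracted pointwise
        return Func(lambda x: self.f(x) - other.f(x),
                    lambda x: self.jac(x) - other.jac(x))

f_1 = Func(np.sin, np.cos)
f_2 = Func(np.exp, np.exp)
diff = f_1 - f_2

x = 0.0
print(diff.f(x), diff.jac(x))  # sin(0) - exp(0) = -1.0, cos(0) - exp(0) = 0.0
```

The resulting object can be evaluated and differentiated like any other function, which is what makes expressions such as f_1 - f_2 usable as an objective.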
The OptimizationProblem is then created from the design space, and its objective is set:

from gemseo.algos import OptimizationProblem

problem = OptimizationProblem(design_space)
problem.objective = f_1_sub_f_2
The OptimizationProblem.constraints attribute must be set with a list of inequality or equality constraints. The MDOFunction.f_type attribute of each constraint function shall be set to "eq" or "ineq" to declare whether the constraint is an equality or an inequality. By convention, all inequality constraints must be negative when satisfied, whatever the optimization algorithm used to solve the problem.
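To make this sign convention concrete, here is a minimal, GEMSEO-independent sketch (the helper is_feasible and the constraint g are hypothetical names introduced for illustration): an inequality constraint g is satisfied at x when g(x) <= 0.

```python
import numpy as np

def is_feasible(x, ineq_constraints, tol=1e-6):
    # Negative convention: every inequality constraint g must satisfy g(x) <= 0
    return all(np.all(g(x) <= tol) for g in ineq_constraints)

# Hypothetical constraint expressing sin(x) <= 0.5, written in negative form
g = lambda x: np.sin(x) - 0.5

print(is_feasible(np.array([0.0]), [g]))  # sin(0) - 0.5 = -0.5 <= 0, feasible
print(is_feasible(np.array([1.0]), [g]))  # sin(1) - 0.5 > 0, infeasible
```

Writing a constraint such as sin(x) <= 0.5 in the form g(x) = sin(x) - 0.5 <= 0 is the rewriting expected by the convention.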
Solving the problem by optimization¶
Once the optimization problem is created, it can be solved using one of the available optimization algorithms from the OptimizersFactory, by means of its execute() method, whose mandatory arguments are the OptimizationProblem and the optimization algorithm name. For example, in the case of the L-BFGS-B algorithm
with a normalized design space, we have:
from gemseo.algos import OptimizersFactory

opt = OptimizersFactory().execute(problem, "L-BFGS-B", normalize_design_space=True)
print("Optimum = " + str(opt))
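As an aside, the effect of normalize_design_space can be sketched independently of GEMSEO: the optimizer works on the unit hypercube, and iterates are mapped back to the bounded design space. A minimal illustration, using the bounds of the design variable "x" defined above (the helper names normalize and unnormalize are hypothetical, not GEMSEO's API):

```python
import numpy as np

l_b, u_b = np.array([-2.0]), np.array([2.0])

def normalize(x):
    # Map a design vector from [l_b, u_b] onto the unit hypercube [0, 1]
    return (x - l_b) / (u_b - l_b)

def unnormalize(u):
    # Inverse mapping, back to the physical design space
    return l_b + u * (u_b - l_b)

x = np.array([-0.5])
print(normalize(x))  # [0.375]
assert np.allclose(unnormalize(normalize(x)), x)
```

Working on [0, 1] puts all design variables on a comparable scale, which generally helps gradient-based algorithms such as L-BFGS-B.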
The list of available algorithms depends on the local setup of GEMSEO and on the installed optimization libraries. It can be obtained using:
algo_list = OptimizersFactory().algorithms
print("Available algorithms: " + str(algo_list))
The optimization history can be saved to the disk for further analysis,
without having to re-execute the optimization.
For that, we use the OptimizationProblem.export_hdf() function.
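GEMSEO itself writes HDF5 files (e.g. the "my_optim.hdf5" file reloaded below). As a generic stand-in that illustrates the persist-then-analyze workflow with plain numpy (the file name and history arrays here are made up for illustration):

```python
import os
import tempfile
import numpy as np

# Hypothetical optimization history: one row per iteration
x_hist = np.array([[-0.5], [-1.0], [-1.3]])
f_hist = np.sin(x_hist) - np.exp(x_hist)

path = os.path.join(tempfile.mkdtemp(), "history.npz")
np.savez(path, x=x_hist, f=f_hist)

# In a later analysis session: reload without re-running the optimization
data = np.load(path)
assert np.allclose(data["f"], np.sin(data["x"]) - np.exp(data["x"]))
print(data["x"].shape)  # (3, 1)
```

The point is that the history on disk contains everything needed for post-processing, so the (possibly expensive) optimization never has to be repeated.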
Solving the problem by DOE¶
DOE algorithms can also be used to sample the design space and observe the values of the objective and constraints:
from gemseo.algos import DOEFactory

# And solve it with the GEMSEO interface
opt = DOEFactory().execute(problem, "lhs", n_samples=10, normalize_design_space=True)
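Here "lhs" refers to latin hypercube sampling. To illustrate the idea behind it (a self-contained numpy sketch, not GEMSEO's implementation): each axis of the unit hypercube is split into n_samples strata, exactly one point is drawn per stratum, and the strata are shuffled independently per axis.

```python
import numpy as np

def lhs(n_samples, dim, seed=None):
    # One point per stratum on each axis, strata shuffled per axis
    rng = np.random.default_rng(seed)
    u = (np.arange(n_samples)[:, None] + rng.random((n_samples, dim))) / n_samples
    for j in range(dim):
        u[:, j] = u[rng.permutation(n_samples), j]
    return u

samples = lhs(10, 2, seed=0)
# Every axis is fully covered: stratum indices are a permutation of 0..9
print(np.sort((samples * 10).astype(int), axis=0)[:, 0])  # [0 1 2 3 4 5 6 7 8 9]
```

This stratification guarantees a space-filling one-dimensional projection on every axis, which is why LHS is a common default for sampling design spaces.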
The optimization history can be plotted using one of the post-processing tools; see the post-processing page.
from gemseo.api import execute_post

execute_post(problem, "OptHistoryView", save=True, file_path="simple_opt")

# Also works from the disk
execute_post("my_optim.hdf5", "OptHistoryView", save=True, file_path="opt_view_from_disk")