gemseo.algos.optimization_problem module#
Optimization problem.
The OptimizationProblem class operates on a DesignSpace defining:
- an initial guess \(x_0\) for the design variables,
- the bounds \(l_b \leq x \leq u_b\) of the design variables.
A (possibly vector-valued) objective function of MDOFunction type is set using the objective attribute.
If the optimization problem seeks the maximum of this objective function, the OptimizationProblem.minimize_objective property changes the sign of the objective function, because the optimization drivers always minimize.
Equality and inequality constraints are also MDOFunction
instances
provided to the OptimizationProblem
by means of its OptimizationProblem.add_constraint()
method.
The OptimizationProblem can evaluate the different functions for a given vector of design parameters (see OptimizationProblem.evaluate_functions()). Note that this evaluation step relies on an automated scaling of the functions with respect to the bounds, so that optimizers and DOE algorithms work with inputs scaled between 0 and 1 for all the variables.
The OptimizationProblem also has a Database that stores the calls to all the functions, so that no function is called twice with the same inputs.
Concerning the computation of derivatives, the OptimizationProblem automates the generation of finite-difference or complex-step wrappers on functions when the analytical gradient is not available.
Lastly,
various getters and setters are available,
as well as methods to export the Database
to an HDF file or to a Dataset
for future post-processing.
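The bound-based scaling mentioned above can be illustrated with a minimal plain-Python sketch (illustrative only, independent of the GEMSEO API; the helper names are mine): each design variable is mapped to \([0, 1]\) using its bounds before being handed to the driver, and mapped back before the functions are evaluated.

```python
def scale(x, l_b, u_b):
    """Map a design vector to [0, 1] component-wise using the bounds."""
    return [(xi - lo) / (hi - lo) for xi, lo, hi in zip(x, l_b, u_b)]


def unscale(x_scaled, l_b, u_b):
    """Map a [0, 1]-scaled vector back to the original design space."""
    return [lo + si * (hi - lo) for si, lo, hi in zip(x_scaled, l_b, u_b)]


l_b, u_b = [-2.0, 0.0], [2.0, 10.0]
x = [0.0, 5.0]
x_scaled = scale(x, l_b, u_b)          # midpoint of each interval -> [0.5, 0.5]
x_back = unscale(x_scaled, l_b, u_b)   # round trip recovers [0.0, 5.0]
```

The drivers thus only ever see variables in the unit hypercube, which keeps step sizes and tolerances comparable across variables of very different magnitudes.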
- class OptimizationProblem(design_space, is_linear=True, database=None, differentiation_method=DifferentiationMethod.USER_GRAD, differentiation_step=1e-07, parallel_differentiation=False, use_standardized_objective=True, **parallel_differentiation_options)[source]#
Bases:
EvaluationProblem
An optimization problem.
- Parameters:
pb_type -- The type of the optimization problem.
use_standardized_objective (bool) --
Whether to use standardized objective for logging and post-processing.
By default it is set to True.
design_space (DesignSpace)
is_linear (bool) --
By default it is set to True.
database (Database | None)
differentiation_method (DifferentiationMethod) --
By default it is set to "user".
differentiation_step (float) --
By default it is set to 1e-07.
parallel_differentiation (bool) --
By default it is set to False.
- AggregationFunction#
alias of
EvaluationFunction
- class DifferentiationMethod(value)#
Bases:
StrEnum
The differentiation methods.
- CENTERED_DIFFERENCES = 'centered_differences'#
- COMPLEX_STEP = 'complex_step'#
- FINITE_DIFFERENCES = 'finite_differences'#
- NO_DERIVATIVE = 'no_derivative'#
- USER_GRAD = 'user'#
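The approximation methods listed above can be sketched in plain Python for a scalar function (the step value 1e-07 mirrors the differentiation_step default; these helpers are illustrative, not the GEMSEO wrappers):

```python
def finite_differences(f, x, step=1e-7):
    """First-order forward finite-difference approximation of f'(x)."""
    return (f(x + step) - f(x)) / step


def centered_differences(f, x, step=1e-7):
    """Second-order centered finite-difference approximation of f'(x)."""
    return (f(x + step) - f(x - step)) / (2 * step)


def complex_step(f, x, step=1e-30):
    """Complex-step derivative: no subtractive cancellation, near machine precision."""
    return f(x + 1j * step).imag / step


f = lambda x: x**3  # derivative at x=2 is 12
print(finite_differences(f, 2.0))  # ~12, accurate to ~1e-6
print(complex_step(f, 2.0))        # ~12, accurate to machine precision
```

The complex-step method requires the function to accept complex inputs, which is why it is a separate enum value from the finite-difference variants.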
- class HistoryFileFormat(value)[source]#
Bases:
StrEnum
The format of the history file.
- GGOBI = 'ggobi'#
- HDF5 = 'hdf5'#
- add_constraint(function, value=0.0, constraint_type=None, positive=False)[source]#
Add an equality or inequality constraint to the optimization problem.
An equality constraint is written as \(c(x)=a\), a positive inequality constraint is written as \(c(x)\geq a\) and a negative inequality constraint is written as \(c(x)\leq a\).
- Parameters:
function (MDOFunction) -- The function \(c\).
value (float) --
The value \(a\).
By default it is set to 0.0.
constraint_type (MDOFunction.ConstraintType | None) -- The type of the constraint.
positive (bool) --
Whether the inequality constraint is positive.
By default it is set to False.
- Raises:
TypeError -- When the constraint of a linear optimization problem is not an MDOLinearFunction.
ValueError -- When the type of the constraint is missing.
- Return type:
None
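The value/positive semantics documented above can be sketched in plain Python: every constraint is reduced to a standard form \(g(x)\leq 0\) or \(g(x)=0\) (illustrative helper, not part of the GEMSEO API):

```python
def standardize_constraint(c, value=0.0, constraint_type="ineq", positive=False):
    """Return a function g equivalent to the constraint, in standard form.

    Mirrors the documented semantics: equality c(x) = a, positive
    inequality c(x) >= a, negative inequality c(x) <= a.
    """
    if constraint_type == "eq":
        return lambda x: c(x) - value  # c(x) - a == 0
    if positive:
        return lambda x: value - c(x)  # a - c(x) <= 0  <=>  c(x) >= a
    return lambda x: c(x) - value      # c(x) - a <= 0  <=>  c(x) <= a


c = lambda x: x * x
g = standardize_constraint(c, value=4.0, positive=True)  # constraint x**2 >= 4
print(g(3.0) <= 0.0)  # satisfied: 3**2 = 9 >= 4
```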
- apply_exterior_penalty(objective_scale=1.0, scale_inequality=1.0, scale_equality=1.0)[source]#
Reformulate the optimization problem using exterior penalty.
Given the optimization problem with equality and inequality constraints:
\[ \begin{align}\begin{aligned}\min_x f(x)\\s.t.\\g(x)\leq 0\\h(x)=0\\l_b\leq x\leq u_b\end{aligned}\end{align} \]
The exterior penalty approach consists in building a penalized objective function that takes constraint violations into account:
\[ \begin{align}\begin{aligned}\min_x \tilde{f}(x) = \frac{f(x)}{o_s} + s\left[\sum{H(g(x))g(x)^2}+\sum{h(x)^2}\right]\\s.t.\\l_b\leq x\leq u_b\end{aligned}\end{align} \]
where \(H(x)\) is the Heaviside function, \(o_s\) is the objective_scale parameter and \(s\) is the scale parameter. The solution of the new problem approximates that of the original problem. Increasing the values of objective_scale and scale brings the solutions closer, but the optimization problem requires more and more iterations to be solved.
- Parameters:
scale_equality (float | RealArray) --
The equality constraint scaling constant.
By default it is set to 1.0.
objective_scale (float) --
The objective scaling constant.
By default it is set to 1.0.
scale_inequality (float | RealArray) --
The inequality constraint scaling constant.
By default it is set to 1.0.
- Return type:
None
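The penalized objective \(\tilde{f}\) defined above can be sketched in plain Python (illustrative only; the actual method rewrites the problem's MDOFunctions in place):

```python
def heaviside(z):
    """H(z) = 1 if z > 0 else 0: only violated inequalities are penalized."""
    return 1.0 if z > 0 else 0.0


def penalized_objective(f, ineqs, eqs, objective_scale=1.0, scale=1.0):
    """Build f(x)/o_s + s * [sum H(g(x)) g(x)^2 + sum h(x)^2]."""
    def f_tilde(x):
        penalty = sum(heaviside(g(x)) * g(x) ** 2 for g in ineqs)
        penalty += sum(h(x) ** 2 for h in eqs)
        return f(x) / objective_scale + scale * penalty
    return f_tilde


# min x^2 subject to x >= 1, written as g(x) = 1 - x <= 0.
f_tilde = penalized_objective(lambda x: x * x, [lambda x: 1.0 - x], [], scale=100.0)
print(f_tilde(1.0))  # feasible point: no penalty, value 1.0
print(f_tilde(0.5))  # infeasible point: 0.25 + 100 * 0.5**2 = 25.25
```

The bounds \(l_b\leq x\leq u_b\) remain explicit constraints of the reformulated problem; only the general equality and inequality constraints are folded into the objective.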
- check()[source]#
Check if the optimization problem is ready to be run.
- Raises:
ValueError -- If the objective function is missing.
- Return type:
None
- classmethod from_hdf(file_path, x_tolerance=0.0, hdf_node_path='')[source]#
Import an optimization history from an HDF file.
- Parameters:
file_path (str | Path) -- The file containing the optimization history.
x_tolerance (float) --
The tolerance on the design variables when reading the file.
By default it is set to 0.0.
hdf_node_path (str) --
The path of the HDF node from which the database should be imported. If empty, the root node is considered.
By default it is set to "".
- Returns:
The read optimization problem.
- Return type:
- get_function_dimension(name)[source]#
Return the dimension of a function of the problem (e.g. a constraint).
- Parameters:
name (str) -- The name of the function.
- Returns:
The dimension of the function.
- Raises:
ValueError -- If the function name is unknown to the problem.
RuntimeError -- If the function dimension is not available.
- Return type:
- get_functions(no_db_no_norm=False, observable_names=(), jacobian_names=None, evaluate_objective=True, constraint_names=())[source]#
- Parameters:
evaluate_objective (bool) --
Whether to evaluate the objective.
By default it is set to True.
constraint_names (Iterable[str] | None) --
The names of the constraints to evaluate. If empty, then all the constraints are returned. If None, then no constraint is returned.
By default it is set to ().
no_db_no_norm (bool) --
By default it is set to False.
observable_names (Iterable[str] | None) --
By default it is set to ().
jacobian_names (Iterable[str] | None)
- Return type:
- get_functions_dimensions(names=None)[source]#
Return the dimensions of the outputs of the problem functions.
- get_reformulated_problem_with_slack_variables()[source]#
Add slack variables and replace inequality constraints with equality ones.
Given the original optimization problem,
\[ \begin{align}\begin{aligned}\min_x f(x)\\s.t.\\g(x)\leq 0\\h(x)=0\\l_b\leq x\leq u_b\end{aligned}\end{align} \]
Slack variables are introduced for all inequality constraints that are non-positive. An equality constraint is then defined for each slack variable.
\[ \begin{align}\begin{aligned}\min_{x,s} F(x,s) = f(x)\\s.t.\\H(x,s) = h(x)=0\\G(x,s) = g(x)-s=0\\l_b\leq x\leq u_b\\s\leq 0\end{aligned}\end{align} \]
- Returns:
An optimization problem without inequality constraints.
- Return type:
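The reformulation above can be sketched on raw constraint values rather than MDOFunctions (illustrative helper, not the GEMSEO implementation): at a feasible point, choosing each slack equal to the inequality value makes the new equality constraints hold with \(s\leq 0\).

```python
def with_slack(g_values):
    """Pick slacks s = min(g(x), 0) so that G(x, s) = g(x) - s = 0
    whenever g(x) <= 0 (feasible point), with s <= 0."""
    s = [min(g, 0.0) for g in g_values]
    G = [g - si for g, si in zip(g_values, s)]
    return s, G


# Feasible point: g(x) = [-2.0, -0.5] -> slacks equal g, equalities hold.
s, G = with_slack([-2.0, -0.5])
print(s)  # [-2.0, -0.5]
print(G)  # [0.0, 0.0]
```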
- reset(database=True, current_iter=True, design_space=True, function_calls=True, preprocessing=True)[source]#
Partially or fully reset the problem.
- Parameters:
database (bool) --
Whether to clear the database.
By default it is set to True.
current_iter (bool) --
Whether to reset the counter of evaluations to the initial iteration.
By default it is set to True.
design_space (bool) --
Whether to reset the current value of the design space, which can be None.
By default it is set to True.
function_calls (bool) --
Whether to reset the number of calls of the functions.
By default it is set to True.
preprocessing (bool) --
Whether to turn the pre-processing of functions to False.
By default it is set to True.
- Return type:
None
- to_dataset(name: str = '', categorize: Literal[True] = True, export_gradients: bool = False, input_values: Iterable[RealArray] = (), opt_naming: Literal[False] = False) IODataset [source]#
- to_dataset(name: str = '', categorize: Literal[True] = True, export_gradients: bool = False, input_values: Iterable[RealArray] = (), opt_naming: Literal[True] = True) OptimizationDataset
- Parameters:
categorize -- Whether to distinguish between the different groups of variables.
opt_naming -- Whether to put the design variables in the OptimizationDataset.DESIGN_GROUP and the functions and their derivatives in the OptimizationDataset.FUNCTION_GROUP. Otherwise, put the design variables in the IODataset.INPUT_GROUP and the functions and their derivatives in the IODataset.OUTPUT_GROUP.
- to_hdf(file_path, append=False, hdf_node_path='')[source]#
Export the optimization problem to an HDF file.
- Parameters:
file_path (str | Path) -- The HDF file path.
append (bool) --
Whether to append the data to the file if not empty. Otherwise, overwrite data.
By default it is set to False.
hdf_node_path (str) --
The path of the HDF node in which the optimization problem should be exported. If empty, the root node is considered.
By default it is set to "".
- Return type:
None
- property constraints: Constraints#
The constraints.
- property functions: list[MDOFunction]#
All the functions except new_iter_observables.
- property is_mono_objective: bool#
Whether the optimization problem is mono-objective.
- Raises:
ValueError -- When the dimension of the objective cannot be determined.
- property objective: MDOFunction#
The objective function.
- property optimum: Solution#
The optimum solution within a given feasibility tolerance.
This solution is defined by:
- the value of the objective function,
- the value of the design variables,
- the indicator of feasibility of the optimal solution,
- the value of the constraints,
- the value of the gradients of the constraints.
- property original_functions: list[MDOFunction]#
All the original functions except those of new_iter_observables.
- property scalar_constraint_names: list[str]#
The names of the scalar constraints.
A scalar constraint is a constraint whose output is of dimension 1.
- solution: OptimizationResult | None#
The solution of the optimization problem if solved; otherwise None.
- property standardized_objective_name: str#
The name of the standardized objective.
Given an objective named "f", the name of the standardized objective is "f" in the case of minimization and "-f" in the case of maximization.
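The naming rule can be sketched in plain Python (illustrative helper, not the GEMSEO implementation; the minimize_objective flag is an assumed stand-in for the problem's minimization setting):

```python
def standardized_objective_name(name, minimize_objective=True):
    """Return the standardized objective name: the original name when
    minimizing, its negation when maximizing."""
    return name if minimize_objective else f"-{name}"


print(standardized_objective_name("f"))                            # "f"
print(standardized_objective_name("f", minimize_objective=False))  # "-f"
```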
- property tolerances: ConstraintTolerances#
The constraint tolerances.
- use_standardized_objective: bool#
Whether to use standardized objective for logging and post-processing.
The standardized objective corresponds to the original one expressed as a cost function to minimize. A BaseDriverLibrary works with this standardized objective and the Database stores its values. However, for convenience, it may be more relevant to log the expression and the values of the original objective.