mnbi module¶
Modified Normal Boundary Intersection (mNBI) algorithm.
Based on [Shu07].
- class gemseo.algos.opt.mnbi.BetaSubOptimOutput(f_min, x_min, w, database, n_calls)[source]¶
Bases:
NamedTuple
An output of a beta sub-optimization.
Create new instance of BetaSubOptimOutput(f_min, x_min, w, database, n_calls)
- Parameters:
- count(value, /)¶
Return number of occurrences of value.
- index(value, start=0, stop=9223372036854775807, /)¶
Return first index of value.
Raises ValueError if the value is not present.
- f_min: RealArray¶
The coordinates in the objective space of the sub-optimization result.
- n_calls: int¶
The number of calls to the main objective function by the optimizer during the sub-optimization.
- w: RealArray¶
The vector w used to compute the values of beta that can be skipped in the following sub-optimizations.
- x_min: RealArray¶
The coordinates in the design space of the sub-optimization result.
- class gemseo.algos.opt.mnbi.IndividualSubOptimOutput(f_min, x_min, database, n_calls)[source]¶
Bases:
NamedTuple
An output of a sub-optimization.
Create new instance of IndividualSubOptimOutput(f_min, x_min, database, n_calls)
- count(value, /)¶
Return number of occurrences of value.
- index(value, start=0, stop=9223372036854775807, /)¶
Return first index of value.
Raises ValueError if the value is not present.
- f_min: RealArray¶
The value of f at the design value minimizing f_i.
- x_min: RealArray¶
The value of the design variables minimizing f_i.
- class gemseo.algos.opt.mnbi.MNBI[source]¶
Bases:
OptimizationLibrary
MNBI optimizer.
This algorithm computes the Pareto front of a multi-objective optimization problem by decomposing it into a series of constrained single-objective problems. Considering the following problem:
\[\begin{split}\begin{align} & \min_{x \in D} f(x),\\ & g(x) \leq 0,\\ & h(x) = 0 \end{align}\end{split}\]the algorithm first finds the individual optima \((x_i^\ast)_{i=1..m}\) of the \(m\) components of the objective function \(f\). The corresponding anchor points \((f(x_i^\ast))_{i=1..m}\) are stored in a matrix \(\Phi\).
The simplex formed by the convex hull of the anchor points can be expressed as \(\Phi \beta\), where \(\beta = \{ (b_1, ..., b_m)^T | \sum_{i=1}^m b_i = 1, b_i \geq 0 \}\).
Given a list of vectors \(\beta\), mNBI will solve the following single-objective problems:
\[\begin{split}\begin{align} & \max_{x \in D, t \in \mathbb{R}} t,\\ & \Phi \beta + t \hat{n} \geq f(x),\\ & g(x) \leq 0,\\ & h(x) = 0 \end{align}\end{split}\]where \(\hat{n}\) is a quasi-normal vector to the \(\Phi \beta\) simplex pointing towards the origin of the objective space. If \((x^{*}, t^{*})\) is a solution of this problem, \(x^{*}\) is proven to be at least weakly Pareto-optimal.
Let \(w = \Phi \beta + t^{*} \hat{n}\), and \(\pi\) denote the projection (in the direction of \(\hat{n}\)) on the simplex formed by the convex hull of the anchor points. If not all constraints \(\Phi \beta + t^{*} \hat{n} \geq f(x^{*})\) are active, \(x^{*}\) will weakly dominate the solution of the sub-problem for all values \(\beta_{dom}\) that satisfy:
\[\Phi \beta_{dom} \in \pi[(f(x^{*}) + \mathbb{R}_m^{+}) \cap (w - \mathbb{R}_m^{+})]\]Therefore, the corresponding sub-optimizations are redundant and can be skipped to reduce run time.
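The single-objective sub-problem above can be sketched with a generic SLSQP solver. The bi-objective function, the choice of \(\beta\), and the quasi-normal \(\hat{n}\) below are illustrative assumptions, not GEMSEO's implementation:

```python
import numpy as np
from scipy.optimize import minimize

# Illustrative bi-objective problem (not from GEMSEO): two shifted paraboloids.
def f(x):
    return np.array([x[0] ** 2 + x[1] ** 2, (x[0] - 1.0) ** 2 + x[1] ** 2])

# Anchor points f(x_1*) = (0, 1) and f(x_2*) = (1, 0), stored as the columns of Phi.
phi = np.array([[0.0, 1.0], [1.0, 0.0]])
beta = np.array([0.5, 0.5])              # one point on the simplex
n_hat = -np.ones(2) / np.sqrt(2.0)       # quasi-normal pointing towards the origin

# Decision vector z = (x_1, x_2, t): maximize t s.t. Phi beta + t n_hat >= f(x).
res = minimize(
    lambda z: -z[2],  # minimizing -t maximizes t
    x0=[0.5, 0.5, 0.0],
    constraints=[
        {"type": "ineq", "fun": lambda z: phi @ beta + z[2] * n_hat - f(z[:2])}
    ],
)
x_star, t_star = res.x[:2], res.x[2]     # a Pareto point, here near (0.5, 0.0)
```

Sweeping `beta` over the simplex and repeating this solve traces out the Pareto front; the dominance test on \(\beta_{dom}\) then prunes redundant sweeps.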
Notes
The missing current values of the DesignSpace attached to the OptimizationProblem are automatically initialized with the method DesignSpace.initialize_missing_current_values().
- class ApproximationMode(value)¶
Bases:
StrEnum
The approximation derivation modes.
- CENTERED_DIFFERENCES = 'centered_differences'¶
The centered differences method used to approximate the Jacobians by perturbing each variable with a small real number.
- COMPLEX_STEP = 'complex_step'¶
The complex step method used to approximate the Jacobians by perturbing each variable with a small complex number.
- FINITE_DIFFERENCES = 'finite_differences'¶
The finite differences method used to approximate the Jacobians by perturbing each variable with a small real number.
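The three approximation modes trade truncation error against subtractive cancellation. A minimal numpy illustration of each scheme (a sketch of the general techniques, not GEMSEO's internal implementation):

```python
import numpy as np

def forward_diff(func, x, h=1e-8):
    # finite_differences: one extra evaluation, O(h) truncation error.
    return (func(x + h) - func(x)) / h

def centered_diff(func, x, h=1e-6):
    # centered_differences: two extra evaluations, O(h^2) truncation error.
    return (func(x + h) - func(x - h)) / (2.0 * h)

def complex_step(func, x, h=1e-30):
    # complex_step: no subtraction, so no cancellation; near machine precision,
    # but requires func to accept complex inputs.
    return np.imag(func(x + 1j * h)) / h

f = lambda x: np.exp(x) * np.sin(x)
x0 = 1.0
exact = np.exp(x0) * (np.sin(x0) + np.cos(x0))  # analytic derivative at x0
```

The complex-step result matches the analytic derivative far more closely than either difference scheme, which is why it is attractive when the function supports complex arithmetic.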
- class DifferentiationMethod(value)¶
Bases:
StrEnum
The differentiation methods.
- CENTERED_DIFFERENCES = 'centered_differences'¶
- COMPLEX_STEP = 'complex_step'¶
- FINITE_DIFFERENCES = 'finite_differences'¶
- USER_GRAD = 'user'¶
- algorithm_handles_eqcstr(algo_name)¶
Check if an algorithm handles equality constraints.
- algorithm_handles_ineqcstr(algo_name)¶
Check if an algorithm handles inequality constraints.
- deactivate_progress_bar()¶
Deactivate the progress bar.
- Return type:
None
- driver_has_option(option_name)¶
Check the existence of an option.
- ensure_bounds(orig_func, normalize=True)¶
Project the design vector onto the design space before execution.
- Parameters:
orig_func – The original function.
normalize (bool) –
Whether to use the normalized design space.
By default it is set to True.
- Returns:
A function calling the original function with the input data projected onto the design space.
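The bound-projection idea behind ensure_bounds can be sketched with np.clip; the real method operates on the attached DesignSpace and supports normalization, which this toy version omits:

```python
import numpy as np

def ensure_bounds(orig_func, lower, upper):
    """Wrap orig_func so its input is projected onto the bounds first."""
    def wrapped(x):
        # Component-wise projection onto [lower, upper].
        return orig_func(np.clip(x, lower, upper))
    return wrapped

# A bounded quadratic: inputs outside the unit box are clipped before evaluation.
f = ensure_bounds(lambda x: float(np.sum(x ** 2)), np.zeros(2), np.ones(2))
```

This guarantees the wrapped function is never evaluated outside the design space, even if the optimizer proposes an infeasible iterate.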
- execute(problem, algo_name=None, eval_obs_jac=False, skip_int_check=False, **options)¶
Execute the driver.
- Parameters:
problem (OptimizationProblem) – The problem to be solved.
algo_name (str | None) – The name of the algorithm. If None, use the algo_name attribute, which may have been set by the factory.
eval_obs_jac (bool) –
Whether to evaluate the Jacobian of the observables.
By default it is set to False.
skip_int_check (bool) –
Whether to skip the integer variable handling check of the selected algorithm.
By default it is set to False.
**options (DriverLibOptionType) – The options for the algorithm.
- Returns:
The optimization result.
- Raises:
ValueError – If algo_name was not either set by the factory or given as an argument.
- Return type:
- filter_adapted_algorithms(problem)¶
Filter the algorithms capable of solving the problem.
- finalize_iter_observer()¶
Finalize the iteration observer.
- Return type:
None
- get_optimum_from_database(message=None, status=None)¶
Return the optimization result from the database.
- Return type:
- get_right_sign_constraints()¶
Transform the problem constraints into their opposite sign counterpart.
This is done if the algorithm requires positive constraints.
- get_x0_and_bounds_vects(normalize_ds, as_dict=False)¶
Return the initial design variable values and their lower and upper bounds.
- init_iter_observer(max_iter, message='')¶
Initialize the iteration observer.
It will handle the stopping criterion and the logging of the progress bar.
- Parameters:
- Raises:
ValueError – If max_iter is lower than one.
- Return type:
None
- init_options_grammar(algo_name)¶
Initialize the options’ grammar.
- Parameters:
algo_name (str) – The name of the algorithm.
- Return type:
- is_algo_requires_positive_cstr(algo_name)¶
Check if an algorithm requires positive constraints.
- classmethod is_algorithm_suited(algorithm_description, problem)¶
Check if an algorithm is suited to a problem according to its description.
- Parameters:
algorithm_description (AlgorithmDescription) – The description of the algorithm.
problem (Any) – The problem to be solved.
- Returns:
Whether the algorithm is suited to the problem.
- Return type:
- new_iteration_callback(x_vect)¶
Verify the design variable and objective value stopping criteria.
- Parameters:
x_vect (ndarray) – The design variable values.
- Raises:
FtolReached – If the defined relative or absolute function tolerance is reached.
XtolReached – If the defined relative or absolute x tolerance is reached.
- Return type:
None
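The ftol/xtol stopping criteria can be sketched as combined relative/absolute tests on successive iterates; the exact formulas, defaults, and windowing used by GEMSEO may differ:

```python
import numpy as np

def ftol_reached(f_prev, f_curr, ftol_rel=1e-9, ftol_abs=1e-14):
    # Stop when the objective change is small in relative or absolute terms.
    return abs(f_curr - f_prev) <= ftol_abs + ftol_rel * abs(f_prev)

def xtol_reached(x_prev, x_curr, xtol_rel=1e-9, xtol_abs=1e-14):
    # Stop when the design-variable step is small in relative or absolute terms.
    step = np.linalg.norm(x_curr - x_prev)
    return step <= xtol_abs + xtol_rel * np.linalg.norm(x_prev)
```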
- requires_gradient(driver_name)¶
Check if a driver requires the gradient.
- EQ_TOLERANCE = 'eq_tolerance'¶
- EVAL_OBS_JAC_OPTION = 'eval_obs_jac'¶
- F_TOL_ABS = 'ftol_abs'¶
- F_TOL_REL = 'ftol_rel'¶
- INEQ_TOLERANCE = 'ineq_tolerance'¶
- LS_STEP_NB_MAX = 'max_ls_step_nb'¶
- LS_STEP_SIZE_MAX = 'max_ls_step_size'¶
- MAX_DS_SIZE_PRINT = 40¶
- MAX_FUN_EVAL = 'max_fun_eval'¶
- MAX_ITER = 'max_iter'¶
- MAX_TIME = 'max_time'¶
- NORMALIZE_DESIGN_SPACE_OPTION = 'normalize_design_space'¶
- OPTIONS_DIR: ClassVar[str | Path] = 'options'¶
The name of the directory containing the files of the grammars of the options.
- OPTIONS_MAP: ClassVar[dict[str, str]] = {}¶
The names of the options in GEMSEO mapping to those in the wrapped library.
- PG_TOL = 'pg_tol'¶
- ROUND_INTS_OPTION = 'round_ints'¶
- STOP_CRIT_NX = 'stop_crit_n_x'¶
- USE_DATABASE_OPTION = 'use_database'¶
- VERBOSE = 'verbose'¶
- X_TOL_ABS = 'xtol_abs'¶
- X_TOL_REL = 'xtol_rel'¶
- activate_progress_bar: ClassVar[bool] = True¶
Whether to activate the progress bar in the optimization log.
- descriptions: dict[str, AlgorithmDescription]¶
The description of the algorithms contained in the library.
- internal_algo_name: str | None¶
The internal name of the algorithm used currently.
It typically corresponds to the name of the algorithm in the wrapped library, if any.
- opt_grammar: JSONGrammar | None¶
The grammar defining the options of the current algorithm.
- problem: OptimizationProblem¶
The optimization problem the driver library is bound to.
- class gemseo.algos.opt.mnbi.MNBIAlgorithmDescription(algorithm_name, internal_algorithm_name, library_name='MNBI', description='', website='', handle_integer_variables=True, require_gradient=False, handle_equality_constraints=True, handle_inequality_constraints=True, handle_multiobjective=True, positive_constraints=False, problem_type=ProblemType.NON_LINEAR)[source]¶
Bases:
OptimizationAlgorithmDescription
The description of the MNBI optimization algorithm.
- Parameters:
algorithm_name (str) –
internal_algorithm_name (str) –
library_name (str) –
By default it is set to “MNBI”.
description (str) –
By default it is set to “”.
website (str) –
By default it is set to “”.
handle_integer_variables (bool) –
By default it is set to True.
require_gradient (bool) –
By default it is set to False.
handle_equality_constraints (bool) –
By default it is set to True.
handle_inequality_constraints (bool) –
By default it is set to True.
handle_multiobjective (bool) –
By default it is set to True.
positive_constraints (bool) –
By default it is set to False.
problem_type (ProblemType) –
By default it is set to “non-linear”.
- handle_equality_constraints: bool = True¶
Whether the optimization algorithm handles equality constraints.
- handle_inequality_constraints: bool = True¶
Whether the optimization algorithm handles inequality constraints.
- handle_integer_variables: bool = True¶
Whether the optimization algorithm handles integer variables.
- positive_constraints: bool = False¶
Whether the optimization algorithm requires positive constraints.
- problem_type: ProblemType = 'non-linear'¶
The type of problem (see OptimizationProblem.ProblemType).