gemseo.algos.opt.mnbi module

Modified Normal Boundary Intersection (mNBI) algorithm.

Based on [Shu07].

class gemseo.algos.opt.mnbi.BetaSubOptimOutput(f_min, x_min, w, database, n_calls)[source]

Bases: NamedTuple

An output from a beta sub-optimization.

Create a new instance of BetaSubOptimOutput(f_min, x_min, w, database, n_calls)

Parameters:
  • f_min (RealArray) –

  • x_min (RealArray) –

  • w (RealArray) –

  • database (Database) –

  • n_calls (int) –

database: Database

The database of the main problem.

f_min: RealArray

The coordinates in the objective space of the sub-optimization result.

n_calls: int

The number of calls to the main objective function by the optimizer during the sub-optimization.

w: RealArray

The vector w used to compute the values of beta that can be skipped in the following sub-optimizations.

x_min: RealArray

The coordinates in the design space of the sub-optimization result.
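As a NamedTuple, this output behaves like a regular tuple whose fields can also be read by name. The sketch below builds an instance with hypothetical values; the Database import path and its no-argument constructor are assumptions, not taken from this page:

    from numpy import array

    from gemseo.algos.database import Database
    from gemseo.algos.opt.mnbi import BetaSubOptimOutput

    # Hypothetical result of one beta sub-optimization of a bi-objective problem.
    output = BetaSubOptimOutput(
        f_min=array([0.2, 0.8]),  # objective-space coordinates of the sub-optimum
        x_min=array([1.3]),       # design-space coordinates of the sub-optimum
        w=array([0.25, 0.85]),    # vector w used to skip redundant values of beta
        database=Database(),      # database of the main problem (empty placeholder)
        n_calls=42,               # number of calls to the main objective function
    )

    # Fields are accessible by name or by position, and the tuple can be unpacked.
    assert output.f_min is output[0]
    f_min, x_min, w, database, n_calls = output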

class gemseo.algos.opt.mnbi.IndividualSubOptimOutput(f_min, x_min, database, n_calls)[source]

Bases: NamedTuple

An output from a sub-optimization.

Create a new instance of IndividualSubOptimOutput(f_min, x_min, database, n_calls)

Parameters:
  • f_min (RealArray) –

  • x_min (RealArray) –

  • database (Database) –

  • n_calls (int) –

database: Database

The database of the main problem.

f_min: RealArray

The value of f at the design point minimizing f_i.

n_calls: int

The number of calls to f.

x_min: RealArray

The value of the design variables minimizing f_i.

class gemseo.algos.opt.mnbi.MNBI[source]

Bases: OptimizationLibrary

MNBI optimizer.

This algorithm computes the Pareto front of a multi-objective optimization problem by decomposing it into a series of constrained single-objective problems. Considering the following problem:

\[\begin{split}\begin{align} & \min_{x \in D} f(x),\\ & g(x) \leq 0,\\ & h(x) = 0 \end{align}\end{split}\]

the algorithm first finds the individual optima \((x_i^\ast)_{i=1..m}\) of the \(m\) components of the objective function \(f\). The corresponding anchor points \((f(x_i^\ast))_{i=1..m}\) are stored in a matrix \(\Phi\).

The simplex formed by the convex hull of the anchor points can be expressed as \(\Phi \beta\), where \(\beta\) ranges over \(\{ (b_1, \dots, b_m)^T \mid b_i \geq 0, \ \sum_{i=1}^m b_i = 1 \}\).
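For instance, with \(m = 2\) objectives, \(\Phi\) is the \(2 \times 2\) matrix whose columns are the anchor points, and each admissible \(\beta\) yields a point \(\Phi \beta\) on the segment joining them. A minimal numpy sketch with made-up anchor points:

    from numpy import array

    # Hypothetical anchor points f(x_1*) = (0, 4) and f(x_2*) = (3, 0),
    # stored as the columns of Phi.
    phi = array([[0.0, 3.0], [4.0, 0.0]])

    # An admissible beta: non-negative components summing to one.
    beta = array([0.25, 0.75])

    # Phi @ beta is a point of the simplex (here, a segment) spanned by the anchor points.
    print(phi @ beta)  # [2.25 1.  ]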

Given a list of vectors \(\beta\), mNBI will solve the following single-objective problems:

\[\begin{split}\begin{align} & \max_{x \in D, t \in \mathbb{R}} t,\\ & \Phi \beta + t \hat{n} \geq f(x),\\ & g(x) \leq 0,\\ & h(x) = 0 \end{align}\end{split}\]

where \(\hat{n}\) is a quasi-normal vector to the \(\Phi \beta\) simplex pointing towards the origin of the objective space. If \((x^{*}, t^{*})\) is a solution of this problem, \(x^{*}\) is proven to be at least weakly Pareto-optimal.

Let \(w = \Phi \beta + t^{*} \hat{n}\), and let \(\pi\) denote the projection (in the direction of \(\hat{n}\)) onto the simplex formed by the convex hull of the anchor points. If not all constraints \(\Phi \beta + t^{*} \hat{n} \geq f(x^{*})\) are active, \(x^{*}\) will weakly dominate the solution of the sub-problem for every value \(\beta_{dom}\) that satisfies:

\[\Phi \beta_{dom} \in \pi[(f(x^{*}) + \mathbb{R}^m_+) \cap (w - \mathbb{R}^m_+)]\]

Therefore, the corresponding sub-optimizations are redundant and can be skipped to reduce run time.
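For example, with \(m = 2\), if only the second constraint is active at the solution, i.e. \(f_1(x^{*}) < w_1\) and \(f_2(x^{*}) = w_2\), then

\[(f(x^{*}) + \mathbb{R}^2_+) \cap (w - \mathbb{R}^2_+) = [f_1(x^{*}), w_1] \times \{w_2\},\]

so every \(\beta_{dom}\) whose image \(\Phi \beta_{dom}\) lies in the projection \(\pi\) of this segment onto the simplex leads to a sub-problem whose solution is weakly dominated by \(x^{*}\), and the corresponding sub-optimization can be skipped.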

Notes

The missing current values of the DesignSpace attached to the OptimizationProblem are automatically initialized with the method DesignSpace.initialize_missing_current_values().
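A minimal usage sketch is given below. It assumes GEMSEO's high-level execute_algo entry point and the option names sub_optim_algo and n_sub_optim for selecting the single-objective sub-solver and the number of sub-optimizations; these names and the bi-objective toy problem are assumptions made for illustration, so check the algorithm's option grammar for the exact option names:

    from numpy import array

    from gemseo import execute_algo
    from gemseo.algos.design_space import DesignSpace
    from gemseo.algos.opt_problem import OptimizationProblem
    from gemseo.core.mdofunctions.mdo_function import MDOFunction

    # Design space: a single variable x in [-1, 3] with an initial value.
    design_space = DesignSpace()
    design_space.add_variable("x", l_b=-1.0, u_b=3.0, value=0.5)

    # Bi-objective function f(x) = (x^2, (x - 2)^2); its Pareto set is x in [0, 2].
    problem = OptimizationProblem(design_space)
    problem.objective = MDOFunction(
        lambda x: array([x[0] ** 2, (x[0] - 2.0) ** 2]),
        "f",
        jac=lambda x: array([[2.0 * x[0]], [2.0 * (x[0] - 2.0)]]),
    )

    # mNBI decomposes the problem into constrained single-objective sub-problems,
    # each solved here with SLSQP (option names assumed, see above).
    result = execute_algo(
        problem,
        "MNBI",
        max_iter=1000,
        sub_optim_algo="SLSQP",
        n_sub_optim=10,
    )

The evaluations performed during the sub-optimizations are stored in problem.database, from which an approximation of the Pareto front can be extracted.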

algo_name: str | None

The name of the algorithm currently in use.

descriptions: dict[str, AlgorithmDescription]

The description of the algorithms contained in the library.

internal_algo_name: str | None

The internal name of the algorithm currently in use.

It typically corresponds to the name of the algorithm in the wrapped library, if any.

opt_grammar: JSONGrammar | None

The grammar defining the options of the current algorithm.

problem: OptimizationProblem

The optimization problem the driver library is bound to.

class gemseo.algos.opt.mnbi.MNBIAlgorithmDescription(algorithm_name, internal_algorithm_name, library_name='MNBI', description='', website='', handle_integer_variables=True, require_gradient=False, handle_equality_constraints=True, handle_inequality_constraints=True, handle_multiobjective=True, positive_constraints=False, problem_type=ProblemType.NON_LINEAR)[source]

Bases: OptimizationAlgorithmDescription

The description of the MNBI optimization algorithm.

Parameters:
  • algorithm_name (str) –

  • internal_algorithm_name (str) –

  • library_name (str) –

    By default it is set to “MNBI”.

  • description (str) –

    By default it is set to “”.

  • website (str) –

    By default it is set to “”.

  • handle_integer_variables (bool) –

    By default it is set to True.

  • require_gradient (bool) –

    By default it is set to False.

  • handle_equality_constraints (bool) –

    By default it is set to True.

  • handle_inequality_constraints (bool) –

    By default it is set to True.

  • handle_multiobjective (bool) –

    By default it is set to True.

  • positive_constraints (bool) –

    By default it is set to False.

  • problem_type (ProblemType) –

    By default it is set to “non-linear”.

algorithm_name: str

The name of the algorithm in GEMSEO.

handle_equality_constraints: bool = True

Whether the optimization algorithm handles equality constraints.

handle_inequality_constraints: bool = True

Whether the optimization algorithm handles inequality constraints.

handle_integer_variables: bool = True

Whether the optimization algorithm handles integer variables.

handle_multiobjective: bool = True

Whether the optimization algorithm handles multiple objectives.

internal_algorithm_name: str

The name of the algorithm in the wrapped library.

library_name: str = 'MNBI'

The name of the wrapped library.