
viennet module

Viennet multi-objective problem.

This module implements the Viennet multi-objective unconstrained problem:

\[\begin{split}\begin{aligned} \text{minimize the objective functions } & f_1(x, y) = (x^2 + y^2) / 2 + \sin(x^2 + y^2) \\ & f_2(x, y) = (3x - 2y + 4)^2 / 8 + (x - y + 1)^2 / 27 + 15 \\ & f_3(x, y) = 1 / (x^2 + y^2 + 1) - 1.1\,e^{-(x^2 + y^2)} \\ \text{with respect to the design variables } & x,\,y \\ \text{subject to the bound constraints } & -3.0 \leq x \leq 3.0 \\ & -3.0 \leq y \leq 3.0 \end{aligned}\end{split}\]
class gemseo_pymoo.problems.analytical.viennet.Viennet(l_b=-3.0, u_b=3.0, initial_guess=None)[source]

Bases: OptimizationProblem

Viennet optimization problem.


Initialize the Viennet OptimizationProblem by defining the DesignSpace and the objective function.

Parameters:
  • l_b (float) –

    The lower bound (common value to all variables).

    By default it is set to -3.0.

  • u_b (float) –

    The upper bound (common value to all variables).

    By default it is set to 3.0.

  • initial_guess (ndarray | None) – The initial guess for the optimal solution. If None, the initial guess will be (0., 0.).

static compute_objective(design_variables)[source]

Compute the objectives of the Viennet function.

Parameters:

design_variables (ndarray) – The design variables vector.

Returns:

The objective function values.

Return type:

ndarray
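Since the three objectives are simple closed-form expressions, they can be sketched directly in NumPy. The `viennet_objectives` helper below is a hypothetical, self-contained stand-in for `compute_objective`, written from the formulas above for illustration only (it does not use gemseo_pymoo):

```python
import numpy as np


def viennet_objectives(design_variables):
    """Illustrative stand-in for Viennet.compute_objective.

    Evaluates the three Viennet objectives at a design point (x, y),
    following the closed-form definitions given above.
    """
    x, y = design_variables
    r = x**2 + y**2
    f1 = r / 2.0 + np.sin(r)
    f2 = (3.0 * x - 2.0 * y + 4.0) ** 2 / 8.0 + (x - y + 1.0) ** 2 / 27.0 + 15.0
    f3 = 1.0 / (r + 1.0) - 1.1 * np.exp(-r)
    return np.array([f1, f2, f3])


# At the default initial guess (0., 0.):
print(viennet_objectives(np.array([0.0, 0.0])))
```

At the default initial guess `(0., 0.)`, this gives `f1 = 0`, `f2 = 2 + 1/27 + 15 ≈ 17.037`, and `f3 = 1 - 1.1 = -0.1`.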

static compute_objective_jacobian(design_variables)[source]

Compute the Jacobian of the objective functions.

Parameters:

design_variables (ndarray) – The design variables vector.

Returns:

The gradient of the objective functions with respect to the design variables.

Return type:

ndarray
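The Jacobian stacks the gradients of the three objectives into a 3-by-2 array (rows: `f1`, `f2`, `f3`; columns: derivatives with respect to `x` and `y`). The `viennet_jacobian` helper below is a hypothetical sketch of the analytical derivatives, not gemseo_pymoo's implementation, derived by differentiating the formulas above:

```python
import numpy as np


def viennet_jacobian(design_variables):
    """Illustrative stand-in for Viennet.compute_objective_jacobian.

    Returns a 3x2 array: rows are the gradients of f1, f2, f3;
    columns are the derivatives with respect to x and y.
    """
    x, y = design_variables
    r = x**2 + y**2
    # f1 = r/2 + sin(r): d f1 = (1/2 + cos(r)) * dr, with dr = (2x, 2y).
    g1 = np.array([x * (1.0 + 2.0 * np.cos(r)), y * (1.0 + 2.0 * np.cos(r))])
    # f2 is a sum of two squared linear terms plus a constant.
    g2 = np.array([
        3.0 * (3.0 * x - 2.0 * y + 4.0) / 4.0 + 2.0 * (x - y + 1.0) / 27.0,
        -(3.0 * x - 2.0 * y + 4.0) / 2.0 - 2.0 * (x - y + 1.0) / 27.0,
    ])
    # f3 = 1/(r+1) - 1.1 exp(-r): d f3 = (-1/(r+1)^2 + 1.1 exp(-r)) * dr.
    g3 = np.array([
        -2.0 * x / (r + 1.0) ** 2 + 2.2 * x * np.exp(-r),
        -2.0 * y / (r + 1.0) ** 2 + 2.2 * y * np.exp(-r),
    ])
    return np.vstack([g1, g2, g3])
```

A quick sanity check is to compare this against central finite differences of the objectives at a few points, which is also how such analytical Jacobians are typically validated.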

constraints: list[MDOFunction]

The constraints.

current_iter: int

The current iteration.

database: Database

The database to store the optimization problem data.

design_space: DesignSpace

The design space on which the optimization problem is solved.

eq_tolerance: float

The tolerance for the equality constraints.

fd_step: float

The finite differences step.

ineq_tolerance: float

The tolerance for the inequality constraints.

max_iter: int

The maximum number of iterations.

new_iter_observables: list[MDOFunction]

The observables to be called at each new iterate.

nonproc_constraints: list[MDOFunction]

The non-processed constraints.

nonproc_new_iter_observables: list[MDOFunction]

The non-processed observables to be called at each new iterate.

nonproc_objective: MDOFunction

The non-processed objective function.

nonproc_observables: list[MDOFunction]

The non-processed observables.

observables: list[MDOFunction]

The observables.

pb_type: ProblemType

The type of optimization problem.

preprocess_options: dict

The options to pre-process the functions.

solution: OptimizationResult | None

The solution of the optimization problem if solved; otherwise None.

stop_if_nan: bool

Whether the optimization stops when a function returns NaN.

use_standardized_objective: bool

Whether to use the standardized objective for logging and post-processing.

The standardized objective corresponds to the original one expressed as a cost function to minimize. A DriverLibrary works with this standardized objective and the Database stores its values. However, for convenience, it may be more relevant to log the expression and the values of the original objective.