# chankong_haimes module

Chankong and Haimes multi-objective problem.

This module implements the Chankong and Haimes multi-objective problem:

\begin{split}\begin{aligned}
\text{minimize the objective functions } & f_1(x, y) = 2 + (x - 2)^2 + (y - 1)^2 \\
& f_2(x, y) = 9x - (y - 1)^2 \\
\text{with respect to the design variables } & x,\,y \\
\text{subject to the general constraints } & g_1(x, y) = x^2 + y^2 \leq 225.0 \\
& g_2(x, y) = x - 3y + 10 \leq 0.0 \\
\text{subject to the bound constraints } & -20.0 \leq x \leq 20.0 \\
& -20.0 \leq y \leq 20.0
\end{aligned}\end{split}

Chankong, V., & Haimes, Y. Y. (2008). Multiobjective decision making: theory and methodology. Courier Dover Publications.
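The formulas above can be sketched in plain NumPy. The function names below are illustrative, not the class's actual methods, and expressing the constraints in the shifted `<= 0` form (e.g. `x**2 + y**2 - 225.0`) is an assumption about the implementation's sign convention:

```python
import numpy as np


def objectives(x, y):
    """Return (f1, f2) of the Chankong and Haimes problem."""
    f1 = 2.0 + (x - 2.0) ** 2 + (y - 1.0) ** 2
    f2 = 9.0 * x - (y - 1.0) ** 2
    return np.array([f1, f2])


def constraints(x, y):
    """Return (g1, g2), assumed shifted so that feasibility means g <= 0."""
    g1 = x**2 + y**2 - 225.0  # from x^2 + y^2 <= 225
    g2 = x - 3.0 * y + 10.0   # from x - 3y + 10 <= 0
    return np.array([g1, g2])


# At (x, y) = (1, 2): f = (4, 8); g1 is satisfied, g2 is violated.
f = objectives(1.0, 2.0)
g = constraints(1.0, 2.0)
print(f, g)
```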

class gemseo_pymoo.problems.analytical.chankong_haimes.ChankongHaimes(l_b=-20.0, u_b=20.0, initial_guess=None)[source]

Chankong and Haimes optimization problem.

Initialize the ChankongHaimes OptimizationProblem by defining the DesignSpace and the objective and constraint functions.

Parameters:
• l_b (float) –

The lower bound (common value to all variables).

By default it is set to -20.0.

• u_b (float) –

The upper bound (common value to all variables).

By default it is set to 20.0.

• initial_guess (ndarray | None) – The initial guess for the optimal solution.

static compute_constraint_1(design_variables)[source]

Compute the first constraint function.

Parameters:

design_variables (ndarray) – The design variables vector.

Returns:

The first constraint’s value.

Return type:

ndarray

static compute_constraint_1_jacobian(design_variables)[source]

Compute the Jacobian of the first inequality constraint.

Parameters:

design_variables (ndarray) – The design variables vector.

Returns:

The gradient of the first constraint function wrt the design variables.

Return type:

ndarray
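For the first constraint, assuming the shifted form \(g_1(x, y) = x^2 + y^2 - 225\), the gradient is \([2x,\, 2y]\). A minimal NumPy sketch (illustrative names, not the class's static methods) with a finite-difference cross-check:

```python
import numpy as np


def constraint_1(z):
    # g1(x, y) = x^2 + y^2 - 225 (<= 0 form; sign convention assumed)
    x, y = z
    return x**2 + y**2 - 225.0


def constraint_1_jacobian(z):
    # Analytic gradient: dg1/dx = 2x, dg1/dy = 2y
    x, y = z
    return np.array([2.0 * x, 2.0 * y])


z = np.array([3.0, -4.0])
grad = constraint_1_jacobian(z)

# Forward finite-difference cross-check of the analytic gradient
eps = 1e-6
fd = np.array([
    (constraint_1(z + eps * np.eye(2)[i]) - constraint_1(z)) / eps
    for i in range(2)
])
print(grad, fd)
```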

static compute_constraint_2(design_variables)[source]

Compute the second constraint function.

Parameters:

design_variables (ndarray) – The design variables vector.

Returns:

The second constraint’s value.

Return type:

ndarray

static compute_constraint_2_jacobian(design_variables)[source]

Compute the Jacobian of the second inequality constraint.

Parameters:

design_variables (ndarray) – The design variables vector.

Returns:

The gradient of the second constraint function wrt the design variables.

Return type:

ndarray
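The second constraint, \(g_2(x, y) = x - 3y + 10\), is linear, so its gradient is the constant vector \([1,\, -3]\). A sketch under the same assumed `<= 0` convention (illustrative names):

```python
import numpy as np


def constraint_2(z):
    # g2(x, y) = x - 3y + 10 (<= 0 form; sign convention assumed)
    x, y = z
    return x - 3.0 * y + 10.0


def constraint_2_jacobian(z):
    # Linear constraint: constant gradient [1, -3], independent of z
    return np.array([1.0, -3.0])


z = np.array([2.0, 5.0])
grad = constraint_2_jacobian(z)

# Finite-difference cross-check
eps = 1e-6
fd = np.array([
    (constraint_2(z + eps * np.eye(2)[i]) - constraint_2(z)) / eps
    for i in range(2)
])
print(grad, fd)
```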

static compute_objective(design_variables)[source]

Compute the objectives of the Chankong and Haimes problem.

Parameters:

design_variables (ndarray) – The design variables vector.

Returns:

The values of the two objective functions.

Return type:

ndarray

static compute_objective_jacobian(design_variables)[source]

Compute the Jacobian of the objective functions.

Parameters:

design_variables (ndarray) – The design variables vector.

Returns:

The gradient of the objective functions wrt the design variables.

Return type:

ndarray
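Since the problem has two objectives, the Jacobian is a 2×2 matrix whose rows are the gradients of \(f_1\) and \(f_2\): \(\nabla f_1 = [2(x-2),\, 2(y-1)]\) and \(\nabla f_2 = [9,\, -2(y-1)]\). A NumPy sketch (illustrative names, not the class's static methods) with a finite-difference cross-check:

```python
import numpy as np


def objective(z):
    # (f1, f2) of the Chankong and Haimes problem
    x, y = z
    return np.array([
        2.0 + (x - 2.0) ** 2 + (y - 1.0) ** 2,
        9.0 * x - (y - 1.0) ** 2,
    ])


def objective_jacobian(z):
    # Analytic 2x2 Jacobian: rows are the gradients of f1 and f2
    x, y = z
    return np.array([
        [2.0 * (x - 2.0), 2.0 * (y - 1.0)],
        [9.0, -2.0 * (y - 1.0)],
    ])


z = np.array([1.0, 3.0])
jac = objective_jacobian(z)

# Forward finite-difference cross-check, column by column
eps = 1e-6
fd = np.column_stack([
    (objective(z + eps * np.eye(2)[i]) - objective(z)) / eps
    for i in range(2)
])
print(jac)
```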

constraints: list[MDOFunction]

The constraints.

current_iter: int

The current iteration.

database: Database

The database to store the optimization problem data.

design_space: DesignSpace

The design space on which the optimization problem is solved.

eq_tolerance: float

The tolerance for the equality constraints.

fd_step: float

The finite differences step.

ineq_tolerance: float

The tolerance for the inequality constraints.

max_iter: int

The maximum number of iterations.

minimize_objective: bool

Whether to minimize the objective.

new_iter_observables: list[MDOFunction]

The observables to be called at each new iterate.

nonproc_constraints: list[MDOFunction]

The non-processed constraints.

nonproc_new_iter_observables: list[MDOFunction]

The non-processed observables to be called at each new iterate.

nonproc_objective: MDOFunction

The non-processed objective function.

nonproc_observables: list[MDOFunction]

The non-processed observables.

observables: list[MDOFunction]

The observables.

pb_type: str

The type of optimization problem.

preprocess_options: dict

The options to pre-process the functions.

solution: OptimizationResult

The solution of the optimization problem.

stop_if_nan: bool

Whether the optimization stops when a function returns NaN.

use_standardized_objective: bool

Whether to use standardized objective for logging and post-processing.

The standardized objective corresponds to the original one expressed as a cost function to minimize. A DriverLibrary works with this standardized objective and the Database stores its values. However, for convenience, it may be more relevant to log the expression and the values of the original objective.