
# power_2 module

class gemseo.problems.analytical.power_2.Power2(exception_error=False, initial_value=1.0)[source]

Power2 is a very basic quadratic analytical OptimizationProblem:

• Objective to minimize: $$x_0^2 + x_1^2 + x_2^2$$

• Inequality constraint 1: $$x_0^3 - 0.5 > 0$$

• Inequality constraint 2: $$x_1^3 - 0.5 > 0$$

• Equality constraint: $$x_2^3 - 0.9 = 0$$

• Analytical optimum: $$x^*=(0.5^{1/3}, 0.5^{1/3}, 0.9^{1/3})$$
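
The stated optimum can be checked directly: at $$x^*$$ both inequality constraints and the equality constraint are active. A minimal NumPy sketch (independent of gemseo, using only the formulas above):

```python
import numpy as np

# Analytical optimum of the Power2 problem.
x_opt = np.array([0.5 ** (1 / 3), 0.5 ** (1 / 3), 0.9 ** (1 / 3)])

# Objective: x_0^2 + x_1^2 + x_2^2.
f_opt = np.sum(x_opt ** 2)  # objective value at the optimum

# All three constraints are active (equal to zero) at the optimum.
g1 = x_opt[0] ** 3 - 0.5  # inequality constraint 1
g2 = x_opt[1] ** 3 - 0.5  # inequality constraint 2
h = x_opt[2] ** 3 - 0.9   # equality constraint
print(f_opt, g1, g2, h)
```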

Parameters:
• exception_error (bool) –

Whether to raise an error when calling the objective; useful for tests.

By default it is set to False.

• initial_value (float) –

The initial design value of the problem.

By default it is set to 1.0.

static eq_constraint(x_dv)[source]

Compute the equality constraint $$x_2^3 - 0.9 = 0$$.

Parameters:

x_dv (ndarray) – The design variable vector.

Returns:

The value of the equality constraint.

Return type:

ndarray
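
A one-line illustrative reimplementation of this constraint (the name mirrors the documented method; only the formula is taken from the description above, not gemseo's source):

```python
import numpy as np

def eq_constraint(x_dv):
    """Equality constraint h(x) = x_2^3 - 0.9, returned as a 1-element array."""
    return np.array([x_dv[2] ** 3 - 0.9])

# Essentially zero at the optimal value of x_2.
print(eq_constraint(np.array([1.0, 1.0, 0.9 ** (1 / 3)])))
```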

static eq_constraint_jac(x_dv)[source]

Compute the gradient of the equality constraint.

Parameters:

x_dv (ndarray) – The design variable vector.

Returns:

The value of the gradient of the equality constraint.

Return type:

ndarray
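
Since $$h(x) = x_2^3 - 0.9$$ depends only on $$x_2$$, its gradient is $$(0, 0, 3x_2^2)$$. A hedged sketch (illustrative code, not gemseo's), with a finite-difference check of the analytic gradient:

```python
import numpy as np

def eq_constraint_jac(x_dv):
    """Gradient of x_2^3 - 0.9: only the last component is nonzero."""
    return np.array([0.0, 0.0, 3.0 * x_dv[2] ** 2])

# Forward finite difference on the x_2 component.
x = np.array([1.0, 2.0, 3.0])
eps = 1e-6
fd = ((x[2] + eps) ** 3 - x[2] ** 3) / eps
print(eq_constraint_jac(x)[2], fd)  # both close to 27
```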

static get_solution()[source]

Return the analytical solution of the problem.

Returns:

The theoretical optimum of the problem.

Return type:

tuple[ndarray, ndarray]

static ineq_constraint1(x_dv)[source]

Compute the first inequality constraint $$x_0^3 - 0.5 > 0$$.

Parameters:

x_dv (ndarray) – The design variable vector.

Returns:

The value of the first inequality constraint.

Return type:

ndarray

static ineq_constraint1_jac(x_dv)[source]

Compute the gradient of the first inequality constraint.

Parameters:

x_dv (ndarray) – The design variable vector.

Returns:

The value of the gradient of the first inequality constraint.

Return type:

ndarray

static ineq_constraint2(x_dv)[source]

Compute the second inequality constraint $$x_1^3 - 0.5 > 0$$.

Parameters:

x_dv (ndarray) – The design variable vector.

Returns:

The value of the second inequality constraint.

Return type:

ndarray

static ineq_constraint2_jac(x_dv)[source]

Compute the gradient of the second inequality constraint.

Parameters:

x_dv (ndarray) – The design variable vector.

Returns:

The value of the gradient of the second inequality constraint.

Return type:

ndarray
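
The two inequality constraints share the same form, acting on $$x_0$$ and $$x_1$$ respectively, so their gradients are $$3x_i^2$$ in component $$i$$ and zero elsewhere. A NumPy sketch of both (an illustrative reimplementation of the documented formulas, not gemseo's code), with a finite-difference check:

```python
import numpy as np

def ineq_constraint(x_dv, i):
    """g_i(x) = x_i^3 - 0.5 for i in {0, 1} (the two inequality constraints)."""
    return np.array([x_dv[i] ** 3 - 0.5])

def ineq_constraint_jac(x_dv, i):
    """Gradient of g_i: 3 * x_i^2 in component i, zero elsewhere."""
    jac = np.zeros(3)
    jac[i] = 3.0 * x_dv[i] ** 2
    return jac

x = np.array([1.0, 2.0, 3.0])
eps = 1e-6
for i in (0, 1):
    fd = (ineq_constraint(x + eps * np.eye(3)[i], i) - ineq_constraint(x, i)) / eps
    print(ineq_constraint_jac(x, i)[i], fd[0])  # analytic vs finite difference
```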

pow2(x_dv)[source]

Compute the objective $$x_0^2 + x_1^2 + x_2^2$$.

Parameters:

x_dv (ndarray) – The design variable vector.

Returns:

The objective value.

Raises:

ValueError – When exception_error is True and the method has already been called three times.

Return type:

ndarray
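
A sketch of the objective together with the documented three-call error behaviour. The counter-based mechanism below is an assumption made for illustration; it reproduces the documented contract, not gemseo's actual implementation:

```python
import numpy as np

class Power2Sketch:
    """Illustrative stand-in: objective plus the documented call-count error."""

    def __init__(self, exception_error=False):
        self.exception_error = exception_error
        self._n_calls = 0  # assumed mechanism for the three-call limit

    def pow2(self, x_dv):
        """Objective x_0^2 + x_1^2 + x_2^2; errors after three calls if enabled."""
        if self.exception_error:
            if self._n_calls >= 3:
                raise ValueError("pow2 has already been called three times.")
            self._n_calls += 1
        return np.sum(x_dv ** 2)

pb = Power2Sketch()
print(pb.pow2(np.array([1.0, 1.0, 1.0])))  # 3.0
```

With `exception_error=True`, the first three calls succeed and the fourth raises `ValueError`, which is useful for testing an optimizer's error handling.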

static pow2_jac(x_dv)[source]

Compute the gradient of the objective.

Parameters:

x_dv (ndarray) – The design variable vector.

Returns:

The value of the objective gradient.

Return type:

ndarray
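
The gradient of the sum of squares $$x_0^2 + x_1^2 + x_2^2$$ is simply $$2x$$. A one-line sketch (illustrative, mirroring the documented name):

```python
import numpy as np

def pow2_jac(x_dv):
    """Gradient of x_0^2 + x_1^2 + x_2^2, i.e. 2 * x."""
    return 2.0 * x_dv

x = np.array([1.0, -2.0, 0.5])
print(pow2_jac(x))  # [ 2. -4.  1.]
```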

constraints: list[MDOFunction]

The constraints.

current_iter: int

The current iteration.

database: Database

The database to store the optimization problem data.

design_space: DesignSpace

The design space on which the optimization problem is solved.

eq_tolerance: float

The tolerance for the equality constraints.

fd_step: float

The finite differences step.

ineq_tolerance: float

The tolerance for the inequality constraints.

max_iter: int

The maximum number of iterations.

minimize_objective: bool

Whether to minimize the objective.

new_iter_observables: list[MDOFunction]

The observables to be called at each new iterate.

nonproc_constraints: list[MDOFunction]

The non-processed constraints.

nonproc_new_iter_observables: list[MDOFunction]

The non-processed observables to be called at each new iterate.

nonproc_objective: MDOFunction

The non-processed objective function.

nonproc_observables: list[MDOFunction]

The non-processed observables.

observables: list[MDOFunction]

The observables.

pb_type: str

The type of optimization problem.

preprocess_options: dict

The options to pre-process the functions.

solution: OptimizationResult

The solution of the optimization problem.

stop_if_nan: bool

Whether the optimization stops when a function returns NaN.

use_standardized_objective: bool

Whether to use the standardized objective for logging and post-processing.

The standardized objective corresponds to the original one expressed as a cost function to minimize. A DriverLibrary works with this standardized objective and the Database stores its values. However, for convenience, it may be more relevant to log the expression and the values of the original objective.
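
The relationship between an original objective and its standardized form can be illustrated with a toy maximization problem (a generic sketch, not gemseo code): maximizing $$f$$ is equivalent to minimizing $$-f$$, and both formulations locate the same design point.

```python
import numpy as np

# Original objective to MAXIMIZE: f(x) = -(x - 2)^2, peaked at x = 2.
f = lambda x: -(x - 2.0) ** 2

# Standardized objective: the same problem expressed as a cost to MINIMIZE.
f_std = lambda x: -f(x)

xs = np.linspace(0.0, 4.0, 401)
x_max = xs[np.argmax(f(xs))]      # maximizer of the original objective
x_min = xs[np.argmin(f_std(xs))]  # minimizer of the standardized one
print(x_max, x_min)  # both approximately 2.0
```

A driver works only with the minimization form, while logs may report the original objective for readability.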