

optimization_library module

Base class for optimization library wrappers.

class gemseo.algos.opt.optimization_library.OptimizationAlgorithmDescription(algorithm_name, internal_algorithm_name, library_name='', description='', website='', handle_integer_variables=False, require_gradient=False, handle_equality_constraints=False, handle_inequality_constraints=False, handle_multiobjective=False, positive_constraints=False, problem_type=ProblemType.NON_LINEAR)[source]

Bases: DriverDescription

The description of an optimization algorithm.

Parameters:
  • algorithm_name (str) –

  • internal_algorithm_name (str) –

  • library_name (str) –

    By default it is set to “”.

  • description (str) –

    By default it is set to “”.

  • website (str) –

    By default it is set to “”.

  • handle_integer_variables (bool) –

    By default it is set to False.

  • require_gradient (bool) –

    By default it is set to False.

  • handle_equality_constraints (bool) –

    By default it is set to False.

  • handle_inequality_constraints (bool) –

    By default it is set to False.

  • handle_multiobjective (bool) –

    By default it is set to False.

  • positive_constraints (bool) –

    By default it is set to False.

  • problem_type (ProblemType) –

    By default it is set to “non-linear”.

algorithm_name: str

The name of the algorithm in GEMSEO.

description: str = ''

A description of the algorithm.

handle_equality_constraints: bool = False

Whether the optimization algorithm handles equality constraints.

handle_inequality_constraints: bool = False

Whether the optimization algorithm handles inequality constraints.

handle_integer_variables: bool = False

Whether the optimization algorithm handles integer variables.

handle_multiobjective: bool = False

Whether the optimization algorithm handles multiple objectives.

internal_algorithm_name: str

The name of the algorithm in the wrapped library.

library_name: str = ''

The name of the wrapped library.

positive_constraints: bool = False

Whether the optimization algorithm requires positive constraints.

problem_type: ProblemType = 'non-linear'

The type of problem (see OptimizationProblem.ProblemType).

require_gradient: bool = False

Whether the optimization algorithm requires the gradient.

website: str = ''

The website of the wrapped library or algorithm.

class gemseo.algos.opt.optimization_library.OptimizationLibrary[source]

Bases: DriverLibrary

Base optimization library defining a collection of optimization algorithms.

Typically used as:

  1. Instantiate an OptimizationLibrary.

  2. Select the algorithm with algo_name.

  3. Solve an OptimizationProblem with execute().

Notes

The missing current values of the DesignSpace attached to the OptimizationProblem are automatically initialized with the method DesignSpace.initialize_missing_current_values().
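A minimal sketch of these three steps, assuming the SciPy-based "SLSQP" algorithm and the Rosenbrock benchmark problem are available in your GEMSEO installation (import paths and algorithm names may differ between versions):

    from gemseo.algos.opt.opt_factory import OptimizersFactory
    from gemseo.problems.analytical.rosenbrock import Rosenbrock

    problem = Rosenbrock()  # an OptimizationProblem with a DesignSpace attached
    library = OptimizersFactory().create("SLSQP")  # steps 1 and 2: instantiate the library and select the algorithm
    result = library.execute(problem, max_iter=50)  # step 3: solve the problem
    print(result.x_opt, result.f_opt)

The factory call is a convenience: it instantiates the OptimizationLibrary subclass wrapping SLSQP and sets its algo_name attribute, which execute() then uses.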

class ApproximationMode(value)

Bases: StrEnum

The approximation derivation modes.

CENTERED_DIFFERENCES = 'centered_differences'

The centered differences method used to approximate the Jacobians by perturbing each variable with a small real number.

COMPLEX_STEP = 'complex_step'

The complex step method used to approximate the Jacobians by perturbing each variable with a small complex number.

FINITE_DIFFERENCES = 'finite_differences'

The finite differences method used to approximate the Jacobians by perturbing each variable with a small real number.

class DifferentiationMethod(value)

Bases: StrEnum

The differentiation methods.

CENTERED_DIFFERENCES = 'centered_differences'
COMPLEX_STEP = 'complex_step'
FINITE_DIFFERENCES = 'finite_differences'
USER_GRAD = 'user'
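Both enumerations are StrEnum subclasses, so their members can be used wherever a plain string value is expected, for instance to select a gradient approximation scheme. A short sketch (the differentiation_method attribute of OptimizationProblem is an assumption about the surrounding API):

    from gemseo.algos.opt.optimization_library import OptimizationLibrary

    mode = OptimizationLibrary.DifferentiationMethod.FINITE_DIFFERENCES
    print(mode == "finite_differences")  # True: StrEnum members compare equal to their string values

    # problem.differentiation_method = mode  # hand the mode to an OptimizationProblem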
algorithm_handles_eqcstr(algo_name)[source]

Check if an algorithm handles equality constraints.

Parameters:

algo_name (str) – The name of the algorithm.

Returns:

Whether the algorithm handles equality constraints.

Return type:

bool

algorithm_handles_ineqcstr(algo_name)[source]

Check if an algorithm handles inequality constraints.

Parameters:

algo_name (str) – The name of the algorithm.

Returns:

Whether the algorithm handles inequality constraints.

Return type:

bool
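A short sketch of these capability checks, assuming an algorithm named "SLSQP" is available through OptimizersFactory:

    from gemseo.algos.opt.opt_factory import OptimizersFactory

    library = OptimizersFactory().create("SLSQP")
    print(library.algorithm_handles_eqcstr("SLSQP"))    # does it handle equality constraints?
    print(library.algorithm_handles_ineqcstr("SLSQP"))  # does it handle inequality constraints?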

clear_listeners()

Remove the listeners from the database.

Return type:

None

deactivate_progress_bar()

Deactivate the progress bar.

Return type:

None

driver_has_option(option_name)

Check the existence of an option.

Parameters:

option_name (str) – The name of the option.

Returns:

Whether the option exists.

Return type:

bool

ensure_bounds(orig_func, normalize=True)

Project the design vector onto the design space before execution.

Parameters:
  • orig_func – The original function.

  • normalize (bool) –

    Whether to use the normalized design space.

    By default it is set to True.

Returns:

A function calling the original function with the input data projected onto the design space.
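A minimal sketch of ensure_bounds(), assuming the "SLSQP" wrapper and the Rosenbrock benchmark problem are available; the original function is called with its input projected onto the design space bounds:

    import numpy as np
    from gemseo.algos.opt.opt_factory import OptimizersFactory
    from gemseo.problems.analytical.rosenbrock import Rosenbrock

    library = OptimizersFactory().create("SLSQP")
    library.problem = Rosenbrock()  # normally set by execute(); assigned directly here for illustration

    def objective(x):
        return float(np.sum(x ** 2))

    bounded_objective = library.ensure_bounds(objective, normalize=False)
    print(bounded_objective(np.array([10.0, -10.0])))  # evaluated at the projected point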

execute(problem, algo_name=None, eval_obs_jac=False, skip_int_check=False, max_design_space_dimension_to_log=40, **options)

Execute the driver.

Parameters:
  • problem (OptimizationProblem) – The problem to be solved.

  • algo_name (str | None) – The name of the algorithm. If None, use the algo_name attribute which may have been set by the factory.

  • eval_obs_jac (bool) –

    Whether to evaluate the Jacobian of the observables.

    By default it is set to False.

  • skip_int_check (bool) –

    Whether to skip the integer variable handling check of the selected algorithm.

    By default it is set to False.

  • max_design_space_dimension_to_log (int) –

The maximum dimension of a design space to be logged. If the dimension of the design space exceeds this value, the design space is not logged.

    By default it is set to 40.

  • **options (DriverLibOptionType) – The options for the algorithm.

Returns:

The optimization result.

Raises:

ValueError – If algo_name was neither set by the factory nor passed as an argument.

Return type:

OptimizationResult
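A hedged sketch of passing options to execute(); the option names used here (max_iter, normalize_design_space, use_database) correspond to the class constants listed further below, while the "SLSQP" algorithm and Rosenbrock problem are assumptions about your installation:

    from gemseo.algos.opt.opt_factory import OptimizersFactory
    from gemseo.problems.analytical.rosenbrock import Rosenbrock

    library = OptimizersFactory().create("SLSQP")
    result = library.execute(
        Rosenbrock(),
        max_iter=100,
        normalize_design_space=True,
        use_database=True,
    )
    print(result)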

filter_adapted_algorithms(problem)

Filter the algorithms capable of solving the problem.

Parameters:

problem (Any) – The problem to be solved.

Returns:

The names of the algorithms adapted to this problem.

Return type:

list[str]
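A short sketch of filter_adapted_algorithms(), assuming the "SLSQP" wrapper and Rosenbrock problem are available; only the algorithms of this library whose description is suited to the problem are returned:

    from gemseo.algos.opt.opt_factory import OptimizersFactory
    from gemseo.problems.analytical.rosenbrock import Rosenbrock

    library = OptimizersFactory().create("SLSQP")
    print(library.filter_adapted_algorithms(Rosenbrock()))  # names of the adapted algorithms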

finalize_iter_observer()

Finalize the iteration observer.

Return type:

None

get_optimum_from_database(message=None, status=None)

Return the optimization result from the database.

Return type:

OptimizationResult

get_right_sign_constraints()[source]

Transform the problem constraints into their opposite-sign counterparts.

This is done if the algorithm requires positive constraints.

get_x0_and_bounds_vects(normalize_ds, as_dict=False)

Return the initial design variable values and their lower and upper bounds.

Parameters:
  • normalize_ds (bool) – Whether to normalize the design variables.

  • as_dict (bool) –

    Whether to return dictionaries instead of NumPy arrays.

    By default it is set to False.

Returns:

The initial values of the design variables, their lower bounds, and their upper bounds.

Return type:

tuple[ndarray, ndarray, ndarray] | tuple[dict[str, ndarray], dict[str, ndarray], dict[str, ndarray]]
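A sketch of get_x0_and_bounds_vects(), assuming the library has been attached to a problem (names as in the previous examples):

    from gemseo.algos.opt.opt_factory import OptimizersFactory
    from gemseo.problems.analytical.rosenbrock import Rosenbrock

    library = OptimizersFactory().create("SLSQP")
    library.problem = Rosenbrock()  # normally set by execute()
    x_0, lower_bounds, upper_bounds = library.get_x0_and_bounds_vects(normalize_ds=False)
    print(x_0, lower_bounds, upper_bounds)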

init_iter_observer(max_iter, message='')

Initialize the iteration observer.

It will handle the stopping criterion and the logging of the progress bar.

Parameters:
  • max_iter (int) – The maximum number of iterations.

  • message (str) –

    The message to display at the beginning of the progress bar status.

    By default it is set to “”.

Raises:

ValueError – If max_iter is lower than one.

Return type:

None

init_options_grammar(algo_name)

Initialize the options’ grammar.

Parameters:

algo_name (str) – The name of the algorithm.

Return type:

JSONGrammar

is_algo_requires_positive_cstr(algo_name)[source]

Check if an algorithm requires positive constraints.

Parameters:

algo_name (str) – The name of the algorithm.

Returns:

Whether the algorithm requires positive constraints.

Return type:

bool

classmethod is_algorithm_suited(algorithm_description, problem)

Check if an algorithm is suited to a problem according to its description.

Parameters:
  • algorithm_description (AlgorithmDescription) – The description of the algorithm.

  • problem (Any) – The problem to be solved.

Returns:

Whether the algorithm is suited to the problem.

Return type:

bool
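A sketch of is_algorithm_suited(), using one of the descriptions stored in the library (the "SLSQP" key is an assumption about the wrapped library):

    from gemseo.algos.opt.opt_factory import OptimizersFactory
    from gemseo.problems.analytical.rosenbrock import Rosenbrock

    library = OptimizersFactory().create("SLSQP")
    description = library.descriptions["SLSQP"]
    print(library.is_algorithm_suited(description, Rosenbrock()))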

new_iteration_callback(x_vect)[source]

Verify the design variable and objective value stopping criteria.

Parameters:

x_vect (ndarray) – The design variable values.

Raises:
  • FtolReached – If the defined relative or absolute function tolerance is reached.

  • XtolReached – If the defined relative or absolute x tolerance is reached.

Return type:

None

requires_gradient(driver_name)

Check if a driver requires the gradient.

Parameters:

driver_name (str) – The name of the driver.

Returns:

Whether the driver requires the gradient.

Return type:

bool
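A sketch of requires_gradient(); the driver name "SLSQP" is an assumption, chosen because SLSQP is a gradient-based method:

    from gemseo.algos.opt.opt_factory import OptimizersFactory

    library = OptimizersFactory().create("SLSQP")
    print(library.requires_gradient("SLSQP"))  # True for a gradient-based algorithm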

EQ_TOLERANCE = 'eq_tolerance'
EVAL_OBS_JAC_OPTION = 'eval_obs_jac'
F_TOL_ABS = 'ftol_abs'
F_TOL_REL = 'ftol_rel'
INEQ_TOLERANCE = 'ineq_tolerance'
LIBRARY_NAME: ClassVar[str | None] = None

The name of the interfaced library.

LS_STEP_NB_MAX = 'max_ls_step_nb'
LS_STEP_SIZE_MAX = 'max_ls_step_size'
MAX_FUN_EVAL = 'max_fun_eval'
MAX_ITER = 'max_iter'
MAX_TIME = 'max_time'
NORMALIZE_DESIGN_SPACE_OPTION = 'normalize_design_space'
OPTIONS_DIR: ClassVar[str | Path] = 'options'

The name of the directory containing the option grammar files.

OPTIONS_MAP: ClassVar[dict[str, str]] = {}

The names of the options in GEMSEO mapped to their names in the wrapped library.

PG_TOL = 'pg_tol'
ROUND_INTS_OPTION = 'round_ints'
SCALING_THRESHOLD: Final[str] = 'scaling_threshold'
STOP_CRIT_NX = 'stop_crit_n_x'
USE_DATABASE_OPTION = 'use_database'
USE_ONE_LINE_PROGRESS_BAR: ClassVar[bool] = False

Whether to use a one-line progress bar.

VERBOSE = 'verbose'
X_TOL_ABS = 'xtol_abs'
X_TOL_REL = 'xtol_rel'
activate_progress_bar: ClassVar[bool] = True

Whether to activate the progress bar in the optimization log.

algo_name: str | None

The name of the algorithm currently in use.

property algorithms: list[str]

The available algorithms.

descriptions: dict[str, AlgorithmDescription]

The descriptions of the algorithms contained in the library.

internal_algo_name: str | None

The internal name of the algorithm currently in use.

It typically corresponds to the name of the algorithm in the wrapped library, if any.

opt_grammar: JSONGrammar | None

The grammar defining the options of the current algorithm.

problem: OptimizationProblem

The optimization problem the driver library is bound to.