lib_scipy module

scipy.optimize optimization library wrapper.

Classes:

ScipyOpt()

SciPy optimization library interface.

class gemseo.algos.opt.lib_scipy.ScipyOpt[source]

Bases: gemseo.algos.opt.opt_lib.OptimizationLibrary

SciPy optimization library interface.

See OptimizationLibrary.

Constructor.

Generate the library dictionary, which lists the available algorithms together with their characteristics (see the example below):

  • whether the algorithm requires gradients,

  • whether it handles equality constraints,

  • whether it handles inequality constraints.
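For instance, a minimal sketch of use, assuming the Rosenbrock benchmark problem (gemseo.problems.analytical.rosenbrock) and the OptimizersFactory (gemseo.algos.opt.opt_factory) shipped with GEMSEO; in practice this library is usually selected through that factory rather than instantiated directly:

    from gemseo.algos.opt.opt_factory import OptimizersFactory
    from gemseo.problems.analytical.rosenbrock import Rosenbrock

    # Build a benchmark optimization problem and solve it with one of the
    # scipy.optimize algorithms wrapped by ScipyOpt.
    problem = Rosenbrock()
    result = OptimizersFactory().execute(problem, "L-BFGS-B", max_iter=100)
    print(result.x_opt, result.f_opt)

The factory dispatches to ScipyOpt because "L-BFGS-B" is one of the algorithms declared in its library dictionary.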

Attributes:

COMPLEX_STEP_METHOD

DESCRIPTION

DIFFERENTIATION_METHODS

EQ_TOLERANCE

FINITE_DIFF_METHOD

F_TOL_ABS

F_TOL_REL

HANDLE_EQ_CONS

HANDLE_INEQ_CONS

INEQ_TOLERANCE

INTERNAL_NAME

LIB

LIB_COMPUTE_GRAD

LS_STEP_NB_MAX

LS_STEP_SIZE_MAX

MAX_DS_SIZE_PRINT

MAX_FUN_EVAL

MAX_ITER

MAX_TIME

NORMALIZE_DESIGN_SPACE_OPTION

OPTIONS_DIR

OPTIONS_MAP

PG_TOL

POSITIVE_CONSTRAINTS

PROBLEM_TYPE

REQUIRE_GRAD

ROUND_INTS_OPTION

STOP_CRIT_NX

USER_DEFINED_GRADIENT

USE_DATABASE_OPTION

VERBOSE

WEBSITE

X_TOL_ABS

X_TOL_REL

algorithms

The available algorithms.

Methods:

algorithm_handles_eqcstr(algo_name)

Returns True if the algorithm handles equality constraints.

algorithm_handles_ineqcstr(algo_name)

Returns True if the algorithm handles inequality constraints.

deactivate_progress_bar()

Deactivate the progress bar.

driver_has_option(option_key)

Check if the option key exists.

ensure_bounds(orig_func[, normalize])

Project the design vector onto the design space before execution.

execute(problem[, algo_name])

Executes the driver.

filter_adapted_algorithms(problem)

Filter the algorithms capable of solving the problem.

finalize_iter_observer()

Finalize the iteration observer.

get_optimum_from_database([message, status])

Retrieves the optimum from the database and builds an optimization result object from it.

get_right_sign_constraints()

Transforms the problem constraints into their opposite sign counterpart if the algorithm requires positive constraints.

get_x0_and_bounds_vects(normalize_ds)

Gets x0 and the bounds, normalized or not depending on the algorithm options, all as NumPy arrays.

init_iter_observer(max_iter, message)

Initialize the iteration observer.

init_options_grammar(algo_name)

Initialize the options grammar.

is_algo_requires_grad(algo_name)

Returns True if the algorithm requires a gradient evaluation.

is_algo_requires_positive_cstr(algo_name)

Returns True if the algorithm requires positive constraints, False otherwise.

is_algorithm_suited(algo_dict, problem)

Checks if the algorithm is suited to the problem according to its algo dict.

new_iteration_callback([x_vect])

Callback called at each new iteration; raises FtolReached if the defined relative or absolute function tolerance is reached.

COMPLEX_STEP_METHOD = 'complex_step'
DESCRIPTION = 'description'
DIFFERENTIATION_METHODS = ['user', 'complex_step', 'finite_differences']
EQ_TOLERANCE = 'eq_tolerance'
FINITE_DIFF_METHOD = 'finite_differences'
F_TOL_ABS = 'ftol_abs'
F_TOL_REL = 'ftol_rel'
HANDLE_EQ_CONS = 'handle_equality_constraints'
HANDLE_INEQ_CONS = 'handle_inequality_constraints'
INEQ_TOLERANCE = 'ineq_tolerance'
INTERNAL_NAME = 'internal_algo_name'
LIB = 'lib'
LIB_COMPUTE_GRAD = True
LS_STEP_NB_MAX = 'max_ls_step_nb'
LS_STEP_SIZE_MAX = 'max_ls_step_size'
MAX_DS_SIZE_PRINT = 40
MAX_FUN_EVAL = 'max_fun_eval'
MAX_ITER = 'max_iter'
MAX_TIME = 'max_time'
NORMALIZE_DESIGN_SPACE_OPTION = 'normalize_design_space'
OPTIONS_DIR = 'options'
OPTIONS_MAP = {'max_fun_eval': 'maxfun', 'max_ls_step_nb': 'maxls', 'max_ls_step_size': 'stepmx', 'pg_tol': 'gtol'}
PG_TOL = 'pg_tol'
POSITIVE_CONSTRAINTS = 'positive_constraints'
PROBLEM_TYPE = 'problem_type'
REQUIRE_GRAD = 'require_grad'
ROUND_INTS_OPTION = 'round_ints'
STOP_CRIT_NX = 'stop_crit_n_x'
USER_DEFINED_GRADIENT = 'user'
USE_DATABASE_OPTION = 'use_database'
VERBOSE = 'verbose'
WEBSITE = 'website'
X_TOL_ABS = 'xtol_abs'
X_TOL_REL = 'xtol_rel'
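OPTIONS_MAP translates the GEMSEO-level option names above into the names expected by scipy.optimize. A small sketch:

    from gemseo.algos.opt.lib_scipy import ScipyOpt

    lib = ScipyOpt()
    # GEMSEO option names are mapped to their scipy.optimize counterparts,
    # e.g. max_ls_step_nb -> maxls and pg_tol -> gtol (see OPTIONS_MAP above).
    for gemseo_name, scipy_name in lib.OPTIONS_MAP.items():
        print(gemseo_name, "->", scipy_name)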
algorithm_handles_eqcstr(algo_name)

Returns True if the algorithm handles equality constraints.

Parameters

algo_name – the name of the algorithm

Returns

True or False

algorithm_handles_ineqcstr(algo_name)

Returns True if the algorithm handles inequality constraints.

Parameters

algo_name – the name of the algorithm

Returns

True or False
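As an illustration, assuming SLSQP and L-BFGS-B are among the wrapped algorithms (as in standard scipy.optimize):

    from gemseo.algos.opt.lib_scipy import ScipyOpt

    lib = ScipyOpt()
    # SLSQP supports equality and inequality constraints,
    # whereas L-BFGS-B only handles bound constraints.
    print(lib.algorithm_handles_eqcstr("SLSQP"))     # True
    print(lib.algorithm_handles_eqcstr("L-BFGS-B"))  # False
    print(lib.algorithm_handles_ineqcstr("SLSQP"))   # True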

property algorithms

The available algorithms.

deactivate_progress_bar()

Deactivate the progress bar.

Return type

None

driver_has_option(option_key)

Check if the option key exists.

Parameters

option_key (str) – The name of the option.

Returns

Whether the option is in the grammar.

Return type

bool
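A short sketch, assuming max_iter is declared in the SLSQP options grammar; the options grammar must be initialized first with init_options_grammar():

    from gemseo.algos.opt.lib_scipy import ScipyOpt

    lib = ScipyOpt()
    lib.init_options_grammar("SLSQP")
    # The option key is looked up in the options grammar of the algorithm.
    print(lib.driver_has_option("max_iter"))       # True (assumed)
    print(lib.driver_has_option("not_an_option"))  # False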

ensure_bounds(orig_func, normalize=True)

Project the design vector onto the design space before execution.

Parameters
  • orig_func – the original function

  • normalize

    if True, use the normalized design space

    By default it is set to True.

Returns

the wrapped function

execute(problem, algo_name=None, **options)

Executes the driver.

Parameters
  • problem – the problem to be solved

  • algo_name

    The name of the algorithm; if None, use self.algo_name, which may have been set by the factory.

    By default it is set to None.

  • options – the options dict for the algorithm
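A minimal usage sketch, reusing the Rosenbrock benchmark problem purely for illustration; the option names are taken from the constants listed above:

    from gemseo.algos.opt.lib_scipy import ScipyOpt
    from gemseo.problems.analytical.rosenbrock import Rosenbrock

    problem = Rosenbrock()
    lib = ScipyOpt()
    # algo_name selects one of the wrapped scipy.optimize algorithms;
    # the remaining keyword arguments are the algorithm options.
    result = lib.execute(problem, algo_name="SLSQP", max_iter=50, ftol_rel=1e-9)
    print(result.f_opt)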

filter_adapted_algorithms(problem)

Filter the algorithms capable of solving the problem.

Parameters

problem (Any) – The opt_problem to be solved.

Returns

The names of the adapted algorithms.

Return type

list(str)
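For example, with an unconstrained problem such as the Rosenbrock benchmark (used here purely for illustration), all the wrapped algorithms are expected to be returned:

    from gemseo.algos.opt.lib_scipy import ScipyOpt
    from gemseo.problems.analytical.rosenbrock import Rosenbrock

    # Keep only the algorithms whose characteristics (constraint handling,
    # gradient requirement, problem type, ...) match the problem.
    adapted = ScipyOpt().filter_adapted_algorithms(Rosenbrock())
    print(adapted)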

finalize_iter_observer()

Finalize the iteration observer.

Return type

None

get_optimum_from_database(message=None, status=None)

Retrieves the optimum from the database and builds an optimization result object from it.

Parameters
  • message – By default it is set to None.

  • status – By default it is set to None.
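A sketch, assuming the problem database has already been filled by a previous call to execute(); the message passed here is purely illustrative:

    from gemseo.algos.opt.lib_scipy import ScipyOpt
    from gemseo.problems.analytical.rosenbrock import Rosenbrock

    problem = Rosenbrock()
    lib = ScipyOpt()
    lib.execute(problem, algo_name="L-BFGS-B", max_iter=100)
    # Rebuild an optimization result object from the problem database.
    result = lib.get_optimum_from_database(message="rebuilt from the database")
    print(result.f_opt)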

get_right_sign_constraints()

Transforms the problem constraints into their opposite sign counterpart if the algorithm requires positive constraints.

get_x0_and_bounds_vects(normalize_ds)

Gets x0 and the bounds, normalized or not depending on the algorithm options, all as NumPy arrays.

Parameters

normalize_ds – if True, normalize all the input variables that are not integers, according to the design space normalization policy

Returns

x0, the lower bounds and the upper bounds
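A sketch, assuming the library has already been bound to a problem (the problem attribute is normally set by execute()):

    from gemseo.algos.opt.lib_scipy import ScipyOpt
    from gemseo.problems.analytical.rosenbrock import Rosenbrock

    lib = ScipyOpt()
    lib.problem = Rosenbrock()  # normally done by execute()
    # Retrieve the current design vector and the bounds without normalization.
    x_0, lower_bnd, upper_bnd = lib.get_x0_and_bounds_vects(normalize_ds=False)
    print(x_0, lower_bnd, upper_bnd)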

init_iter_observer(max_iter, message)

Initialize the iteration observer.

It will handle the stopping criterion and the logging of the progress bar.

Parameters
  • max_iter (int) – The maximum number of iterations.

  • message (str) – The message to display at the beginning.

Raises

ValueError – If the max_iter is not greater than or equal to one.

Return type

None

init_options_grammar(algo_name)

Initialize the options grammar.

Parameters

algo_name (str) – The name of the algorithm.

Return type

gemseo.core.grammars.json_grammar.JSONGrammar

is_algo_requires_grad(algo_name)

Returns True if the algorithm requires a gradient evaluation.

Parameters

algo_name – name of the algorithm

is_algo_requires_positive_cstr(algo_name)

Returns True if the algorithm requires positive constraints, False otherwise.

Parameters

algo_name – the name of the algorithm

Returns

True if constraints must be positive

Return type

bool

static is_algorithm_suited(algo_dict, problem)

Checks if the algorithm is suited to the problem according to its algo dict.

Parameters
  • algo_dict – the algorithm characteristics

  • problem – the opt_problem to be solved

new_iteration_callback(x_vect=None)

Callback called at each new iteration.

Parameters

x_vect (Optional[numpy.ndarray]) –

By default it is set to None.

Raises
  • FtolReached – If the defined relative or absolute function tolerance is reached.

  • XtolReached – If the defined relative or absolute x tolerance is reached.

Return type

None