lib_scipy_global module

scipy.optimize global optimization library wrapper.

Classes:

ScipyGlobalOpt()

SciPy global optimization library interface.

class gemseo.algos.opt.lib_scipy_global.ScipyGlobalOpt[source]

Bases: gemseo.algos.opt.opt_lib.OptimizationLibrary

SciPy global optimization library interface.

See OptimizationLibrary.

Constructor.

Generate the library dictionary, which contains the list of algorithms and their characteristics (a sketch of one such entry is given after this list):

  • whether it requires gradients

  • whether it handles equality constraints

  • whether it handles inequality constraints
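
As a rough illustration, one entry of that library dictionary can be pictured as below. The algorithm name DIFFERENTIAL_EVOLUTION, the internal SciPy name and the flag values are assumptions made for the example rather than the actual registry content of a given gemseo version; only the key strings come from the class constants documented on this page.

    # Hypothetical sketch of one entry of the library dictionary built by the
    # constructor; the keys reuse the class constants listed below, the values
    # are illustrative only.
    lib_dict_sketch = {
        "DIFFERENTIAL_EVOLUTION": {  # assumed algorithm name
            "internal_algo_name": "differential_evolution",  # INTERNAL_NAME
            "lib": "SciPy",  # LIB
            "require_grad": False,  # REQUIRE_GRAD
            "handle_equality_constraints": False,  # HANDLE_EQ_CONS
            "handle_inequality_constraints": False,  # HANDLE_INEQ_CONS
            "description": "Differential evolution from scipy.optimize",  # DESCRIPTION
            "website": "https://docs.scipy.org/doc/scipy/reference/optimize.html",  # WEBSITE
        }
    }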

Attributes:

COMPLEX_STEP_METHOD

DESCRIPTION

DIFFERENTIATION_METHODS

EQ_TOLERANCE

FINITE_DIFF_METHOD

F_TOL_ABS

F_TOL_REL

HANDLE_EQ_CONS

HANDLE_INEQ_CONS

INEQ_TOLERANCE

INTERNAL_NAME

LIB

LIB_COMPUTE_GRAD

LS_STEP_NB_MAX

LS_STEP_SIZE_MAX

MAX_DS_SIZE_PRINT

MAX_FUN_EVAL

MAX_ITER

MAX_TIME

NORMALIZE_DESIGN_SPACE_OPTION

OPTIONS_DIR

OPTIONS_MAP

PG_TOL

POSITIVE_CONSTRAINTS

PROBLEM_TYPE

REQUIRE_GRAD

ROUND_INTS_OPTION

STOP_CRIT_NX

USER_DEFINED_GRADIENT

USE_DATABASE_OPTION

VERBOSE

WEBSITE

X_TOL_ABS

X_TOL_REL

algorithms

Return the available algorithms.

Methods:

algorithm_handles_eqcstr(algo_name)

Returns True if the algorithm handles equality constraints.

algorithm_handles_ineqcstr(algo_name)

Returns True if the algorithm handles inequality constraints.

driver_has_option(option_key)

Checks if the option key exists.

ensure_bounds(orig_func[, normalize])

Project the design vector onto the design space before execution.

execute(problem[, algo_name])

Executes the driver.

filter_adapted_algorithms(problem)

Filters the algorithms capable of solving the problem.

finalize_iter_observer()

Finalize the iteration observer.

get_optimum_from_database([message, status])

Retrieves the optimum from the database and builds an optimization result object from it.

get_right_sign_constraints()

Transforms the problem constraints into their opposite sign counterpart if the algorithm requires positive constraints.

get_x0_and_bounds_vects(normalize_ds)

Gets x0 and the bounds, normalized or not depending on the algorithm options, all as numpy arrays.

init_iter_observer(max_iter, message)

Initialize the iteration observer.

init_options_grammar(algo_name)

Initializes the options grammar.

is_algo_requires_grad(algo_name)

Returns True if the algorithm requires a gradient evaluation.

is_algo_requires_positive_cstr(algo_name)

Returns True if the algorithm requires positive constraints, False otherwise.

is_algorithm_suited(algo_dict, problem)

Checks if the algorithm is suited to the problem according to its algo dict.

new_iteration_callback()

Callback called at each new iteration, i.e., every time a design vector that is not already in the database is proposed by the optimizer.

COMPLEX_STEP_METHOD = 'complex_step'
DESCRIPTION = 'description'
DIFFERENTIATION_METHODS = ['user', 'complex_step', 'finite_differences']
EQ_TOLERANCE = 'eq_tolerance'
FINITE_DIFF_METHOD = 'finite_differences'
F_TOL_ABS = 'ftol_abs'
F_TOL_REL = 'ftol_rel'
HANDLE_EQ_CONS = 'handle_equality_constraints'
HANDLE_INEQ_CONS = 'handle_inequality_constraints'
INEQ_TOLERANCE = 'ineq_tolerance'
INTERNAL_NAME = 'internal_algo_name'
LIB = 'lib'
LIB_COMPUTE_GRAD = True
LS_STEP_NB_MAX = 'max_ls_step_nb'
LS_STEP_SIZE_MAX = 'max_ls_step_size'
MAX_DS_SIZE_PRINT = 40
MAX_FUN_EVAL = 'max_fun_eval'
MAX_ITER = 'max_iter'
MAX_TIME = 'max_time'
NORMALIZE_DESIGN_SPACE_OPTION = 'normalize_design_space'
OPTIONS_DIR = 'options'
OPTIONS_MAP = {}
PG_TOL = 'pg_tol'
POSITIVE_CONSTRAINTS = 'positive_constraints'
PROBLEM_TYPE = 'problem_type'
REQUIRE_GRAD = 'require_grad'
ROUND_INTS_OPTION = 'round_ints'
STOP_CRIT_NX = 'stop_crit_n_x'
USER_DEFINED_GRADIENT = 'user'
USE_DATABASE_OPTION = 'use_database'
VERBOSE = 'verbose'
WEBSITE = 'website'
X_TOL_ABS = 'xtol_abs'
X_TOL_REL = 'xtol_rel'
algorithm_handles_eqcstr(algo_name)

Returns True if the algorithm handles equality constraints.

Parameters

algo_name – the name of the algorithm

Returns

True or False

algorithm_handles_ineqcstr(algo_name)

Returns True if the algorithm handles inequality constraints.

Parameters

algo_name – the name of the algorithm

Returns

True or False

property algorithms

Return the available algorithms.

driver_has_option(option_key)

Checks if the option key exists.

Parameters

option_key – the name of the option

Returns

True if the option is in the grammar

ensure_bounds(orig_func, normalize=True)

Project the design vector onto the design space before execution (see the conceptual sketch below).

Parameters
  • orig_func – the original function

  • normalize – if True, use the normalized design space

Returns

the wrapped function
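
Conceptually, the returned wrapper clips the incoming design vector onto the variable bounds before evaluating the original function. Below is a minimal stand-alone sketch of that idea, not the library's actual implementation; the objective function, the bounds and their values are hypothetical.

    import numpy as np

    def ensure_bounds_sketch(orig_func, lower, upper):
        # Return a wrapper that projects x onto [lower, upper] before calling orig_func.
        def wrapped(x_vect):
            x_proj = np.clip(x_vect, lower, upper)  # component-wise projection onto the bounds
            return orig_func(x_proj)
        return wrapped

    # Hypothetical usage: protect a function against out-of-bounds trial points.
    safe_f = ensure_bounds_sketch(lambda x: float(np.sum(x ** 2)), np.zeros(2), np.ones(2))
    print(safe_f(np.array([1.5, -0.3])))  # evaluated at the projected point [1.0, 0.0] -> 1.0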

execute(problem, algo_name=None, **options)

Executes the driver (see the usage sketch after the parameter list).

Parameters
  • problem – the problem to be solved

  • algo_name – the name of the algorithm; if None, use self.algo_name, which may have been set by the factory (Default value = None)

  • options – the options dict for the algorithm
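
A minimal usage sketch, assuming the Rosenbrock benchmark problem shipped with gemseo, assuming that DIFFERENTIAL_EVOLUTION is one of the registered algorithm names in the installed version (the algorithms property can be checked first), and assuming that execute returns gemseo's usual OptimizationResult object. The max_iter and normalize_design_space option keys are documented above; their values here are arbitrary.

    from gemseo.algos.opt.lib_scipy_global import ScipyGlobalOpt
    from gemseo.problems.analytical.rosenbrock import Rosenbrock  # assumed import path

    problem = Rosenbrock()  # unconstrained benchmark problem with a known optimum
    lib = ScipyGlobalOpt()
    print(lib.algorithms)  # algorithm names actually available in this installation

    result = lib.execute(
        problem,
        algo_name="DIFFERENTIAL_EVOLUTION",  # assumed algorithm name
        max_iter=1000,
        normalize_design_space=True,
    )
    print(result.x_opt, result.f_opt)  # optimum found and its objective value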

filter_adapted_algorithms(problem)

Filters the algorithms capable of solving the problem.

Parameters

problem – the opt_problem to be solved

Returns

the list of adapted algorithms names
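
For instance, to check which of the wrapped SciPy global algorithms can handle a given problem before running it (reusing the assumed Rosenbrock benchmark from the previous sketch):

    from gemseo.algos.opt.lib_scipy_global import ScipyGlobalOpt
    from gemseo.problems.analytical.rosenbrock import Rosenbrock  # assumed import path

    lib = ScipyGlobalOpt()
    adapted_names = lib.filter_adapted_algorithms(Rosenbrock())
    print(adapted_names)  # names of the wrapped algorithms suited to this problem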

finalize_iter_observer()

Finalize the iteration observer.

get_optimum_from_database(message=None, status=None)

Retrieves the optimum from the database and builds an optimization result object from it.

Parameters
  • message – (Default value = None)

  • status – (Default value = None)

get_right_sign_constraints()

Transforms the problem constraints into their opposite sign counterpart if the algorithm requires positive constraints.
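
The sign convention can be illustrated with plain Python callables (gemseo actually manipulates its own function objects): if the optimizer expects constraints of the form g(x) >= 0 while the problem stores them as g(x) <= 0, each constraint is simply negated. A minimal sketch with an assumed constraint function:

    def g_leq(x):
        # Constraint as stored in the problem: feasible when g_leq(x) <= 0.
        return x[0] + x[1] - 1.0

    def flip_sign(constraint):
        # Return the opposite-sign counterpart: feasible when the result is >= 0.
        return lambda x: -constraint(x)

    g_geq = flip_sign(g_leq)
    print(g_leq([0.25, 0.25]), g_geq([0.25, 0.25]))  # -0.5 and 0.5: same feasibility, opposite sign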

get_x0_and_bounds_vects(normalize_ds)

Gets x0 and the bounds, normalized or not depending on the algorithm options, all as numpy arrays.

Parameters

normalize_ds – if True, normalizes all input variables that are not integers, according to the design space normalization policy

Returns

x, lower bounds, upper bounds

init_iter_observer(max_iter, message)

Initialize the iteration observer.

It will handle the stopping criterion and the logging of the progress bar.

Parameters
  • max_iter – maximum number of calls

  • message – message to display at the beginning

init_options_grammar(algo_name)

Initializes the options grammar.

Parameters

algo_name – name of the algorithm

is_algo_requires_grad(algo_name)

Returns True if the algorithm requires a gradient evaluation.

Parameters

algo_name – name of the algorithm

is_algo_requires_positive_cstr(algo_name)

Returns True if the algorithm requires positive constraints, False otherwise.

Parameters

algo_name – the name of the algorithm

Returns

True if constraints must be positive

Return type

bool

static is_algorithm_suited(algo_dict, problem)

Checks if the algorithm is suited to the problem according to its algo dict (see the sketch after the parameter list).

Parameters
  • algo_dict – the algorithm characteristics

  • problem – the opt_problem to be solved
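
The kind of check this performs can be sketched as follows. The dictionary keys reuse the class constants documented above; the problem methods has_eq_constraints and has_ineq_constraints are assumed here for the purpose of the illustration.

    def is_algorithm_suited_sketch(algo_dict, problem):
        # Sketch of a suitability check driven by the algorithm characteristics dict.
        if problem.has_eq_constraints() and not algo_dict.get("handle_equality_constraints", False):
            return False  # the algorithm cannot handle the problem's equality constraints
        if problem.has_ineq_constraints() and not algo_dict.get("handle_inequality_constraints", False):
            return False  # the algorithm cannot handle the problem's inequality constraints
        return True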

new_iteration_callback()

Callback called at each new iteration, i.e., every time a design vector that is not already in the database is proposed by the optimizer.

Iterates the progress bar and implements the stop criteria.