lib_pseven module

Wrapper for the Generic Tool for Optimization (GTOpt) of pSeven Core.

Classes:

DiffScheme(value)

The differentiation schemes of pSeven.

DiffType(value)

The differentiation types of pSeven.

GlobalMethod(value)

The globalization methods of pSeven.

LogLevel(value)

The logging levels of pSeven.

PSevenOpt()

Interface for the Generic Tool for Optimization (GTOpt) of pSeven Core.

Smoothness(value)

The function smoothness levels of pSeven.

class gemseo.algos.opt.lib_pseven.DiffScheme(value)[source]

Bases: gemseo.utils.base_enum.CamelCaseEnum

The differentiation schemes of pSeven.

Attributes:

ADAPTIVE

AUTO

FIRST_ORDER

SECOND_ORDER

ADAPTIVE = 2
AUTO = 3
FIRST_ORDER = 0
SECOND_ORDER = 1
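The base class name CamelCaseEnum suggests that a member's string form is its UPPER_SNAKE_CASE name converted to CamelCase, which is the format pSeven expects for option values such as GTOpt/DiffScheme. This is a minimal stand-in sketching that assumed behavior; the real gemseo.utils.base_enum.CamelCaseEnum may differ in detail.

```python
from enum import Enum


class CamelCaseEnum(Enum):
    """Stand-in for gemseo.utils.base_enum.CamelCaseEnum (assumed behavior)."""

    def __str__(self):
        # FIRST_ORDER -> "FirstOrder": capitalize each underscore-separated word.
        return "".join(word.capitalize() for word in self.name.split("_"))


class DiffScheme(CamelCaseEnum):
    """The differentiation schemes of pSeven."""

    FIRST_ORDER = 0
    SECOND_ORDER = 1
    ADAPTIVE = 2
    AUTO = 3


print(str(DiffScheme.FIRST_ORDER))  # → FirstOrder
print(str(DiffScheme.AUTO))  # → Auto
```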
class gemseo.algos.opt.lib_pseven.DiffType(value)[source]

Bases: gemseo.utils.base_enum.CamelCaseEnum

The differentiation types of pSeven.

Attributes:

AUTO

FRAMED

NUMERICAL

AUTO = 2
FRAMED = 1
NUMERICAL = 0
class gemseo.algos.opt.lib_pseven.GlobalMethod(value)[source]

Bases: enum.Enum

The globalization methods of pSeven.

An item is represented by the name of its key.

Attributes:

MS

PM

RL

MS = 2
PM = 1
RL = 0
class gemseo.algos.opt.lib_pseven.LogLevel(value)[source]

Bases: gemseo.utils.base_enum.CamelCaseEnum

The logging levels of pSeven.

Attributes:

DEBUG

ERROR

FATAL

INFO

WARN

DEBUG = 0
ERROR = 3
FATAL = 4
INFO = 1
WARN = 2
class gemseo.algos.opt.lib_pseven.PSevenOpt[source]

Bases: gemseo.algos.opt.opt_lib.OptimizationLibrary

Interface for the Generic Tool for Optimization (GTOpt) of pSeven Core.

Constructor.

Attributes:

COMPLEX_STEP_METHOD

DESCRIPTION

DIFFERENTIATION_METHODS

EQ_TOLERANCE

FINITE_DIFF_METHOD

F_TOL_ABS

F_TOL_REL

HANDLE_EQ_CONS

HANDLE_INEQ_CONS

INEQ_TOLERANCE

INTERNAL_NAME

LIB

LIB_COMPUTE_GRAD

LS_STEP_NB_MAX

LS_STEP_SIZE_MAX

MAX_DS_SIZE_PRINT

MAX_FUN_EVAL

MAX_ITER

MAX_TIME

NORMALIZE_DESIGN_SPACE_OPTION

OPTIONS_DIR

OPTIONS_MAP

PG_TOL

POSITIVE_CONSTRAINTS

PROBLEM_TYPE

REQUIRE_GRAD

ROUND_INTS_OPTION

STOP_CRIT_NX

USER_DEFINED_GRADIENT

USE_DATABASE_OPTION

VERBOSE

WEBSITE

X_TOL_ABS

X_TOL_REL

algorithms

Return the available algorithms.

Methods:

algorithm_handles_eqcstr(algo_name)

Returns True if the algorithm handles equality constraints.

algorithm_handles_ineqcstr(algo_name)

Returns True if the algorithm handles inequality constraints.

driver_has_option(option_key)

Checks if the option key exists.

ensure_bounds(orig_func[, normalize])

Project the design vector onto the design space before execution.

execute(problem[, algo_name])

Executes the driver.

filter_adapted_algorithms(problem)

Filters the algorithms capable of solving the problem.

finalize_iter_observer()

Finalize the iteration observer.

get_optimum_from_database([message, status])

Retrieves the optimum from the database and builds an optimization result object from it.

get_right_sign_constraints()

Transforms the problem constraints into their opposite sign counterpart if the algorithm requires positive constraints.

get_x0_and_bounds_vects(normalize_ds)

Gets x0 and the bounds, normalized or not depending on the algorithm options, all as numpy arrays.

init_iter_observer(max_iter, message)

Initialize the iteration observer.

init_options_grammar(algo_name)

Initializes the options grammar.

is_algo_requires_grad(algo_name)

Returns True if the algorithm requires a gradient evaluation.

is_algo_requires_positive_cstr(algo_name)

Returns True if the algorithm requires positive constraints, False otherwise.

is_algorithm_suited(algo_dict, problem)

Checks if the algorithm is suited to the problem according to its algo dict.

new_iteration_callback()

Callback called at each new iteration, i.e. every time a design vector that is not already in the database is proposed by the optimizer.

Return type

None

COMPLEX_STEP_METHOD = 'complex_step'
DESCRIPTION = 'description'
DIFFERENTIATION_METHODS = ['user', 'complex_step', 'finite_differences']
EQ_TOLERANCE = 'eq_tolerance'
FINITE_DIFF_METHOD = 'finite_differences'
F_TOL_ABS = 'ftol_abs'
F_TOL_REL = 'ftol_rel'
HANDLE_EQ_CONS = 'handle_equality_constraints'
HANDLE_INEQ_CONS = 'handle_inequality_constraints'
INEQ_TOLERANCE = 'ineq_tolerance'
INTERNAL_NAME = 'internal_algo_name'
LIB = 'lib'
LIB_COMPUTE_GRAD = True
LS_STEP_NB_MAX = 'max_ls_step_nb'
LS_STEP_SIZE_MAX = 'max_ls_step_size'
MAX_DS_SIZE_PRINT = 40
MAX_FUN_EVAL = 'max_fun_eval'
MAX_ITER = 'max_iter'
MAX_TIME = 'max_time'
NORMALIZE_DESIGN_SPACE_OPTION = 'normalize_design_space'
OPTIONS_DIR = 'options'
OPTIONS_MAP = {'constraints_smoothness': 'GTOpt/ConstraintsSmoothness', 'detect_nan_clusters': 'GTOpt/DetectNaNClusters', 'deterministic': 'GTOpt/Deterministic', 'diff_scheme': 'GTOpt/DiffScheme', 'diff_step': 'GTOpt/NumDiffStepSize', 'diff_type': 'GTOpt/DiffType', 'ensure_feasibility': 'GTOpt/EnsureFeasibility', 'global_phase_intensity': 'GTOpt/GlobalPhaseIntensity', 'grad_tol': 'GTOpt/GradientTolerance', 'grad_tol_is_abs': 'GTOpt/AbsoluteGradientTolerance', 'local_search': 'GTOpt/LocalSearch', 'log_level': 'GTOpt/LogLevel', 'max_batch_size': 'GTOpt/BatchSize', 'max_expensive_func_iter': 'GTOpt/MaximumExpensiveIterations', 'max_func_iter': 'GTOpt/MaximumIterations', 'max_threads': 'GTOpt/MaxParallel', 'objectives_smoothness': 'GTOpt/ObjectivesSmoothness', 'restore_analytic_func': 'GTOpt/RestoreAnalyticResponses', 'seed': 'GTOpt/Seed', 'time_limit': 'GTOpt/TimeLimit', 'verbose_log': 'GTOpt/VerboseOutput'}
PG_TOL = 'pg_tol'
POSITIVE_CONSTRAINTS = 'positive_constraints'
PROBLEM_TYPE = 'problem_type'
REQUIRE_GRAD = 'require_grad'
ROUND_INTS_OPTION = 'round_ints'
STOP_CRIT_NX = 'stop_crit_n_x'
USER_DEFINED_GRADIENT = 'user'
USE_DATABASE_OPTION = 'use_database'
VERBOSE = 'verbose'
WEBSITE = 'website'
X_TOL_ABS = 'xtol_abs'
X_TOL_REL = 'xtol_rel'
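OPTIONS_MAP translates gemseo-level option names into the GTOpt/... names that pSeven Core understands. The sketch below reproduces a subset of the mapping listed above and applies it to an options dict; the pass-through behavior for unmapped names is an assumption for illustration, and `to_gtopt_options` is a hypothetical helper, not a gemseo function.

```python
# Subset of PSevenOpt.OPTIONS_MAP, copied from the listing above.
OPTIONS_MAP = {
    "diff_scheme": "GTOpt/DiffScheme",
    "diff_type": "GTOpt/DiffType",
    "log_level": "GTOpt/LogLevel",
    "max_threads": "GTOpt/MaxParallel",
    "seed": "GTOpt/Seed",
}


def to_gtopt_options(options):
    """Translate gemseo option names to pSeven GTOpt option names.

    Options without an entry in the map are kept as-is (an assumption
    made for this illustration).
    """
    return {OPTIONS_MAP.get(name, name): value for name, value in options.items()}


gtopt = to_gtopt_options({"max_threads": 4, "seed": 12345})
print(gtopt)  # → {'GTOpt/MaxParallel': 4, 'GTOpt/Seed': 12345}
```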
algorithm_handles_eqcstr(algo_name)

Returns True if the algorithm handles equality constraints.

Parameters

algo_name – the name of the algorithm

Returns

True or False

algorithm_handles_ineqcstr(algo_name)

Returns True if the algorithm handles inequality constraints.

Parameters

algo_name – the name of the algorithm

Returns

True or False

property algorithms

Return the available algorithms.

driver_has_option(option_key)

Checks if the option key exists.

Parameters

option_key – the name of the option

Returns

True if the option is in the grammar

ensure_bounds(orig_func, normalize=True)

Project the design vector onto the design space before execution.

Parameters
  • orig_func – the original function

  • normalize – if True, use the normalized design space

Returns

the wrapped function
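A minimal sketch of the projection that ensure_bounds describes: the design vector is clipped onto the box bounds before the wrapped function is evaluated. The function name and signature here are hypothetical; the real method also accounts for the normalized design space.

```python
import numpy as np


def ensure_bounds_sketch(orig_func, lower, upper):
    """Wrap orig_func so its input is first projected onto [lower, upper].

    Hypothetical illustration of PSevenOpt.ensure_bounds; the real method
    additionally handles the normalize option and the design space object.
    """

    def wrapped(x):
        # Project the design vector onto the box bounds, then evaluate.
        return orig_func(np.clip(x, lower, upper))

    return wrapped


f = ensure_bounds_sketch(lambda x: float(np.sum(x)), np.zeros(2), np.ones(2))
print(f(np.array([-0.5, 2.0])))  # clipped to [0.0, 1.0] → 1.0
```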

execute(problem, algo_name=None, **options)

Executes the driver.

Parameters
  • problem – the problem to be solved

  • algo_name – the name of the algorithm; if None, use self.algo_name, which may have been set by the factory (Default value = None)

  • options – the options dict for the algorithm

filter_adapted_algorithms(problem)

Filters the algorithms capable of solving the problem.

Parameters

problem – the opt_problem to be solved

Returns

the list of adapted algorithms names

finalize_iter_observer()

Finalize the iteration observer.

get_optimum_from_database(message=None, status=None)

Retrieves the optimum from the database and builds an optimization result object from it.

Parameters
  • message – (Default value = None)

  • status – (Default value = None)

get_right_sign_constraints()

Transforms the problem constraints into their opposite sign counterpart if the algorithm requires positive constraints.
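To sketch the sign transformation: gemseo states inequality constraints in the form g(x) <= 0, so an algorithm that requires positive constraints needs the opposite-sign counterpart h(x) = -g(x) >= 0. The helper below is hypothetical and only illustrates that flip on a single constraint function.

```python
def to_positive_constraint(constraint):
    """Return the opposite-sign counterpart of an inequality constraint.

    Assumes gemseo's convention g(x) <= 0; an algorithm requiring
    positive constraints expects h(x) >= 0 with h = -g. This helper
    name is hypothetical, not a gemseo function.
    """

    def positive(x):
        return -constraint(x)

    return positive


g = lambda x: x - 3.0  # feasible when x <= 3, i.e. g(x) <= 0
h = to_positive_constraint(g)  # feasible when h(x) >= 0, same region
print(h(1.0))  # → 2.0
```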

get_x0_and_bounds_vects(normalize_ds)

Gets x0 and the bounds, normalized or not depending on the algorithm options, all as numpy arrays.

Parameters

normalize_ds – if True, normalizes all input variables that are not integers, according to the design space normalization policy

Returns

x, lower bounds, upper bounds
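The sketch below illustrates one plausible reading of the normalization: a min-max rescaling of x0 onto the unit box, with the normalized bounds becoming 0 and 1. It assumes a simple min-max policy on continuous variables only; gemseo's actual design space policy (and its handling of integer variables) may differ, and the function name is hypothetical.

```python
import numpy as np


def normalize_x0_and_bounds(x0, lower, upper):
    """Min-max normalization sketch behind get_x0_and_bounds_vects.

    Assumption: every variable is continuous and finite-bounded; the
    real method leaves integer variables unnormalized.
    """
    x_norm = (x0 - lower) / (upper - lower)
    return x_norm, np.zeros_like(x0), np.ones_like(x0)


x, lb, ub = normalize_x0_and_bounds(
    np.array([5.0]), np.array([0.0]), np.array([10.0])
)
print(x)  # → [0.5]
```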

init_iter_observer(max_iter, message)

Initialize the iteration observer.

It will handle the stopping criterion and the logging of the progress bar.

Parameters
  • max_iter – maximum number of calls

  • message – message to display at the beginning

init_options_grammar(algo_name)

Initializes the options grammar.

Parameters

algo_name – name of the algorithm

is_algo_requires_grad(algo_name)

Returns True if the algorithm requires a gradient evaluation.

Parameters

algo_name – name of the algorithm

is_algo_requires_positive_cstr(algo_name)

Returns True if the algorithm requires positive constraints, False otherwise.

Parameters

algo_name – the name of the algorithm

Returns

True if constraints must be positive

Return type

bool

static is_algorithm_suited(algo_dict, problem)

Checks if the algorithm is suited to the problem according to its algo dict.

Parameters
  • algo_dict – the algorithm characteristics

  • problem – the opt_problem to be solved

new_iteration_callback()

Callback called at each new iteration, i.e. every time a design vector that is not already in the database is proposed by the optimizer.

Iterates the progress bar and implements the stopping criteria.
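The duplicate check described above can be sketched with a toy observer: a proposed design vector counts as a new iteration only if it is not already in the database. Here the "database" is a plain set of tuples, which is an assumption for illustration; gemseo's Database is keyed on a hashable wrapper around the numpy array.

```python
class IterationObserver:
    """Toy sketch of the new-iteration bookkeeping (not gemseo's class)."""

    def __init__(self):
        self.database = set()
        self.iterations = 0

    def propose(self, x):
        """Register a design vector; count it only if unseen."""
        key = tuple(x)
        if key in self.database:
            return False  # already evaluated: not a new iteration
        self.database.add(key)
        self.iterations += 1  # e.g. advance the progress bar here
        return True


obs = IterationObserver()
obs.propose([1.0, 2.0])
obs.propose([1.0, 2.0])  # duplicate: ignored
obs.propose([3.0, 4.0])
print(obs.iterations)  # → 2
```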

class gemseo.algos.opt.lib_pseven.Smoothness(value)[source]

Bases: gemseo.utils.base_enum.CamelCaseEnum

The function smoothness levels of pSeven.

Attributes:

AUTO

NOISY

SMOOTH

AUTO = 2
NOISY = 1
SMOOTH = 0