lib_pseven module

Wrapper for the Generic Tool for Optimization (GTOpt) of pSeven Core.
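
A minimal usage sketch, assuming pSeven Core is installed and that the wrapper is registered under the algorithm name "PSEVEN" (the Rosenbrock benchmark problem is used only for illustration):

    from gemseo.algos.opt.opt_factory import OptimizersFactory
    from gemseo.problems.analytical.rosenbrock import Rosenbrock

    # Build a benchmark optimization problem and solve it with the pSeven wrapper.
    problem = Rosenbrock()
    result = OptimizersFactory().execute(problem, "PSEVEN", max_iter=100)
    print(result.x_opt, result.f_opt)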

Classes:

DiffScheme(value)

The differentiation schemes of pSeven.

DiffType(value)

The differentiation types of pSeven.

GlobalMethod(value)

The globalization methods of pSeven.

LogLevel(value)

The logging levels of pSeven.

PSevenOpt()

Interface for the Generic Tool for Optimization (GTOpt) of pSeven Core.

Smoothness(value)

The function smoothness levels of pSeven.

class gemseo.algos.opt.lib_pseven.DiffScheme(value)[source]

Bases: gemseo.utils.base_enum.CamelCaseEnum

The differentiation schemes of pSeven.

Attributes:

ADAPTIVE

AUTO

FIRST_ORDER

SECOND_ORDER

ADAPTIVE = 2
AUTO = 3
FIRST_ORDER = 0
SECOND_ORDER = 1
class gemseo.algos.opt.lib_pseven.DiffType(value)[source]

Bases: gemseo.utils.base_enum.CamelCaseEnum

The differentiation types of pSeven.

Attributes:

AUTO

FRAMED

NUMERICAL

AUTO = 2
FRAMED = 1
NUMERICAL = 0
class gemseo.algos.opt.lib_pseven.GlobalMethod(value)[source]

Bases: enum.Enum

The globalization methods of pSeven.

An item is represented by the name of its key.

Attributes:

MS

PM

RL

MS = 2
PM = 1
RL = 0
class gemseo.algos.opt.lib_pseven.LogLevel(value)[source]

Bases: gemseo.utils.base_enum.CamelCaseEnum

The logging levels of pSeven.

Attributes:

DEBUG

ERROR

FATAL

INFO

WARN

DEBUG = 0
ERROR = 3
FATAL = 4
INFO = 1
WARN = 2
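
The enumerations above derive from CamelCaseEnum, which suggests that their UPPER_SNAKE_CASE member names are exposed to pSeven under a CamelCase spelling (the string form used by GTOpt option values). A purely illustrative sketch of such a conversion, not the actual GEMSEO implementation:

    from enum import Enum

    class DiffScheme(Enum):
        FIRST_ORDER = 0
        SECOND_ORDER = 1
        ADAPTIVE = 2
        AUTO = 3

    def to_camel_case(member: Enum) -> str:
        """Convert an UPPER_SNAKE_CASE member name to CamelCase."""
        return "".join(word.capitalize() for word in member.name.split("_"))

    assert to_camel_case(DiffScheme.FIRST_ORDER) == "FirstOrder"
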
class gemseo.algos.opt.lib_pseven.PSevenOpt[source]

Bases: gemseo.algos.opt.opt_lib.OptimizationLibrary

Interface for the Generic Tool for Optimization (GTOpt) of pSeven Core.

Constructor.

Attributes:

COMPLEX_STEP_METHOD

DESCRIPTION

DIFFERENTIATION_METHODS

EQ_TOLERANCE

FINITE_DIFF_METHOD

F_TOL_ABS

F_TOL_REL

HANDLE_EQ_CONS

HANDLE_INEQ_CONS

INEQ_TOLERANCE

INTERNAL_NAME

LIB

LIB_COMPUTE_GRAD

LS_STEP_NB_MAX

LS_STEP_SIZE_MAX

MAX_DS_SIZE_PRINT

MAX_FUN_EVAL

MAX_ITER

MAX_TIME

NORMALIZE_DESIGN_SPACE_OPTION

OPTIONS_DIR

OPTIONS_MAP

PG_TOL

POSITIVE_CONSTRAINTS

PROBLEM_TYPE

REQUIRE_GRAD

ROUND_INTS_OPTION

STOP_CRIT_NX

USER_DEFINED_GRADIENT

USE_DATABASE_OPTION

VERBOSE

WEBSITE

X_TOL_ABS

X_TOL_REL

algorithms

The available algorithms.

Methods:

algorithm_handles_eqcstr(algo_name)

Returns True if the algorithm handles equality constraints.

algorithm_handles_ineqcstr(algo_name)

Returns True if the algorithm handles inequality constraints.

deactivate_progress_bar()

Deactivate the progress bar.

driver_has_option(option_key)

Check if the option key exists.

ensure_bounds(orig_func[, normalize])

Project the design vector onto the design space before execution.

execute(problem[, algo_name])

Executes the driver.

filter_adapted_algorithms(problem)

Filter the algorithms capable of solving the problem.

finalize_iter_observer()

Finalize the iteration observer.

get_optimum_from_database([message, status])

Retrieves the optimum from the database and builds an optimization result object from it.

get_right_sign_constraints()

Transforms the problem constraints into their opposite sign counterpart if the algorithm requires positive constraints.

get_x0_and_bounds_vects(normalize_ds)

Gets x0 and the bounds, normalized or not depending on the algorithm options, all as NumPy arrays.

init_iter_observer(max_iter, message)

Initialize the iteration observer.

init_options_grammar(algo_name)

Initialize the options grammar.

is_algo_requires_grad(algo_name)

Returns True if the algorithm requires a gradient evaluation.

is_algo_requires_positive_cstr(algo_name)

Returns True if the algorithm requires positive constraints, False otherwise.

is_algorithm_suited(algo_dict, problem)

Checks if the algorithm is suited to the problem according to its algo dict.

new_iteration_callback([x_vect])

Raise FtolReached or XtolReached if the defined relative or absolute function or x tolerance is reached.

COMPLEX_STEP_METHOD = 'complex_step'
DESCRIPTION = 'description'
DIFFERENTIATION_METHODS = ['user', 'complex_step', 'finite_differences']
EQ_TOLERANCE = 'eq_tolerance'
FINITE_DIFF_METHOD = 'finite_differences'
F_TOL_ABS = 'ftol_abs'
F_TOL_REL = 'ftol_rel'
HANDLE_EQ_CONS = 'handle_equality_constraints'
HANDLE_INEQ_CONS = 'handle_inequality_constraints'
INEQ_TOLERANCE = 'ineq_tolerance'
INTERNAL_NAME = 'internal_algo_name'
LIB = 'lib'
LIB_COMPUTE_GRAD = True
LS_STEP_NB_MAX = 'max_ls_step_nb'
LS_STEP_SIZE_MAX = 'max_ls_step_size'
MAX_DS_SIZE_PRINT = 40
MAX_FUN_EVAL = 'max_fun_eval'
MAX_ITER = 'max_iter'
MAX_TIME = 'max_time'
NORMALIZE_DESIGN_SPACE_OPTION = 'normalize_design_space'
OPTIONS_DIR = 'options'
OPTIONS_MAP = {'constraints_smoothness': 'GTOpt/ConstraintsSmoothness', 'detect_nan_clusters': 'GTOpt/DetectNaNClusters', 'deterministic': 'GTOpt/Deterministic', 'diff_scheme': 'GTOpt/DiffScheme', 'diff_step': 'GTOpt/NumDiffStepSize', 'diff_type': 'GTOpt/DiffType', 'ensure_feasibility': 'GTOpt/EnsureFeasibility', 'global_phase_intensity': 'GTOpt/GlobalPhaseIntensity', 'grad_tol': 'GTOpt/GradientTolerance', 'grad_tol_is_abs': 'GTOpt/AbsoluteGradientTolerance', 'local_search': 'GTOpt/LocalSearch', 'log_level': 'GTOpt/LogLevel', 'max_batch_size': 'GTOpt/BatchSize', 'max_expensive_func_iter': 'GTOpt/MaximumExpensiveIterations', 'max_func_iter': 'GTOpt/MaximumIterations', 'max_threads': 'GTOpt/MaxParallel', 'objectives_smoothness': 'GTOpt/ObjectivesSmoothness', 'restore_analytic_func': 'GTOpt/RestoreAnalyticResponses', 'seed': 'GTOpt/Seed', 'time_limit': 'GTOpt/TimeLimit', 'verbose_log': 'GTOpt/VerboseOutput'}
PG_TOL = 'pg_tol'
POSITIVE_CONSTRAINTS = 'positive_constraints'
PROBLEM_TYPE = 'problem_type'
REQUIRE_GRAD = 'require_grad'
ROUND_INTS_OPTION = 'round_ints'
STOP_CRIT_NX = 'stop_crit_n_x'
USER_DEFINED_GRADIENT = 'user'
USE_DATABASE_OPTION = 'use_database'
VERBOSE = 'verbose'
WEBSITE = 'website'
X_TOL_ABS = 'xtol_abs'
X_TOL_REL = 'xtol_rel'
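
The OPTIONS_MAP attribute translates the GEMSEO-level option names into the native GTOpt option names. A short check, assuming pSeven Core is available so that the module can be imported (the values are taken directly from the mapping above):

    from gemseo.algos.opt.lib_pseven import PSevenOpt

    # GEMSEO option names on the left, native GTOpt names on the right.
    assert PSevenOpt.OPTIONS_MAP["max_threads"] == "GTOpt/MaxParallel"
    assert PSevenOpt.OPTIONS_MAP["diff_scheme"] == "GTOpt/DiffScheme"
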
algorithm_handles_eqcstr(algo_name)

Returns True if the algorithm handles equality constraints.

Parameters

algo_name – the name of the algorithm

Returns

True or False

algorithm_handles_ineqcstr(algo_name)

Returns True if the algorithm handles inequality constraints.

Parameters

algo_name – the name of the algorithm

Returns

True or False

property algorithms

The available algorithms.

deactivate_progress_bar()

Deactivate the progress bar.

Return type

None

driver_has_option(option_key)

Check if the option key exists.

Parameters

option_key (str) – The name of the option.

Returns

Whether the option is in the grammar.

Return type

bool

ensure_bounds(orig_func, normalize=True)

Project the design vector onto the design space before execution.

Parameters
  • orig_func – the original function

  • normalize

    if True, use the normalized design space

    By default it is set to True.

Returns

the wrapped function
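
Conceptually, the projection clips each component of the design vector onto its bounds before the wrapped function is called. A standalone sketch of that idea, not the GEMSEO implementation (which also handles normalization):

    import numpy as np

    def clip_to_bounds(x, lower, upper):
        """Project a design vector onto the box [lower, upper] component-wise."""
        return np.clip(x, lower, upper)

    # Components outside the bounds are moved onto the nearest bound.
    print(clip_to_bounds(np.array([1.5, -2.0]), np.array([0.0, -1.0]), np.array([1.0, 1.0])))
    # [ 1. -1.]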

execute(problem, algo_name=None, **options)

Executes the driver.

Parameters
  • problem – the problem to be solved

  • algo_name

The name of the algorithm; if None, use self.algo_name, which may have been set by the factory.

    By default it is set to None.

  • options – the options dict for the algorithm
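
A hedged sketch of calling execute() on a driver created by the factory; the algorithm name "PSEVEN" and the diff_scheme value are assumptions, and options whose names appear in OPTIONS_MAP are forwarded to GTOpt under their native names:

    from gemseo.algos.opt.opt_factory import OptimizersFactory
    from gemseo.problems.analytical.rosenbrock import Rosenbrock

    problem = Rosenbrock()
    # Create the pSeven driver library; execute() then uses the algo_name set by the factory.
    lib = OptimizersFactory().create("PSEVEN")
    result = lib.execute(problem, max_iter=50, diff_scheme="FirstOrder")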

filter_adapted_algorithms(problem)

Filter the algorithms capable of solving the problem.

Parameters

problem (Any) – The opt_problem to be solved.

Returns

The names of the adapted algorithms.

Return type

list

finalize_iter_observer()

Finalize the iteration observer.

Return type

None

get_optimum_from_database(message=None, status=None)

Retrieves the optimum from the database and builds an optimization result object from it.

Parameters
  • message

    By default it is set to None.

  • status

    By default it is set to None.

get_right_sign_constraints()

Transforms the problem constraints into their opposite sign counterpart if the algorithm requires positive constraints.

get_x0_and_bounds_vects(normalize_ds)

Gets x0 and the bounds, normalized or not depending on the algorithm options, all as NumPy arrays.

Parameters

normalize_ds – if True, normalize all the input variables that are not integers, according to the design space normalization policy

Returns

x, lower bounds, upper bounds

init_iter_observer(max_iter, message)

Initialize the iteration observer.

It will handle the stopping criterion and the logging of the progress bar.

Parameters
  • max_iter (int) – The maximum number of iterations.

  • message (str) – The message to display at the beginning.

Raises

ValueError – If max_iter is lower than one.

Return type

None

init_options_grammar(algo_name)

Initialize the options grammar.

Parameters

algo_name (str) – The name of the algorithm.

Return type

gemseo.core.grammars.json_grammar.JSONGrammar

is_algo_requires_grad(algo_name)

Returns True if the algorithm requires a gradient evaluation.

Parameters

algo_name – name of the algorithm

is_algo_requires_positive_cstr(algo_name)

Returns True if the algorithm requires positive constraints, False otherwise.

Parameters

algo_name – the name of the algorithm

Returns

True if constraints must be positive

Return type

bool

static is_algorithm_suited(algo_dict, problem)

Checks if the algorithm is suited to the problem according to its algo dict.

Parameters
  • algo_dict – the algorithm characteristics

  • problem – the opt_problem to be solved

new_iteration_callback(x_vect=None)
Raises
  • FtolReached – If the defined relative or absolute function tolerance is reached.

  • XtolReached – If the defined relative or absolute x tolerance is reached.

Parameters

x_vect (Optional[numpy.ndarray]) –

By default it is set to None.

Return type

None

class gemseo.algos.opt.lib_pseven.Smoothness(value)[source]

Bases: gemseo.utils.base_enum.CamelCaseEnum

The function smoothness levels of pSeven.

Attributes:

AUTO

NOISY

SMOOTH

AUTO = 2
NOISY = 1
SMOOTH = 0
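
Per OPTIONS_MAP, these smoothness levels are exposed through the objectives_smoothness and constraints_smoothness options, forwarded to GTOpt/ObjectivesSmoothness and GTOpt/ConstraintsSmoothness. A hedged example; the algorithm name "PSEVEN" and the string value "Noisy" are assumptions based on the enum members:

    from gemseo.algos.opt.opt_factory import OptimizersFactory
    from gemseo.problems.analytical.rosenbrock import Rosenbrock

    # Declare the objective as noisy (forwarded to GTOpt/ObjectivesSmoothness).
    result = OptimizersFactory().execute(
        Rosenbrock(), "PSEVEN", max_iter=100, objectives_smoothness="Noisy"
    )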