
lib_snopt module

SNOPT optimization library wrapper.

class gemseo.algos.opt.lib_snopt.SNOPTAlgorithmDescription(algorithm_name, internal_algorithm_name, library_name='SNOPT', description='', website='', handle_integer_variables=False, require_gradient=False, handle_equality_constraints=False, handle_inequality_constraints=False, handle_multiobjective=False, positive_constraints=False, problem_type='non-linear')[source]

Bases: gemseo.algos.opt.opt_lib.OptimizationAlgorithmDescription

The description of an optimization algorithm from the SNOPT library.

Parameters
  • algorithm_name (str) –

  • internal_algorithm_name (str) –

  • library_name (str) –

    By default it is set to SNOPT.

  • description (str) –

    By default it is set to "" (the empty string).

  • website (str) –

    By default it is set to "" (the empty string).

  • handle_integer_variables (bool) –

    By default it is set to False.

  • require_gradient (bool) –

    By default it is set to False.

  • handle_equality_constraints (bool) –

    By default it is set to False.

  • handle_inequality_constraints (bool) –

    By default it is set to False.

  • handle_multiobjective (bool) –

    By default it is set to False.

  • positive_constraints (bool) –

    By default it is set to False.

  • problem_type (str) –

    By default it is set to non-linear.

Return type

None

algorithm_name: str

The name of the algorithm in GEMSEO.

description: str = ''

A description of the algorithm.

handle_equality_constraints: bool = False

Whether the optimization algorithm handles equality constraints.

handle_inequality_constraints: bool = False

Whether the optimization algorithm handles inequality constraints.

handle_integer_variables: bool = False

Whether the optimization algorithm handles integer variables.

handle_multiobjective: bool = False

Whether the optimization algorithm handles multiple objectives.

internal_algorithm_name: str

The name of the algorithm in the wrapped library.

library_name: str = 'SNOPT'

The name of the wrapped library.

positive_constraints: bool = False

Whether the optimization algorithm requires positive constraints.

problem_type: str = 'non-linear'

The type of problem (see OptimizationProblem.AVAILABLE_PB_TYPES).

require_gradient: bool = False

Whether the optimization algorithm requires the gradient.

website: str = ''

The website of the wrapped library or algorithm.
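For illustration, a description entry could be built as follows, using the signature documented above; the algorithm names are hypothetical placeholders, not the entries GEMSEO actually registers for SNOPT:

    from gemseo.algos.opt.lib_snopt import SNOPTAlgorithmDescription

    # Hypothetical entry: the names below are placeholders,
    # not the keys registered by GEMSEO for its SNOPT wrapper.
    description = SNOPTAlgorithmDescription(
        algorithm_name="SNOPT_EXAMPLE",
        internal_algorithm_name="snoptb",
        description="Illustrative SNOPT entry.",
        require_gradient=True,
        handle_equality_constraints=True,
        handle_inequality_constraints=True,
    )
    print(description.library_name)  # "SNOPT" (class default)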

class gemseo.algos.opt.lib_snopt.SnOpt[source]

Bases: gemseo.algos.opt.opt_lib.OptimizationLibrary

SNOPT optimization library interface.

See OptimizationLibrary.

Constructor.

Generate the library dictionary, which lists the algorithms together with their characteristics (see the sketch after this list):

  • whether the algorithm requires the gradient,

  • whether it handles equality constraints,

  • whether it handles inequality constraints.
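A minimal sketch of querying these characteristics, assuming the SNOPT Python bindings are installed so that the library can be instantiated; the algorithm name "SNOPTB" is an assumption about the registered key:

    from gemseo.algos.opt.lib_snopt import SnOpt

    lib = SnOpt()  # requires the SNOPT bindings to be importable
    print(lib.algorithms)  # the names of the available SNOPT algorithms

    # "SNOPTB" is assumed here to be one of the registered names.
    print(lib.algorithm_handles_eqcstr("SNOPTB"))
    print(lib.algorithm_handles_ineqcstr("SNOPTB"))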

algorithm_handles_eqcstr(algo_name)

Check if an algorithm handles equality constraints.

Parameters

algo_name (str) – The name of the algorithm.

Returns

Whether the algorithm handles equality constraints.

Return type

bool

algorithm_handles_ineqcstr(algo_name)

Check if an algorithm handles inequality constraints.

Parameters

algo_name (str) – The name of the algorithm.

Returns

Whether the algorithm handles inequality constraints.

Return type

bool

cb_opt_constraints_snoptb(mode, nn_con, nn_jac, ne_jac, xn_vect, n_state)[source]

Evaluate the constraint functions and their gradient.

Use the snOpt conventions (from web.stanford.edu/group/SOL/guides/sndoc7.pdf).

Parameters
  • mode (int) – A flag indicating whether the function values, the gradient or both must be assigned during the present call (0 ≤ mode ≤ 2): mode = 2, assign the function values and the known components of the gradient; mode = 1, assign only the known components of the gradient (the function values are ignored); mode = 0, assign only the function values (the gradient is ignored).

  • nn_con (int) – The number of non-linear constraints.

  • nn_jac (int) – The number of design variables involved in the non-linear constraint functions.

  • ne_jac (int) – The number of non-zero elements in the constraint gradient. If the constraint Jacobian dcstr is a dense 2D array, then ne_jac = nn_con * nn_jac.

  • xn_vect (numpy.ndarray) – The normalized design vector.

  • n_state (int) – An indicator for the first and last calls to the current function: n_state = 0, NTR; n_state = 1, first call to driver.cb_opt_objective_snoptb; n_state > 1, snOptB is calling the subroutine for the last time, with n_state = 2 if the current x is optimal, n_state = 3 if the problem appears to be infeasible, n_state = 4 if the problem appears to be unbounded, and n_state = 5 if an iteration limit was reached.

Returns

The solution status, the evaluation of the constraint function and its gradient.

Return type

tuple[int, numpy.ndarray, numpy.ndarray]
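For illustration, a hypothetical callback obeying this convention for two non-linear constraints of three design variables; a dense Jacobian carries ne_jac = nn_con * nn_jac entries. This is a sketch, not the GEMSEO implementation:

    import numpy as np

    def cb_constraints_sketch(mode, nn_con, nn_jac, ne_jac, xn_vect, n_state):
        # Hypothetical constraints: c1 = x0**2 + x1 - 1 and c2 = x0 - x2
        # (nn_con = 2, nn_jac = 3).
        status = 0  # a non-negative status means the evaluation succeeded
        cstr = np.array(
            [xn_vect[0] ** 2 + xn_vect[1] - 1.0, xn_vect[0] - xn_vect[2]]
        )
        # Dense Jacobian with ne_jac = nn_con * nn_jac = 6 entries.
        jac = np.array([[2.0 * xn_vect[0], 1.0, 0.0], [1.0, 0.0, -1.0]])
        return status, cstr, jac

A real callback may skip the values or the Jacobian depending on mode, as described above.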

cb_opt_objective_snoptb(mode, nn_obj, xn_vect, n_state=0)[source]

Evaluate the objective function and gradient.

Use the snOpt conventions for mode and status (from web.stanford.edu/group/SOL/guides/sndoc7.pdf).

Parameters
  • mode (int) – A flag indicating whether the objective, the gradient or both must be assigned during the present call (0 ≤ mode ≤ 2): mode = 2, assign the objective and the known components of the gradient; mode = 1, assign only the known components of the gradient (the objective is ignored); mode = 0, assign only the objective (the gradient is ignored).

  • nn_obj (int) – The number of design variables.

  • xn_vect (numpy.ndarray) – The normalized design vector.

  • n_state (int) –

    An indicator for the first and last calls to the current function: n_state = 0, NTR; n_state = 1, first call to driver.cb_opt_objective_snoptb; n_state > 1, snOptB is calling the subroutine for the last time, with n_state = 2 if the current x is optimal, n_state = 3 if the problem appears to be infeasible, n_state = 4 if the problem appears to be unbounded, and n_state = 5 if an iteration limit was reached.

    By default it is set to 0.

Returns

The solution status, the evaluation of the objective function and its gradient.

Return type

tuple[int, numpy.ndarray, numpy.ndarray]
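A hypothetical sketch of an objective callback honoring the mode convention described above, for a simple quadratic objective (not the GEMSEO implementation):

    import numpy as np

    def cb_objective_sketch(mode, nn_obj, xn_vect, n_state=0):
        status = 0  # a non-negative status means the evaluation succeeded
        obj = np.zeros(1)
        grad = np.zeros(nn_obj)
        if mode in (0, 2):  # the objective value is requested
            obj[0] = float(np.sum(xn_vect ** 2))
        if mode in (1, 2):  # the gradient is requested
            grad = 2.0 * xn_vect
        return status, obj, grad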

static cb_snopt_dummy_func(mode, nn_con, nn_jac, ne_jac, xn_vect, n_state)[source]

Return a dummy output for unconstrained problems.

Parameters
  • mode (int) – A flag indicating whether the function values, the gradient or both must be assigned during the present call (0 ≤ mode ≤ 2): mode = 2, assign the function values and the known components of the gradient; mode = 1, assign only the known components of the gradient (the function values are ignored); mode = 0, assign only the function values (the gradient is ignored).

  • nn_con (int) – The number of non-linear constraints.

  • nn_jac (int) – The number of design variables involved in the non-linear constraint functions.

  • ne_jac (int) – The number of non-zero elements in the constraint gradient. If the constraint Jacobian dcstr is a dense 2D array, then ne_jac = nn_con * nn_jac.

  • xn_vect (numpy.ndarray) – The normalized design vector.

  • n_state (int) – An indicator for the first and last calls to the current function: n_state = 0, NTR; n_state = 1, first call to driver.cb_opt_objective_snoptb; n_state > 1, snOptB is calling the subroutine for the last time, with n_state = 2 if the current x is optimal, n_state = 3 if the problem appears to be infeasible, n_state = 4 if the problem appears to be unbounded, and n_state = 5 if an iteration limit was reached.

Returns

A dummy output.

Return type

float

deactivate_progress_bar()

Deactivate the progress bar.

Return type

None

driver_has_option(option_name)

Check the existence of an option.

Parameters

option_name (str) – The name of the option.

Returns

Whether the option exists.

Return type

bool

ensure_bounds(orig_func, normalize=True)

Project the design vector onto the design space before execution.

Parameters
  • orig_func – The original function.

  • normalize

    Whether to use the normalized design space.

    By default it is set to True.

Returns

A function calling the original function with the input data projected onto the design space.
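Conceptually, the returned wrapper clips the input onto the design-space bounds before calling the original function; a minimal NumPy sketch with hypothetical names, not the GEMSEO internals:

    import numpy as np

    def ensure_bounds_sketch(orig_func, lower, upper):
        # Return a wrapper projecting x onto [lower, upper] before the call.
        def wrapped(x):
            return orig_func(np.clip(x, lower, upper))
        return wrapped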

execute(problem, algo_name=None, eval_obs_jac=False, skip_int_check=False, **options)

Execute the driver.

Parameters
  • problem (OptimizationProblem) – The problem to be solved.

  • algo_name (str | None) –

    The name of the algorithm. If None, use the algo_name attribute which may have been set by the factory.

    By default it is set to None.

  • eval_obs_jac (bool) –

    Whether to evaluate the Jacobian of the observables.

    By default it is set to False.

  • skip_int_check (bool) –

    Whether to skip the integer variable handling check of the selected algorithm.

    By default it is set to False.

  • **options (DriverLibOptionType) – The options for the algorithm.

Returns

The optimization result.

Raises

ValueError – If algo_name was not either set by the factory or given as an argument.

Return type

OptimizationResult
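A sketch of driving the SNOPT wrapper on GEMSEO's Rosenbrock benchmark problem, assuming the SNOPT bindings are installed and that "SNOPTB" is the registered algorithm name:

    from gemseo.algos.opt.opt_factory import OptimizersFactory
    from gemseo.problems.analytical.rosenbrock import Rosenbrock

    problem = Rosenbrock()
    # "SNOPTB" is an assumption about the registered algorithm name.
    result = OptimizersFactory().execute(problem, "SNOPTB", max_iter=100)
    print(result.x_opt, result.f_opt)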

filter_adapted_algorithms(problem)

Filter the algorithms capable of solving the problem.

Parameters

problem (Any) – The problem to be solved.

Returns

The names of the algorithms adapted to this problem.

Return type

list[str]

finalize_iter_observer()

Finalize the iteration observer.

Return type

None

get_optimum_from_database(message=None, status=None)

Retrieve the optimum from the database and build an optimization result object from it.

get_right_sign_constraints()

Transform the problem constraints into their opposite-sign counterparts if the algorithm requires positive constraints.

get_x0_and_bounds_vects(normalize_ds)

Get x0 and the bounds, normalized or not depending on the algorithm options, all as NumPy arrays.

Parameters

normalize_ds – Whether to normalize the input variables that are not integers, according to the normalization policy of the design space.

Returns

The current value, the lower bounds and the upper bounds.

init_iter_observer(max_iter, message)

Initialize the iteration observer.

It will handle the stopping criterion and the logging of the progress bar.

Parameters
  • max_iter (int) – The maximum number of iterations.

  • message (str) – The message to display at the beginning.

Raises

ValueError – If max_iter is less than one.

Return type

None

init_options_grammar(algo_name)

Initialize the options grammar.

Parameters

algo_name (str) – The name of the algorithm.

Return type

gemseo.core.grammars.json_grammar.JSONGrammar

is_algo_requires_grad(algo_name)

Check whether the algorithm requires a gradient evaluation.

Parameters

algo_name – The name of the algorithm.

is_algo_requires_positive_cstr(algo_name)

Check if an algorithm requires positive constraints.

Parameters

algo_name (str) – The name of the algorithm.

Returns

Whether the algorithm requires positive constraints.

Return type

bool

static is_algorithm_suited(algorithm_description, problem)

Check if the algorithm is suited to the problem according to its description.

Parameters
  • algorithm_description (AlgorithmDescription) – The description of the algorithm.

  • problem (Any) – The problem to be solved.

Returns

Whether the algorithm is suited to the problem.

Return type

bool

new_iteration_callback(x_vect=None)

Parameters

x_vect (ndarray | None) –

By default it is set to None.

Raises
  • FtolReached – If the defined relative or absolute function tolerance is reached.

  • XtolReached – If the defined relative or absolute x tolerance is reached.

Return type

None

COMPLEX_STEP_METHOD = 'complex_step'
DIFFERENTIATION_METHODS = ['user', 'complex_step', 'finite_differences']
EQ_TOLERANCE = 'eq_tolerance'
EVAL_OBS_JAC_OPTION = 'eval_obs_jac'
FINITE_DIFF_METHOD = 'finite_differences'
F_TOL_ABS = 'ftol_abs'
F_TOL_REL = 'ftol_rel'
INEQ_TOLERANCE = 'ineq_tolerance'
LIBRARY_NAME: ClassVar[str | None] = 'SNOPT'

The name of the interfaced library.

LIB_COMPUTE_GRAD = False
LS_STEP_NB_MAX = 'max_ls_step_nb'
LS_STEP_SIZE_MAX = 'max_ls_step_size'
MAX_DS_SIZE_PRINT = 40
MAX_FUN_EVAL = 'max_fun_eval'
MAX_ITER = 'max_iter'
MAX_TIME = 'max_time'
MESSAGES_DICT = {
    1: 'optimality conditions satisfied',
    2: 'feasible point found',
    3: 'requested accuracy could not be achieved',
    11: 'infeasible linear constraints',
    12: 'infeasible linear equalities',
    13: 'nonlinear infeasibilities minimized',
    14: 'infeasibilities minimized',
    21: 'unbounded objective',
    22: 'constraint violation limit reached',
    31: 'iteration limit reached',
    32: 'major iteration limit reached',
    33: 'the superbasics limit is too small',
    41: 'current point cannot be improved',
    42: 'singular basis',
    43: 'cannot satisfy the general constraints',
    44: 'ill-conditioned null-space basis',
    51: 'incorrect objective derivatives',
    52: 'incorrect constraint derivatives',
    61: 'undefined function at the first feasible point',
    62: 'undefined function at the initial point',
    63: 'unable to proceed into undefined region',
    72: 'terminated during constraint evaluation',
    73: 'terminated during objective evaluation',
    74: 'terminated from monitor routine',
    81: 'work arrays must have at least 500 elements',
    82: 'not enough character storage',
    83: 'not enough integer storage',
    84: 'not enough real storage',
    91: 'invalid input argument',
    92: 'basis file dimensions do not match this problem',
    141: 'wrong number of basic variables',
    142: 'error in basis package',
}
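The SNOPT exit codes mapped to human-readable messages; e.g., assuming the module can be imported (i.e. the SNOPT bindings are installed):

    from gemseo.algos.opt.lib_snopt import SnOpt

    print(SnOpt.MESSAGES_DICT[1])  # optimality conditions satisfied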
NORMALIZE_DESIGN_SPACE_OPTION = 'normalize_design_space'
OPTIONS_DIR: Final[str] = 'options'

The name of the directory containing the files of the grammars of the options.

OPTIONS_MAP: dict[str, str] = {'max_iter': 'Iteration_limit'}

The names of the options in GEMSEO mapping to those in the wrapped library.

PG_TOL = 'pg_tol'
ROUND_INTS_OPTION = 'round_ints'
STOP_CRIT_NX = 'stop_crit_n_x'
USER_DEFINED_GRADIENT = 'user'
USE_DATABASE_OPTION = 'use_database'
VERBOSE = 'verbose'
X_TOL_ABS = 'xtol_abs'
X_TOL_REL = 'xtol_rel'
activate_progress_bar: ClassVar[bool] = True

Whether to activate the progress bar in the optimization log.

algo_name: str | None

The name of the algorithm used currently.

property algorithms: list[str]

The available algorithms.

descriptions: dict[str, AlgorithmDescription]

The description of the algorithms contained in the library.

internal_algo_name: str | None

The internal name of the algorithm used currently.

It typically corresponds to the name of the algorithm in the wrapped library if any.

opt_grammar: JSONGrammar | None

The grammar defining the options of the current algorithm.

problem: Any | None

The problem to be solved.