lib_snopt module¶
SNOPT optimization library wrapper.
Classes:

SNOPT optimization library interface. 
 class gemseo.algos.opt.lib_snopt.SnOpt[source]¶
Bases:
gemseo.algos.opt.opt_lib.OptimizationLibrary
SNOPT optimization library interface.
See OptimizationLibrary.
Constructor.
Generate the library dict, which contains the list of algorithms and their characteristics:
whether the algorithm requires gradients
whether it handles equality constraints
whether it handles inequality constraints
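A minimal sketch of what one entry of that library dict might look like. The key names mirror the class constants documented below (INTERNAL_NAME, REQUIRE_GRAD, HANDLE_EQ_CONS, HANDLE_INEQ_CONS); the entry shown is illustrative and not read from the library itself:

```python
# Illustrative shape of one library-dict entry; keys mirror the class
# constants below. The values are assumptions for illustration only,
# reflecting that SNOPT is a gradient-based SQP solver handling both
# equality and inequality constraints.
snoptb_entry = {
    "internal_algo_name": "SNOPTB",
    "require_grad": True,
    "handle_equality_constraints": True,
    "handle_inequality_constraints": True,
}
```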
Attributes:
The available algorithms.
Methods:
algorithm_handles_eqcstr
(algo_name)Returns True if the algorithm handles equality constraints.
algorithm_handles_ineqcstr
(algo_name)Returns True if the algorithm handles inequality constraints.
cb_opt_constraints_snoptb
(mode, nn_con, ...)Evaluate the constraint functions and their gradient.
cb_opt_objective_snoptb
(mode, nn_obj, xn_vect)Evaluate the objective function and gradient.
cb_snopt_dummy_func
(mode, nn_con, nn_jac, ...)Return a dummy output for unconstrained problems.
deactivate_progress_bar
()Deactivate the progress bar.
driver_has_option
(option_key)Check if the option key exists.
ensure_bounds
(orig_func[, normalize])Project the design vector onto the design space before execution.
execute
(problem[, algo_name])Executes the driver.
filter_adapted_algorithms
(problem)Filter the algorithms capable of solving the problem.
finalize_iter_observer
()Finalize the iteration observer.
get_optimum_from_database
([message, status])Retrieves the optimum from the database and builds an optimization result object from it.
get_right_sign_constraints
()Transforms the problem constraints into their opposite sign counterpart if the algorithm requires positive constraints.
get_x0_and_bounds_vects
(normalize_ds)Gets x0 and the bounds, normalized or not depending on the algorithm options, all as NumPy arrays.
init_iter_observer
(max_iter, message)Initialize the iteration observer.
init_options_grammar
(algo_name)Initialize the options grammar.
is_algo_requires_grad
(algo_name)Returns True if the algorithm requires a gradient evaluation.
is_algo_requires_positive_cstr
(algo_name)Returns True if the algorithm requires positive constraints, False otherwise.
is_algorithm_suited
(algo_dict, problem)Checks if the algorithm is suited to the problem according to its algo dict.
new_iteration_callback
([x_vect])Raises FtolReached if the defined relative or absolute function tolerance is reached.
 COMPLEX_STEP_METHOD = 'complex_step'¶
 DESCRIPTION = 'description'¶
 DIFFERENTIATION_METHODS = ['user', 'complex_step', 'finite_differences']¶
 EQ_TOLERANCE = 'eq_tolerance'¶
 FINITE_DIFF_METHOD = 'finite_differences'¶
 F_TOL_ABS = 'ftol_abs'¶
 F_TOL_REL = 'ftol_rel'¶
 HANDLE_EQ_CONS = 'handle_equality_constraints'¶
 HANDLE_INEQ_CONS = 'handle_inequality_constraints'¶
 INEQ_TOLERANCE = 'ineq_tolerance'¶
 INTERNAL_NAME = 'internal_algo_name'¶
 LIB = 'lib'¶
 LIB_COMPUTE_GRAD = False¶
 LS_STEP_NB_MAX = 'max_ls_step_nb'¶
 LS_STEP_SIZE_MAX = 'max_ls_step_size'¶
 MAX_DS_SIZE_PRINT = 40¶
 MAX_FUN_EVAL = 'max_fun_eval'¶
 MAX_ITER = 'max_iter'¶
 MAX_TIME = 'max_time'¶
 MESSAGES_DICT = {1: 'optimality conditions satisfied', 2: 'feasible point found', 3: 'requested accuracy could not be achieved', 11: 'infeasible linear constraints', 12: 'infeasible linear equalities', 13: 'nonlinear infeasibilities minimized', 14: 'infeasibilities minimized', 21: 'unbounded objective', 22: 'constraint violation limit reached', 31: 'iteration limit reached', 32: 'major iteration limit reached', 33: 'the superbasics limit is too small', 41: 'current point cannot be improved ', 42: 'singular basis', 43: 'cannot satisfy the general constraints', 44: 'illconditioned nullspace basis', 51: 'incorrect objective derivatives', 52: 'incorrect constraint derivatives', 61: 'undefined function at the first feasible point', 62: 'undefined function at the initial point', 63: 'unable to proceed into undefined region', 72: 'terminated during constraint evaluation', 73: 'terminated during objective evaluation', 74: 'terminated from monitor routine', 81: 'work arrays must have at least 500 elements', 82: 'not enough character storage', 83: 'not enough integer storage', 84: 'not enough real storage', 91: 'invalid input argument', 92: 'basis file dimensions do not match this problem', 141: 'wrong number of basic variables', 142: 'error in basis package'}¶
 NORMALIZE_DESIGN_SPACE_OPTION = 'normalize_design_space'¶
 OPTIONS_DIR = 'options'¶
 OPTIONS_MAP = {'max_iter': 'Iteration_limit'}¶
 PG_TOL = 'pg_tol'¶
 POSITIVE_CONSTRAINTS = 'positive_constraints'¶
 PROBLEM_TYPE = 'problem_type'¶
 REQUIRE_GRAD = 'require_grad'¶
 ROUND_INTS_OPTION = 'round_ints'¶
 STOP_CRIT_NX = 'stop_crit_n_x'¶
 USER_DEFINED_GRADIENT = 'user'¶
 USE_DATABASE_OPTION = 'use_database'¶
 VERBOSE = 'verbose'¶
 WEBSITE = 'website'¶
 X_TOL_ABS = 'xtol_abs'¶
 X_TOL_REL = 'xtol_rel'¶
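The MESSAGES_DICT constant above maps SNOPT's integer exit codes to readable messages. A small lookup sketch using a subset of those codes (the helper name is illustrative, not part of the library):

```python
# Subset of the SNOPT exit codes listed in MESSAGES_DICT above.
SNOPT_MESSAGES = {
    1: "optimality conditions satisfied",
    2: "feasible point found",
    31: "iteration limit reached",
    91: "invalid input argument",
}

def describe_exit_code(code):
    """Return a readable description of a SNOPT exit code."""
    return SNOPT_MESSAGES.get(code, "unknown SNOPT exit code %d" % code)
```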
 algorithm_handles_eqcstr(algo_name)¶
Returns True if the algorithm handles equality constraints.
 Parameters
algo_name – the name of the algorithm
 Returns
True or False
 algorithm_handles_ineqcstr(algo_name)¶
Returns True if the algorithm handles inequality constraints.
 Parameters
algo_name – the name of the algorithm
 Returns
True or False
 property algorithms¶
The available algorithms.
 cb_opt_constraints_snoptb(mode, nn_con, nn_jac, ne_jac, xn_vect, n_state)[source]¶
Evaluate the constraint functions and their gradient.
Use the SNOPT conventions (see web.stanford.edu/group/SOL/guides/sndoc7.pdf).
 Parameters
mode (int) – A flag indicating whether the objective, the gradient or both must be assigned during the present call of the function (0 ≤ mode ≤ 2). mode = 2: assign obj and the known components of the gradient. mode = 1: assign the known components of the gradient; obj is ignored. mode = 0: only obj needs to be assigned; the gradient is ignored.
nn_con (int) – The number of nonlinear constraints.
nn_jac (int) – The number of design variables involved in the nonlinear constraint functions.
ne_jac (int) – The number of nonzero elements in the constraints gradient. If dcstr is 2D, then ne_jac = nn_con*nn_jac.
xn_vect (numpy.ndarray) – The normalized design vector.
n_state (int) – An indicator for the first and last calls to the current function. n_state = 0: NTR. n_state = 1: first call to driver.cb_opt_objective_snoptb. n_state > 1: snOptB is calling the subroutine for the last time, with n_state = 2 if the current x is optimal, n_state = 3 if the problem appears to be infeasible, n_state = 4 if the problem appears to be unbounded, and n_state = 5 if an iteration limit was reached.
 Returns
The solution status, the evaluation of the constraint function and its gradient.
 Return type
Tuple[int, numpy.ndarray, numpy.ndarray]
 cb_opt_objective_snoptb(mode, nn_obj, xn_vect, n_state=0)[source]¶
Evaluate the objective function and gradient.
Use the SNOPT conventions for mode and status (see web.stanford.edu/group/SOL/guides/sndoc7.pdf).
 Parameters
mode (int) – A flag indicating whether the objective, the gradient or both must be assigned during the present call of the function (0 ≤ mode ≤ 2). mode = 2: assign the obj and the known components of the gradient. mode = 1: assign the known components of the gradient; obj is ignored. mode = 0: only the obj needs to be assigned; the gradient is ignored.
nn_obj (int) – The number of design variables.
xn_vect (numpy.ndarray) – The normalized design vector.
n_state (int) –
An indicator for the first and last calls to the current function. n_state = 0: NTR. n_state = 1: first call to driver.cb_opt_objective_snoptb. n_state > 1: snOptB is calling the subroutine for the last time, with n_state = 2 if the current x is optimal, n_state = 3 if the problem appears to be infeasible, n_state = 4 if the problem appears to be unbounded, and n_state = 5 if an iteration limit was reached.
By default it is set to 0.
 Returns
The solution status, the evaluation of the objective function and its gradient.
 Return type
Tuple[int, numpy.ndarray, numpy.ndarray]
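To illustrate the mode convention described above, here is a sketch of a SNOPT-style objective callback for a simple sphere function. The function name and body are illustrative assumptions, not the library's implementation:

```python
import numpy as np

def sphere_objective_snoptb(mode, nn_obj, xn_vect, n_state=0):
    """SNOPT-style objective callback for f(x) = sum(x**2).

    mode = 0: assign the value only; mode = 1: assign the gradient only;
    mode = 2: assign both. Returns (status, obj, grad), where status = 0
    means the evaluation succeeded.
    """
    status = 0
    obj = np.empty(0)
    grad = np.empty(0)
    if mode in (0, 2):
        obj = np.array([float(np.sum(xn_vect ** 2))])
    if mode in (1, 2):
        grad = 2.0 * xn_vect
    return status, obj, grad
```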
 static cb_snopt_dummy_func(mode, nn_con, nn_jac, ne_jac, xn_vect, n_state)[source]¶
Return a dummy output for unconstrained problems.
 Parameters
mode (int) – A flag indicating whether the objective, the gradient or both must be assigned during the present call of the function (0 ≤ mode ≤ 2). mode = 2: assign obj and the known components of the gradient. mode = 1: assign the known components of the gradient; obj is ignored. mode = 0: only obj needs to be assigned; the gradient is ignored.
nn_con (int) – The number of nonlinear constraints.
nn_jac (int) – The number of design variables involved in the nonlinear constraint functions.
ne_jac (int) – The number of nonzero elements in the constraints gradient. If dcstr is 2D, then ne_jac = nn_con*nn_jac.
xn_vect (numpy.ndarray) – The normalized design vector.
n_state (int) – An indicator for the first and last calls to the current function. n_state = 0: NTR. n_state = 1: first call to driver.cb_opt_objective_snoptb. n_state > 1: snOptB is calling the subroutine for the last time, with n_state = 2 if the current x is optimal, n_state = 3 if the problem appears to be infeasible, n_state = 4 if the problem appears to be unbounded, and n_state = 5 if an iteration limit was reached.
 Returns
A dummy output.
 Return type
float
 deactivate_progress_bar()¶
Deactivate the progress bar.
 Return type
None
 driver_has_option(option_key)¶
Check if the option key exists.
 Parameters
option_key (str) – The name of the option.
 Returns
Whether the option is in the grammar.
 Return type
bool
 ensure_bounds(orig_func, normalize=True)¶
Project the design vector onto the design space before execution.
 Parameters
orig_func – The original function.
normalize –
If True, use the normalized design space.
By default it is set to True.
 Returns
the wrapped function
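The idea can be sketched as wrapping the function so its input is clipped to the bounds before evaluation; the helper name and clip-based projection are illustrative assumptions about how such a wrapper works, not the library's exact implementation:

```python
import numpy as np

def ensure_bounds_sketch(orig_func, lower, upper):
    """Wrap orig_func so the design vector is projected onto [lower, upper] first."""
    def wrapped(x_vect):
        # Project the design vector onto the box bounds before calling the function.
        return orig_func(np.clip(x_vect, lower, upper))
    return wrapped
```

For instance, with lower = 0 and upper = 1, an out-of-bounds point such as [2, -1] is evaluated at [1, 0].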
 execute(problem, algo_name=None, **options)¶
Executes the driver.
 Parameters
problem – The problem to be solved.
algo_name –
The name of the algorithm. If None, use self.algo_name, which may have been set by the factory.
By default it is set to None.
options – The options dict for the algorithm.
 filter_adapted_algorithms(problem)¶
Filter the algorithms capable of solving the problem.
 Parameters
problem (Any) – The opt_problem to be solved.
 Returns
The list of adapted algorithms names.
 Return type
bool
 finalize_iter_observer()¶
Finalize the iteration observer.
 Return type
None
 get_optimum_from_database(message=None, status=None)¶
Retrieves the optimum from the database and builds an optimization result object from it.
 Parameters
message –
By default it is set to None.
status –
By default it is set to None.
 get_right_sign_constraints()¶
Transforms the problem constraints into their opposite sign counterpart if the algorithm requires positive constraints.
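A minimal sketch of such a sign flip, assuming the usual g(x) ≤ 0 constraint convention, so that an algorithm requiring positive constraints receives -g(x) ≥ 0 instead; the helper name is illustrative:

```python
def to_opposite_sign(cstr_func):
    """Return the opposite-sign counterpart of a constraint function.

    A constraint written as g(x) <= 0 becomes -g(x) >= 0, as expected by
    algorithms that require positive constraints.
    """
    def flipped(x_vect):
        return -cstr_func(x_vect)
    return flipped
```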
 get_x0_and_bounds_vects(normalize_ds)¶
Gets x0 and the bounds, normalized or not depending on the algorithm options, all as NumPy arrays.
 Parameters
normalize_ds – If True, normalize all input variables that are not integers, according to the design space normalization policy.
 Returns
x0, the lower bounds and the upper bounds.
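As an illustration of what "normalized" means here, a min/max normalization sketch mapping a design vector onto the unit hypercube; the actual policy is defined by the design space, and the helper name is illustrative:

```python
import numpy as np

def normalize_to_unit(x0, lower, upper):
    """Map x0 from the box [lower, upper] onto the unit hypercube [0, 1]."""
    x0 = np.asarray(x0, dtype=float)
    return (x0 - lower) / (np.asarray(upper) - np.asarray(lower))
```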
 init_iter_observer(max_iter, message)¶
Initialize the iteration observer.
It will handle the stopping criterion and the logging of the progress bar.
 Parameters
max_iter (int) – The maximum number of iterations.
message (str) – The message to display at the beginning.
 Raises
ValueError – If max_iter is lower than one.
 Return type
None
 init_options_grammar(algo_name)¶
Initialize the options grammar.
 Parameters
algo_name (str) – The name of the algorithm.
 is_algo_requires_grad(algo_name)¶
Returns True if the algorithm requires a gradient evaluation.
 Parameters
algo_name – name of the algorithm
 is_algo_requires_positive_cstr(algo_name)¶
Returns True if the algorithm requires positive constraints, False otherwise.
 Parameters
algo_name – the name of the algorithm
 Returns
True if constraints must be positive
 Return type
bool
 static is_algorithm_suited(algo_dict, problem)¶
Checks if the algorithm is suited to the problem according to its algo dict.
 Parameters
algo_dict – the algorithm characteristics
problem – the opt_problem to be solved
 new_iteration_callback(x_vect=None)¶
 Raises
FtolReached – If the defined relative or absolute function tolerance is reached.
XtolReached – If the defined relative or absolute x tolerance is reached.
 Parameters
x_vect (Optional[numpy.ndarray]) –
By default it is set to None.
 Return type
None
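A simplified sketch of an ftol-style stopping test of the kind this callback applies, combining the relative and absolute function tolerances; this is an illustrative criterion, not the library's exact implementation:

```python
def ftol_reached(f_prev, f_curr, ftol_rel=0.0, ftol_abs=0.0):
    """Check whether the objective change between two iterations is within tolerance.

    Returns True when |f_curr - f_prev| <= ftol_abs + ftol_rel * |f_prev|.
    """
    return abs(f_curr - f_prev) <= ftol_abs + ftol_rel * abs(f_prev)
```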