lib_pseven module

Wrapper for the Generic Tool for Optimization (GTOpt) of pSeven Core.
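
In practice the wrapper is used through GEMSEO's optimizer factory rather than instantiated by hand. The sketch below is a minimal, hedged example: it assumes pSeven Core is installed, that "PSEVEN" is one of the algorithm names registered by this module (list the exact names with the algorithms property of PSevenOpt), and that the test-problem import path matches the GEMSEO version this page documents.

    from gemseo.algos.opt.opt_factory import OptimizersFactory
    from gemseo.problems.analytical.rosenbrock import Rosenbrock

    # Unconstrained 2D test problem shipped with GEMSEO.
    problem = Rosenbrock()

    # Run the generic GTOpt algorithm; "PSEVEN" is an assumed registered name.
    result = OptimizersFactory().execute(problem, "PSEVEN", max_iter=100)
    print(result.x_opt, result.f_opt)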

class gemseo.algos.opt.lib_pseven.PSevenAlgorithmDescription(algorithm_name, internal_algorithm_name, library_name='pSeven', description='', website='https://datadvance.net/product/pseven/manual/', handle_integer_variables=False, require_gradient=False, handle_equality_constraints=False, handle_inequality_constraints=False, handle_multiobjective=False, positive_constraints=False, problem_type='non-linear')[source]

Bases: OptimizationAlgorithmDescription

The description of an optimization algorithm from the pSeven library.

Parameters:
  • algorithm_name (str) –

  • internal_algorithm_name (str) –

  • library_name (str) –

    By default it is set to “pSeven”.

  • description (str) –

    By default it is set to “”.

  • website (str) –

    By default it is set to “https://datadvance.net/product/pseven/manual/”.

  • handle_integer_variables (bool) –

    By default it is set to False.

  • require_gradient (bool) –

    By default it is set to False.

  • handle_equality_constraints (bool) –

    By default it is set to False.

  • handle_inequality_constraints (bool) –

    By default it is set to False.

  • handle_multiobjective (bool) –

    By default it is set to False.

  • positive_constraints (bool) –

    By default it is set to False.

  • problem_type (str) –

    By default it is set to “non-linear”.

algorithm_name: str

The name of the algorithm in GEMSEO.

description: str = ''

A description of the algorithm.

handle_equality_constraints: bool = False

Whether the optimization algorithm handles equality constraints.

handle_inequality_constraints: bool = False

Whether the optimization algorithm handles inequality constraints.

handle_integer_variables: bool = False

Whether the optimization algorithm handles integer variables.

handle_multiobjective: bool = False

Whether the optimization algorithm handles multiple objectives.

internal_algorithm_name: str

The name of the algorithm in the wrapped library.

library_name: str = 'pSeven'

The name of the wrapped library.

positive_constraints: bool = False

Whether the optimization algorithm requires positive constraints.

problem_type: str = 'non-linear'

The type of problem (see OptimizationProblem.AVAILABLE_PB_TYPES).

require_gradient: bool = False

Whether the optimization algorithm requires the gradient.

website: str = 'https://datadvance.net/product/pseven/manual/'

The website of the wrapped library or algorithm.
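
A minimal instantiation sketch of this description class; the algorithm names used below are hypothetical placeholders for illustration only, not the names actually registered by PSevenOpt.

    from gemseo.algos.opt.lib_pseven import PSevenAlgorithmDescription

    # "PSEVEN_SQP" and "SQP" are illustrative placeholders only.
    description = PSevenAlgorithmDescription(
        algorithm_name="PSEVEN_SQP",
        internal_algorithm_name="SQP",
        require_gradient=True,
        handle_equality_constraints=True,
        handle_inequality_constraints=True,
    )
    print(description.library_name)  # 'pSeven'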

class gemseo.algos.opt.lib_pseven.PSevenOpt[source]

Bases: OptimizationLibrary

Interface for the Generic Tool for Optimization (GTOpt) of pSeven Core.

Note

The missing current values of the DesignSpace attached to the OptimizationProblem are automatically initialized with the method DesignSpace.initialize_missing_current_values().
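
For example, a design space declared with bounds only is completed before GTOpt runs. The sketch below calls the method explicitly just to show its effect; the add_variable keyword names are assumed from the GEMSEO version this page documents.

    from gemseo.algos.design_space import DesignSpace

    design_space = DesignSpace()
    # Bounds only, no current value.
    design_space.add_variable("x", 2, l_b=-2.0, u_b=2.0)

    # PSevenOpt triggers this initialization automatically before execution.
    design_space.initialize_missing_current_values()
    print(design_space)  # "x" now has a current value within its bounds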

algorithm_handles_eqcstr(algo_name)

Check if an algorithm handles equality constraints.

Parameters:

algo_name (str) – The name of the algorithm.

Returns:

Whether the algorithm handles equality constraints.

Return type:

bool

algorithm_handles_ineqcstr(algo_name)

Check if an algorithm handles inequality constraints.

Parameters:

algo_name (str) – The name of the algorithm.

Returns:

Whether the algorithm handles inequality constraints.

Return type:

bool
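
These query methods can be used to inspect the capabilities of the registered algorithms, as in the following sketch; it assumes pSeven Core is installed and that the library can be instantiated without arguments, as GEMSEO optimization libraries usually are.

    from gemseo.algos.opt.lib_pseven import PSevenOpt

    library = PSevenOpt()
    for name in library.algorithms:
        print(
            name,
            library.algorithm_handles_eqcstr(name),
            library.algorithm_handles_ineqcstr(name),
        )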

deactivate_progress_bar()

Deactivate the progress bar.

Return type:

None

driver_has_option(option_name)

Check the existence of an option.

Parameters:

option_name (str) – The name of the option.

Returns:

Whether the option exists.

Return type:

bool

ensure_bounds(orig_func, normalize=True)

Project the design vector onto the design space before execution.

Parameters:
  • orig_func – The original function.

  • normalize

    Whether to use the normalized design space.

    By default it is set to True.

Returns:

A function calling the original function with the input data projected onto the design space.

execute(problem, algo_name=None, eval_obs_jac=False, skip_int_check=False, **options)

Execute the driver.

Parameters:
  • problem (OptimizationProblem) – The problem to be solved.

  • algo_name (str | None) – The name of the algorithm. If None, use the algo_name attribute which may have been set by the factory.

  • eval_obs_jac (bool) –

    Whether to evaluate the Jacobian of the observables.

    By default it is set to False.

  • skip_int_check (bool) –

    Whether to skip the integer variable handling check of the selected algorithm.

    By default it is set to False.

  • **options (DriverLibOptionType) – The options for the algorithm.

Returns:

The optimization result.

Raises:

ValueError – If algo_name was neither set by the factory nor given as an argument.

Return type:

OptimizationResult
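
A sketch of calling execute() directly on the library. Here "PSEVEN" is an assumed registered algorithm name (if algo_name is neither set by the factory nor passed here, a ValueError is raised), and the Power2 constrained test problem and its import path are assumptions about the GEMSEO version at hand.

    from gemseo.algos.opt.lib_pseven import PSevenOpt
    from gemseo.problems.analytical.power_2 import Power2

    problem = Power2()  # small constrained test problem shipped with GEMSEO
    library = PSevenOpt()
    result = library.execute(problem, algo_name="PSEVEN", max_iter=50)
    print(result.x_opt, result.f_opt)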

filter_adapted_algorithms(problem)

Filter the algorithms capable of solving the problem.

Parameters:

problem (Any) – The problem to be solved.

Returns:

The names of the algorithms adapted to this problem.

Return type:

list[str]

finalize_iter_observer()

Finalize the iteration observer.

Return type:

None

get_optimum_from_database(message=None, status=None)

Retrieve the optimum from the database and build an optimization result.

get_right_sign_constraints()

Transform the problem constraints into their opposite-sign counterparts.

This is done if the algorithm requires positive constraints.

get_x0_and_bounds_vects(normalize_ds)

Return x0 and bounds.

Parameters:

normalize_ds – Whether to normalize the input variables that are not integers, according to the normalization policy of the design space.

Returns:

The current value, the lower bounds and the upper bounds.

init_iter_observer(max_iter, message='...')

Initialize the iteration observer.

It will handle the stopping criterion and the logging of the progress bar.

Parameters:
  • max_iter (int) – The maximum number of iterations.

  • message (str) –

    The message to display at the beginning.

    By default it is set to “…”.

Raises:

ValueError – If max_iter is lower than one.

Return type:

None

init_options_grammar(algo_name)

Initialize the options’ grammar.

Parameters:

algo_name (str) – The name of the algorithm.

Return type:

JSONGrammar

is_algo_requires_grad(algo_name)

Check whether the algorithm requires a gradient evaluation.

Parameters:

algo_name – The name of the algorithm.

is_algo_requires_positive_cstr(algo_name)

Check if an algorithm requires positive constraints.

Parameters:

algo_name (str) – The name of the algorithm.

Returns:

Whether the algorithm requires positive constraints.

Return type:

bool

classmethod is_algorithm_suited(algorithm_description, problem)

Check if an algorithm is suited to a problem according to its description.

Parameters:
  • algorithm_description (AlgorithmDescription) – The description of the algorithm.

  • problem (Any) – The problem to be solved.

Returns:

Whether the algorithm is suited to the problem.

Return type:

bool

new_iteration_callback(x_vect=None)

Verify the design variable and objective value stopping criteria.

Parameters:

x_vect (ndarray | None) – The values of the design variables. If None, use the values of the last iteration.

Raises:
  • FtolReached – If the defined relative or absolute function tolerance is reached.

  • XtolReached – If the defined relative or absolute x tolerance is reached.

Return type:

None

COMPLEX_STEP_METHOD = 'complex_step'
DIFFERENTIATION_METHODS = ['user', 'complex_step', 'finite_differences']
EQ_TOLERANCE = 'eq_tolerance'
EVAL_OBS_JAC_OPTION = 'eval_obs_jac'
FINITE_DIFF_METHOD = 'finite_differences'
F_TOL_ABS = 'ftol_abs'
F_TOL_REL = 'ftol_rel'
INEQ_TOLERANCE = 'ineq_tolerance'
LIBRARY_NAME: ClassVar[str | None] = 'pSeven'

The name of the interfaced library.

LIB_COMPUTE_GRAD = True
LS_STEP_NB_MAX = 'max_ls_step_nb'
LS_STEP_SIZE_MAX = 'max_ls_step_size'
MAX_DS_SIZE_PRINT = 40
MAX_FUN_EVAL = 'max_fun_eval'
MAX_ITER = 'max_iter'
MAX_TIME = 'max_time'
NORMALIZE_DESIGN_SPACE_OPTION = 'normalize_design_space'
OPTIONS_DIR: Final[str] = 'options'

The name of the directory containing the files of the grammars of the options.

OPTIONS_MAP: dict[str, str] = {
    'constraints_smoothness': 'GTOpt/ConstraintsSmoothness',
    'detect_nan_clusters': 'GTOpt/DetectNaNClusters',
    'deterministic': 'GTOpt/Deterministic',
    'diff_scheme': 'GTOpt/DiffScheme',
    'diff_step': 'GTOpt/NumDiffStepSize',
    'diff_type': 'GTOpt/DiffType',
    'ensure_feasibility': 'GTOpt/EnsureFeasibility',
    'global_phase_intensity': 'GTOpt/GlobalPhaseIntensity',
    'local_search': 'GTOpt/LocalSearch',
    'log_level': 'GTOpt/LogLevel',
    'max_batch_size': 'GTOpt/BatchSize',
    'max_expensive_func_iter': 'GTOpt/MaximumExpensiveIterations',
    'max_func_iter': 'GTOpt/MaximumIterations',
    'max_threads': 'GTOpt/MaxParallel',
    'objectives_smoothness': 'GTOpt/ObjectivesSmoothness',
    'responses_scalability': 'GTOpt/ResponsesScalability',
    'restore_analytic_func': 'GTOpt/RestoreAnalyticResponses',
    'seed': 'GTOpt/Seed',
    'time_limit': 'GTOpt/TimeLimit',
    'verbose_log': 'GTOpt/VerboseOutput'}

The names of the options in GEMSEO mapping to those in the wrapped library.
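
The mapping can be read directly to relate a GEMSEO option name to its GTOpt counterpart, e.g.:

    from gemseo.algos.opt.lib_pseven import PSevenOpt

    # GEMSEO-level option names passed to execute() are translated to GTOpt names.
    print(PSevenOpt.OPTIONS_MAP["max_func_iter"])  # GTOpt/MaximumIterations
    print(PSevenOpt.OPTIONS_MAP["time_limit"])     # GTOpt/TimeLimit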

PG_TOL = 'pg_tol'
ROUND_INTS_OPTION = 'round_ints'
STOP_CRIT_NX = 'stop_crit_n_x'
USER_DEFINED_GRADIENT = 'user'
USE_DATABASE_OPTION = 'use_database'
VERBOSE = 'verbose'
X_TOL_ABS = 'xtol_abs'
X_TOL_REL = 'xtol_rel'
activate_progress_bar: ClassVar[bool] = True

Whether to activate the progress bar in the optimization log.

algo_name: str | None

The name of the algorithm currently in use.

property algorithms: list[str]

The available algorithms.

descriptions: dict[str, AlgorithmDescription]

The description of the algorithms contained in the library.

internal_algo_name: str | None

The internal name of the algorithm currently in use.

It typically corresponds to the name of the algorithm in the wrapped library, if any.

opt_grammar: JSONGrammar | None

The grammar defining the options of the current algorithm.

problem: Any | None

The problem to be solved.