lib_pdfo module¶
PDFO optimization library wrapper; see the PDFO website.
- class gemseo_pdfo.lib_pdfo.PDFOAlgorithmDescription(algorithm_name, internal_algorithm_name, library_name='PDFO', description='', website='https://www.pdfo.net/', handle_integer_variables=False, require_gradient=False, handle_equality_constraints=False, handle_inequality_constraints=False, handle_multiobjective=False, positive_constraints=False, problem_type=ProblemType.NON_LINEAR)[source]¶
Bases:
OptimizationAlgorithmDescription
The description of an optimization algorithm from the PDFO library.
- Parameters:
algorithm_name (str) –
internal_algorithm_name (str) –
library_name (str) –
By default it is set to “PDFO”.
description (str) –
By default it is set to “”.
website (str) –
By default it is set to “https://www.pdfo.net/”.
handle_integer_variables (bool) –
By default it is set to False.
require_gradient (bool) –
By default it is set to False.
handle_equality_constraints (bool) –
By default it is set to False.
handle_inequality_constraints (bool) –
By default it is set to False.
handle_multiobjective (bool) –
By default it is set to False.
positive_constraints (bool) –
By default it is set to False.
problem_type (ProblemType) –
By default it is set to “non-linear”.
- handle_equality_constraints: bool = False¶
Whether the optimization algorithm handles equality constraints.
- handle_inequality_constraints: bool = False¶
Whether the optimization algorithm handles inequality constraints.
- handle_integer_variables: bool = False¶
Whether the optimization algorithm handles integer variables.
- handle_multiobjective: bool = False¶
Whether the optimization algorithm handles multiple objectives.
- positive_constraints: bool = False¶
Whether the optimization algorithm requires positive constraints.
- problem_type: OptimizationProblem.ProblemType = 'non-linear'¶
The type of problem (see OptimizationProblem.ProblemType).
- class gemseo_pdfo.lib_pdfo.PDFOOpt[source]¶
Bases:
OptimizationLibrary
PDFO optimization library interface.
See OptimizationLibrary.
Notes
The missing current values of the DesignSpace attached to the OptimizationProblem are automatically initialized with the method DesignSpace.initialize_missing_current_values().
Constructor.
Generate the library dictionary, which contains the list of algorithms with their characteristics:
whether it requires the gradient,
whether it handles equality constraints,
whether it handles inequality constraints.
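The characteristics table built by the constructor can be pictured as a plain dictionary keyed by algorithm name. The PDFO solver names below (COBYLA, BOBYQA, NEWUOA) are real PDFO solvers and all are derivative-free, but the dictionary layout and the exact flag values are illustrative assumptions, not the wrapper's actual data structure.

```python
# Sketch of a per-algorithm characteristics table, similar in spirit to
# what the constructor generates. Layout and flag values are illustrative.
PDFO_ALGOS = {
    "PDFO_COBYLA": {
        "require_gradient": False,
        "handle_equality_constraints": False,
        "handle_inequality_constraints": True,
    },
    "PDFO_BOBYQA": {
        "require_gradient": False,
        "handle_equality_constraints": False,
        "handle_inequality_constraints": False,
    },
    "PDFO_NEWUOA": {
        "require_gradient": False,
        "handle_equality_constraints": False,
        "handle_inequality_constraints": False,
    },
}

# All PDFO solvers are derivative-free, so none requires the gradient.
print(all(not d["require_gradient"] for d in PDFO_ALGOS.values()))  # True
```

Such a table lets the library answer capability queries (e.g. `algorithm_handles_ineqcstr`) with a simple dictionary lookup.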
- class ApproximationMode(value)¶
Bases:
StrEnum
The derivative approximation modes.
- CENTERED_DIFFERENCES = 'centered_differences'¶
The centered differences method used to approximate the Jacobians by perturbing each variable with a small real number.
- COMPLEX_STEP = 'complex_step'¶
The complex step method used to approximate the Jacobians by perturbing each variable with a small complex number.
- FINITE_DIFFERENCES = 'finite_differences'¶
The finite differences method used to approximate the Jacobians by perturbing each variable with a small real number.
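The three modes correspond to standard numerical differentiation schemes. The sketch below shows them on a scalar function with a known derivative; it is not GEMSEO's implementation, just the underlying formulas.

```python
# f(x) = x**3 has exact derivative 3 * x**2, i.e. 12.0 at x = 2.0.
def f(x):
    return x ** 3

def finite_differences(func, x, h=1e-6):
    # Forward difference: first-order accurate in h.
    return (func(x + h) - func(x)) / h

def centered_differences(func, x, h=1e-6):
    # Centered difference: second-order accurate in h.
    return (func(x + h) - func(x - h)) / (2 * h)

def complex_step(func, x, h=1e-30):
    # Complex step: no subtractive cancellation, accurate to machine precision.
    return func(x + 1j * h).imag / h

x0 = 2.0
print(finite_differences(f, x0))    # ≈ 12.000006
print(centered_differences(f, x0))  # ≈ 12.0
print(complex_step(f, x0))          # 12.0
```

The complex step method requires the function to accept complex inputs, which is why it is listed separately from the two real-perturbation schemes.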
- class DifferentiationMethod(value)¶
Bases:
StrEnum
The differentiation methods.
- CENTERED_DIFFERENCES = 'centered_differences'¶
- COMPLEX_STEP = 'complex_step'¶
- FINITE_DIFFERENCES = 'finite_differences'¶
- USER_GRAD = 'user'¶
- algorithm_handles_eqcstr(algo_name)¶
Check if an algorithm handles equality constraints.
- algorithm_handles_ineqcstr(algo_name)¶
Check if an algorithm handles inequality constraints.
- deactivate_progress_bar()¶
Deactivate the progress bar.
- Return type:
None
- driver_has_option(option_name)¶
Check the existence of an option.
- ensure_bounds(orig_func, normalize=True)¶
Project the design vector onto the design space before execution.
- Parameters:
orig_func – The original function.
normalize (bool) –
Whether to use the normalized design space.
By default it is set to True.
- Returns:
A function calling the original function with the input data projected onto the design space.
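Conceptually, the returned wrapper clips each component of the design vector to its box bounds before evaluating the original function. The sketch below is a minimal illustration of that idea; the function names and the component-wise clipping strategy are assumptions, not GEMSEO's actual code.

```python
# Minimal sketch of a bound-projection wrapper: project the input onto
# box bounds [lower, upper] component-wise, then call the original function.
def ensure_bounds(orig_func, lower, upper):
    def wrapped(x):
        projected = [min(max(xi, lo), up) for xi, lo, up in zip(x, lower, upper)]
        return orig_func(projected)
    return wrapped

def sum_of_squares(x):
    return sum(xi ** 2 for xi in x)

safe_func = ensure_bounds(sum_of_squares, lower=[0.0, 0.0], upper=[1.0, 1.0])
# [1.5, -0.2] is projected onto [1.0, 0.0] before evaluation.
print(safe_func([1.5, -0.2]))  # 1.0
```

This guards the wrapped library against evaluating the objective or constraints outside the design space, which derivative-free solvers may otherwise attempt during their trust-region steps.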
- execute(problem, algo_name=None, eval_obs_jac=False, skip_int_check=False, **options)¶
Execute the driver.
- Parameters:
problem (OptimizationProblem) – The problem to be solved.
algo_name (str | None) – The name of the algorithm. If None, use the algo_name attribute, which may have been set by the factory.
eval_obs_jac (bool) –
Whether to evaluate the Jacobian of the observables.
By default it is set to False.
skip_int_check (bool) –
Whether to skip the integer variable handling check of the selected algorithm.
By default it is set to False.
**options (DriverLibOptionType) – The options for the algorithm.
- Returns:
The optimization result.
- Raises:
ValueError – If algo_name was not either set by the factory or given as an argument.
- Return type:
OptimizationResult
- filter_adapted_algorithms(problem)¶
Filter the algorithms capable of solving the problem.
- finalize_iter_observer()¶
Finalize the iteration observer.
- Return type:
None
- get_optimum_from_database(message=None, status=None)¶
Return the optimization result from the database.
- Return type:
OptimizationResult
- get_right_sign_constraints()¶
Transform the problem constraints into their opposite sign counterpart.
This is done if the algorithm requires positive constraints.
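GEMSEO stores inequality constraints in the g(x) ≤ 0 convention; an algorithm that requires positive constraints expects the opposite sign, so each constraint function is negated. The helper name below is hypothetical, a sketch of the sign flip rather than the library's code.

```python
# Hypothetical sign-flip helper: turn a g(x) <= 0 constraint into the
# -g(x) >= 0 form expected by algorithms requiring positive constraints.
def to_positive_constraint(g):
    def positive_g(x):
        return -g(x)
    return positive_g

g = lambda x: x - 3.0          # feasible when x <= 3 (g(x) <= 0 convention)
g_pos = to_positive_constraint(g)
print(g_pos(1.0))  # 2.0: positive, hence feasible in the >= 0 convention
print(g_pos(4.0))  # -1.0: negative, hence infeasible
```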
- get_x0_and_bounds_vects(normalize_ds, as_dict=False)¶
Return the initial design variable values and their lower and upper bounds.
- init_iter_observer(max_iter, message='')¶
Initialize the iteration observer.
It will handle the stopping criterion and the logging of the progress bar.
- Parameters:
max_iter (int) – The maximum number of iterations.
message (str) –
By default it is set to “”.
- Raises:
ValueError – If max_iter is lower than one.
- Return type:
None
- init_options_grammar(algo_name)¶
Initialize the options’ grammar.
- Parameters:
algo_name (str) – The name of the algorithm.
- Return type:
JSONGrammar
- is_algo_requires_positive_cstr(algo_name)¶
Check if an algorithm requires positive constraints.
- classmethod is_algorithm_suited(algorithm_description, problem)¶
Check if an algorithm is suited to a problem according to its description.
- Parameters:
algorithm_description (AlgorithmDescription) – The description of the algorithm.
problem (Any) – The problem to be solved.
- Returns:
Whether the algorithm is suited to the problem.
- Return type:
bool
- new_iteration_callback(x_vect)¶
Verify the design variable and objective value stopping criteria.
- Parameters:
x_vect (ndarray) – The design variable values.
- Raises:
FtolReached – If the defined relative or absolute function tolerance is reached.
XtolReached – If the defined relative or absolute x tolerance is reached.
- Return type:
None
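The FtolReached and XtolReached criteria compare successive iterates against absolute and relative thresholds. The formula below is a plausible sketch of such a combined test, not necessarily GEMSEO's exact implementation.

```python
# Combined absolute/relative stagnation test on successive scalar values
# (objective values for the f-tolerance, design variables for the
# x-tolerance). Stop when the change falls below both thresholds combined.
def tol_reached(prev, curr, tol_abs, tol_rel):
    return abs(curr - prev) <= tol_abs + tol_rel * abs(prev)

# Objective stagnating within tolerance -> criterion triggered:
print(tol_reached(10.0, 10.0000001, tol_abs=1e-6, tol_rel=1e-8))  # True
# Still making significant progress -> keep iterating:
print(tol_reached(10.0, 9.0, tol_abs=1e-6, tol_rel=1e-8))         # False
```

The corresponding tolerances are set through the `ftol_abs`, `ftol_rel`, `xtol_abs` and `xtol_rel` options listed below.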
- requires_gradient(driver_name)¶
Check if a driver requires the gradient.
- EQ_TOLERANCE = 'eq_tolerance'¶
- EVAL_OBS_JAC_OPTION = 'eval_obs_jac'¶
- F_TOL_ABS = 'ftol_abs'¶
- F_TOL_REL = 'ftol_rel'¶
- INEQ_TOLERANCE = 'ineq_tolerance'¶
- LIB_COMPUTE_GRAD = False¶
- LS_STEP_NB_MAX = 'max_ls_step_nb'¶
- LS_STEP_SIZE_MAX = 'max_ls_step_size'¶
- MAX_DS_SIZE_PRINT = 40¶
- MAX_FUN_EVAL = 'max_fun_eval'¶
- MAX_ITER = 'max_iter'¶
- MAX_TIME = 'max_time'¶
- NORMALIZE_DESIGN_SPACE_OPTION = 'normalize_design_space'¶
- OPTIONS_DIR: ClassVar[str | Path] = 'options'¶
The name of the directory containing the files of the grammars of the options.
- OPTIONS_MAP: ClassVar[dict[str, str]] = {'max_iter': 'max_iter'}¶
The names of the options in GEMSEO mapping to those in the wrapped library.
- PG_TOL = 'pg_tol'¶
- ROUND_INTS_OPTION = 'round_ints'¶
- STOP_CRIT_NX = 'stop_crit_n_x'¶
- USE_DATABASE_OPTION = 'use_database'¶
- VERBOSE = 'verbose'¶
- X_TOL_ABS = 'xtol_abs'¶
- X_TOL_REL = 'xtol_rel'¶
- activate_progress_bar: ClassVar[bool] = True¶
Whether to activate the progress bar in the optimization log.
- descriptions: dict[str, AlgorithmDescription]¶
The description of the algorithms contained in the library.
- internal_algo_name: str | None¶
The internal name of the algorithm currently in use.
It typically corresponds to the name of the algorithm in the wrapped library if any.
- opt_grammar: JSONGrammar | None¶
The grammar defining the options of the current algorithm.
- problem: OptimizationProblem¶
The optimization problem the driver library is bound to.