
driver_library module

Driver library.

A driver library aims to solve an OptimizationProblem using a particular algorithm from a particular family of numerical methods. This algorithm is in charge of evaluating the objective and constraint functions at different points of the design space, using the DriverLibrary.execute() method.

The two most common families of numerical methods for solving an optimization problem are optimization algorithms and design of experiments (DOE). A DOE driver browses the design space agnostically, i.e. without taking the function evaluations into account. By contrast, an optimization algorithm uses this information to steer its path through the design space and reach the optimum as quickly as possible. These families are implemented in DOELibrary and OptimizationLibrary.
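The contrast between the two families can be illustrated with a minimal sketch in plain Python (independent of GEMSEO, with a made-up one-dimensional objective): the DOE-style driver samples points without ever consulting the function values, while a crude descent-style optimizer uses each evaluation to decide where to go next.

```python
import random

def f(x):
    """A simple 1D objective with its minimum at x = 2."""
    return (x - 2.0) ** 2

random.seed(0)

# DOE-style driver: sample the design space without using f's values,
# then pick the best point a posteriori.
doe_points = [random.uniform(-5.0, 5.0) for _ in range(20)]
doe_best = min(doe_points, key=f)

# Optimizer-style driver: a crude fixed-step descent that exploits
# each evaluation to choose the next point.
x, step = -5.0, 0.1
for _ in range(200):
    if f(x + step) < f(x):
        x += step
    elif f(x - step) < f(x):
        x -= step
    else:
        break

print(f"DOE best: {doe_best:.3f}, optimizer: {x:.3f}")
```

The descent converges to the optimum up to the step size, whereas the DOE's accuracy depends only on how densely it happened to sample.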

class gemseo.algos.driver_library.DriverDescription(algorithm_name, internal_algorithm_name, library_name='', description='', website='', handle_integer_variables=False, require_gradient=False)[source]

Bases: AlgorithmDescription

The description of a driver.

Parameters:
  • algorithm_name (str)

  • internal_algorithm_name (str)

  • library_name (str) –

    By default it is set to “”.

  • description (str) –

    By default it is set to “”.

  • website (str) –

    By default it is set to “”.

  • handle_integer_variables (bool) –

    By default it is set to False.

  • require_gradient (bool) –

    By default it is set to False.

algorithm_name: str

The name of the algorithm in GEMSEO.

handle_integer_variables: bool = False

Whether the optimization algorithm handles integer variables.

internal_algorithm_name: str

The name of the algorithm in the wrapped library.

require_gradient: bool = False

Whether the optimization algorithm requires the gradient.
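For illustration, the fields and defaults above behave like those of a plain dataclass. The following stand-in is hypothetical (the real class derives from AlgorithmDescription) and the algorithm names in it are made up:

```python
from dataclasses import dataclass

# Hypothetical stand-in mirroring DriverDescription's fields and defaults,
# for illustration only; not the GEMSEO class.
@dataclass
class DriverDescriptionSketch:
    algorithm_name: str
    internal_algorithm_name: str
    library_name: str = ""
    description: str = ""
    website: str = ""
    handle_integer_variables: bool = False
    require_gradient: bool = False

# Only the two names are required; every other field falls back to its default.
desc = DriverDescriptionSketch(
    algorithm_name="SLSQP",
    internal_algorithm_name="slsqp",
    require_gradient=True,
)
print(desc.library_name, desc.require_gradient)
```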

class gemseo.algos.driver_library.DriverLibrary[source]

Bases: AlgorithmLibrary

Abstract class for driver library interfaces.

Lists the methods available in the library for solving the given problem.

To integrate an optimization package, inherit from this class and place your file in the gemseo.algos.doe or gemseo.algos.opt package.

class ApproximationMode(value)

Bases: StrEnum

The approximation derivation modes.

CENTERED_DIFFERENCES = 'centered_differences'

The centered differences method used to approximate the Jacobians by perturbing each variable with a small real number.

COMPLEX_STEP = 'complex_step'

The complex step method used to approximate the Jacobians by perturbing each variable with a small complex number.

FINITE_DIFFERENCES = 'finite_differences'

The finite differences method used to approximate the Jacobians by perturbing each variable with a small real number.
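The three approximation modes can be sketched on a scalar function. This is illustrative code, not the GEMSEO implementation; the step sizes are typical but arbitrary choices:

```python
def forward_diff(f, x, h=1e-8):
    # Finite (forward) differences: first-order accurate in h.
    return (f(x + h) - f(x)) / h

def centered_diff(f, x, h=1e-6):
    # Centered differences: second-order accurate in h.
    return (f(x + h) - f(x - h)) / (2 * h)

def complex_step(f, x, h=1e-30):
    # Complex step: no subtractive cancellation, so h can be tiny and the
    # result is accurate to machine precision, provided f is analytic and
    # accepts complex inputs.
    return f(x + 1j * h).imag / h

def cube(x):
    return x ** 3  # exact derivative at x = 2 is 12

for approx in (forward_diff, centered_diff, complex_step):
    print(approx.__name__, approx(cube, 2.0))
```

The complex-step result is exact to machine precision, while the two difference schemes trade truncation error against round-off error through the choice of h.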

class DifferentiationMethod(value)

Bases: StrEnum

The differentiation methods.

CENTERED_DIFFERENCES = 'centered_differences'
COMPLEX_STEP = 'complex_step'
FINITE_DIFFERENCES = 'finite_differences'
USER_GRAD = 'user'
clear_listeners()[source]

Remove the listeners from the database.

Return type:

None

deactivate_progress_bar()[source]

Deactivate the progress bar.

Return type:

None

ensure_bounds(orig_func, normalize=True)[source]

Project the design vector onto the design space before execution.

Parameters:
  • orig_func – The original function.

  • normalize (bool) –

    Whether to use the normalized design space.

    By default it is set to True.

Returns:

A function calling the original function with the input data projected onto the design space.
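Conceptually, the returned function clips each component of the input onto the box defined by the design-space bounds before calling the wrapped function. A minimal stand-alone sketch (not the GEMSEO implementation, and ignoring normalization):

```python
def ensure_bounds_sketch(func, lower, upper):
    """Wrap func so its input is projected onto [lower, upper] componentwise."""
    def wrapped(x):
        projected = [min(max(xi, lo), up) for xi, lo, up in zip(x, lower, upper)]
        return func(projected)
    return wrapped

# A point outside the unit box is projected onto it before evaluation.
safe_sum = ensure_bounds_sketch(sum, lower=[0.0, 0.0], upper=[1.0, 1.0])
print(safe_sum([2.0, -3.0]))  # [2.0, -3.0] is projected to [1.0, 0.0] -> 1.0
```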

execute(problem, algo_name=None, eval_obs_jac=False, skip_int_check=False, max_design_space_dimension_to_log=40, **options)[source]

Execute the driver.

Parameters:
  • problem (OptimizationProblem) – The problem to be solved.

  • algo_name (str | None) – The name of the algorithm. If None, use the algo_name attribute which may have been set by the factory.

  • eval_obs_jac (bool) –

    Whether to evaluate the Jacobian of the observables.

    By default it is set to False.

  • skip_int_check (bool) –

    Whether to skip the integer variable handling check of the selected algorithm.

    By default it is set to False.

  • max_design_space_dimension_to_log (int) –

    The maximum dimension of a design space to be logged. If the dimension of the design space is greater than this value, the design space is not logged.

    By default it is set to 40.

  • **options (DriverLibraryOptionType) – The options for the algorithm.

Returns:

The optimization result.

Raises:

ValueError – If algo_name was neither set by the factory nor passed as an argument.

Return type:

OptimizationResult
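The resolution of algo_name described above amounts to the following check. This is a hypothetical sketch of the control flow, not the GEMSEO implementation:

```python
class DriverSketch:
    """Hypothetical stand-in showing how execute() resolves the algorithm name."""

    def __init__(self, algo_name=None):
        # algo_name may already have been set by the factory.
        self.algo_name = algo_name

    def execute(self, problem, algo_name=None, **options):
        if algo_name is not None:
            self.algo_name = algo_name
        if self.algo_name is None:
            raise ValueError(
                "The algorithm name must be either set by the factory "
                "or passed as an argument."
            )
        return f"solving {problem} with {self.algo_name}"

print(DriverSketch().execute("my_problem", algo_name="SLSQP"))
```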

finalize_iter_observer()[source]

Finalize the iteration observer.

Return type:

None

get_optimum_from_database(message=None, status=None)[source]

Return the optimization result from the database.

Return type:

OptimizationResult

get_x0_and_bounds(normalize_ds: bool, as_dict: Literal[False] = False) tuple[ndarray, ndarray, ndarray][source]
get_x0_and_bounds(normalize_ds: bool, as_dict: Literal[True] = False) tuple[dict[str, ndarray], dict[str, ndarray], dict[str, ndarray]]

Return the initial design variable values and their lower and upper bounds.

Parameters:
  • normalize_ds – Whether to normalize the design variables.

  • as_dict – Whether to return dictionaries instead of NumPy arrays.

Returns:

The initial values of the design variables, their lower bounds, and their upper bounds.
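The effect of as_dict can be sketched with a toy design space. The mapping format below is made up for illustration; it is not GEMSEO's DesignSpace:

```python
def get_x0_and_bounds_sketch(design_space, as_dict=False):
    """Return (x0, lower, upper) as flat lists or as per-variable dicts.

    design_space: hypothetical mapping {name: (x0, lower, upper)}.
    """
    if as_dict:
        return (
            {name: v[0] for name, v in design_space.items()},
            {name: v[1] for name, v in design_space.items()},
            {name: v[2] for name, v in design_space.items()},
        )
    x0 = [v[0] for v in design_space.values()]
    lower = [v[1] for v in design_space.values()]
    upper = [v[2] for v in design_space.values()]
    return x0, lower, upper

space = {"x": (0.5, 0.0, 1.0), "y": (2.0, -5.0, 5.0)}
print(get_x0_and_bounds_sketch(space))
print(get_x0_and_bounds_sketch(space, as_dict=True)[0])
```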

init_iter_observer(max_iter, message='')[source]

Initialize the iteration observer.

It will handle the stopping criterion and the logging of the progress bar.

Parameters:
  • max_iter (int) – The maximum number of iterations.

  • message (str) –

    The message to display at the beginning of the progress bar status.

    By default it is set to “”.

Raises:

ValueError – If max_iter is less than one.

Return type:

None

new_iteration_callback(x_vect)[source]

Advance the progress bar and apply the stopping criteria.

Parameters:

x_vect (ndarray) – The design variable values.

Raises:

MaxTimeReached – If the elapsed time is greater than the maximum execution time.

Return type:

None
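A wall-clock stopping criterion of this kind can be sketched as follows. Both the class and the exception below are hypothetical stand-ins, not GEMSEO code:

```python
import time

class MaxTimeReached(RuntimeError):
    """Hypothetical stand-in for the max-time termination exception."""

class TimeLimitedLoop:
    """Sketch of a per-iteration callback enforcing a wall-clock budget."""

    def __init__(self, max_time):
        self.max_time = max_time
        self.start = time.perf_counter()
        self.iterations = 0

    def new_iteration_callback(self, x_vect=None):
        self.iterations += 1
        if time.perf_counter() - self.start > self.max_time:
            raise MaxTimeReached(
                f"Elapsed time exceeds the maximum of {self.max_time} s."
            )

loop = TimeLimitedLoop(max_time=0.05)
caught = None
try:
    while True:  # stand-in for the driver's iteration loop
        loop.new_iteration_callback()
        time.sleep(0.01)  # stand-in for one iteration's work
except MaxTimeReached as err:
    caught = err
    print(err)
```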

requires_gradient(driver_name)[source]

Check if a driver requires the gradient.

Parameters:

driver_name (str) – The name of the driver.

Returns:

Whether the driver requires the gradient.

Return type:

bool

EQ_TOLERANCE = 'eq_tolerance'
EVAL_OBS_JAC_OPTION = 'eval_obs_jac'
INEQ_TOLERANCE = 'ineq_tolerance'
MAX_TIME = 'max_time'
NORMALIZE_DESIGN_SPACE_OPTION = 'normalize_design_space'
ROUND_INTS_OPTION = 'round_ints'
USE_DATABASE_OPTION = 'use_database'
USE_ONE_LINE_PROGRESS_BAR: ClassVar[bool] = False

Whether to use a one-line progress bar.

activate_progress_bar: ClassVar[bool] = True

Whether to activate the progress bar in the optimization log.

algo_name: str | None

The name of the algorithm used currently.

descriptions: dict[str, AlgorithmDescription]

The description of the algorithms contained in the library.

internal_algo_name: str | None

The internal name of the algorithm used currently.

It typically corresponds to the name of the algorithm in the wrapped library if any.

option_grammar: JSONGrammar | None

The grammar defining the options of the current algorithm.

problem: OptimizationProblem

The optimization problem the driver library is bound to.

Examples using DriverLibrary

Change the seed of a DOE

Scaling