
# lib_scalable module

Build a diagonal DOE for scalable model construction.

Classes:

DiagonalDOE: Class used to create a diagonal DOE.
class gemseo.algos.doe.lib_scalable.DiagonalDOE[source]

Class used to create a diagonal DOE.

Return type

None

lib_dict

The properties of the algorithm in terms of requirements on the properties of the problem to be solved.

Type

Dict[str, Dict[str, Union[str, bool]]]

algo_name

The name of the algorithm currently in use.

Type

str

internal_algo_name

The name of the algorithm currently in use, as defined in the underlying library.

Type

str

problem

The problem to be solved.

Type

Any

Constructor.

Attributes:

algorithms: The available algorithms.

Methods:

compute_doe(variables_space[, size, ...]): Compute a design of experiments (DOE) in a variables space.
compute_phip_criteria(samples[, power]): Compute the $$\phi^p$$ space-filling criterion.
deactivate_progress_bar(): Deactivate the progress bar.
driver_has_option(option_key): Check if the option key exists.
ensure_bounds(orig_func[, normalize]): Project the design vector onto the design space before execution.
evaluate_samples([eval_jac, n_processes, ...]): Evaluate all the functions of the optimization problem at the samples.
execute(problem[, algo_name]): Execute the driver.
export_samples(doe_output_file): Export the samples generated by the DOE library to a CSV file.
filter_adapted_algorithms(problem): Filter the algorithms capable of solving the problem.
finalize_iter_observer(): Finalize the iteration observer.
get_optimum_from_database([message, status]): Retrieve the optimum from the database and build an optimization result object from it.
get_x0_and_bounds_vects(normalize_ds): Get the initial guess and bounds as NumPy arrays, normalized or not depending on the algorithm options.
init_iter_observer(max_iter, message): Initialize the iteration observer.
init_options_grammar(algo_name): Initialize the options grammar.
is_algo_requires_grad(algo_name): Return True if the algorithm requires a gradient evaluation.
is_algorithm_suited(algo_dict, problem): Check whether the algorithm is suited to the problem according to its algorithm dictionary.
new_iteration_callback([x_vect]): Callback called at each new iteration, i.e. every time the optimizer proposes a design vector that is not already in the database.
COMPLEX_STEP_METHOD = 'complex_step'
DESCRIPTION = 'description'
DESIGN_ALGO_NAME = 'Design algorithm'
DIFFERENTIATION_METHODS = ['user', 'complex_step', 'finite_differences']
DIMENSION = 'dimension'
EQ_TOLERANCE = 'eq_tolerance'
EVAL_JAC = 'eval_jac'
FINITE_DIFF_METHOD = 'finite_differences'
HANDLE_EQ_CONS = 'handle_equality_constraints'
HANDLE_INEQ_CONS = 'handle_inequality_constraints'
INEQ_TOLERANCE = 'ineq_tolerance'
INTERNAL_NAME = 'internal_algo_name'
LEVEL_KEYWORD = 'levels'
LIB = 'lib'
MAX_DS_SIZE_PRINT = 40
MAX_TIME = 'max_time'
MIN_DIMS = 'min_dims'
NORMALIZE_DESIGN_SPACE_OPTION = 'normalize_design_space'
N_PROCESSES = 'n_processes'
N_SAMPLES = 'n_samples'
OPTIONS_DIR = 'options'
OPTIONS_MAP = {}
PHIP_CRITERIA = 'phi^p'
POSITIVE_CONSTRAINTS = 'positive_constraints'
PROBLEM_TYPE = 'problem_type'
ROUND_INTS_OPTION = 'round_ints'
SAMPLES_TAG = 'samples'
SEED = 'seed'
USE_DATABASE_OPTION = 'use_database'
WAIT_TIME_BETWEEN_SAMPLES = 'wait_time_between_samples'
WEBSITE = 'website'
property algorithms

The available algorithms.

compute_doe(variables_space, size=None, unit_sampling=False, **options)

Compute a design of experiments (DOE) in a variables space.

Parameters
• variables_space (gemseo.algos.design_space.DesignSpace) – The variables space to be sampled.

• size (Optional[int]) –

The size of the DOE. If None, the size is deduced from the options.

By default it is set to None.

• unit_sampling (bool) –

Whether to sample in the unit hypercube.

By default it is set to False.

• **options (Union[str, float, int, bool, List[str], numpy.ndarray]) – The options of the DOE algorithm.

Returns

The design of experiments, whose rows are the samples and whose columns are the variables.

Return type

numpy.ndarray
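
The geometry of such a DOE can be illustrated with a small stand-alone sketch (plain Python, independent of the GEMSEO API): each sample lies on the diagonal of the design space, with every variable increasing linearly from its lower to its upper bound. The helper `diagonal_doe` and its `bounds` argument are hypothetical names used for illustration only.

```python
def diagonal_doe(n_samples, bounds):
    """Toy diagonal DOE: n_samples points along the diagonal of the box.

    bounds is a list of (lower, upper) pairs, one per variable.
    Illustrative sketch, not the GEMSEO implementation.
    """
    doe = []
    for i in range(n_samples):
        # Position along the diagonal in [0, 1]; a single sample sits
        # at the lower corner.
        t = i / (n_samples - 1) if n_samples > 1 else 0.0
        doe.append([lo + t * (hi - lo) for lo, hi in bounds])
    return doe

# Three samples over x in [0, 1] and y in [-2, 2]:
samples = diagonal_doe(3, [(0.0, 1.0), (-2.0, 2.0)])
# → [[0.0, -2.0], [0.5, 0.0], [1.0, 2.0]]
```

With `unit_sampling=True`, the bounds of every variable would simply be `(0.0, 1.0)`, i.e. the samples stay in the unit hypercube.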

static compute_phip_criteria(samples, power=10.0)

Compute the $$\phi^p$$ space-filling criterion.

See Morris & Mitchell, Exploratory designs for computational experiments, 1995.

Parameters
• samples – The samples of the design variables.

• power

The power p of the $$\phi^p$$ criterion.

By default it is set to 10.0.
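
For reference, the criterion of Morris & Mitchell can be sketched in a few lines of plain Python: the lower the value, the better spread the samples are. This is an illustrative implementation using Euclidean distances between samples, not the library's internal one.

```python
import math


def phip(samples, power=10.0):
    """phi^p space-filling criterion (Morris & Mitchell, 1995).

    phi^p = (sum over pairs i < j of d_ij ** -p) ** (1 / p),
    where d_ij is the distance between samples i and j.
    Illustrative sketch, not the GEMSEO implementation.
    """
    total = 0.0
    n = len(samples)
    for i in range(n):
        for j in range(i + 1, n):
            d = math.dist(samples[i], samples[j])  # Euclidean distance
            total += d ** -power
    return total ** (1.0 / power)


# Evenly spread points score lower (better) than clustered ones:
assert phip([[0.0], [0.5], [1.0]]) < phip([[0.0], [0.1], [1.0]])
```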

deactivate_progress_bar()

Deactivate the progress bar.

Return type

None

driver_has_option(option_key)

Check if the option key exists.

Parameters

option_key (str) – The name of the option.

Returns

Whether the option is in the grammar.

Return type

bool

ensure_bounds(orig_func, normalize=True)

Project the design vector onto the design space before execution.

Parameters
• orig_func – The original function.

• normalize

If True, use the normalized design space.

By default it is set to True.

Returns

The wrapped function.

evaluate_samples(eval_jac=False, n_processes=1, wait_time_between_samples=0.0)

Evaluate all the functions of the optimization problem at the samples.

Parameters
• eval_jac

Whether to evaluate the jacobian.

By default it is set to False.

• n_processes

The number of processes used to evaluate the samples.

By default it is set to 1.

• wait_time_between_samples

The time to wait between each sample evaluation, in seconds.

By default it is set to 0.0.
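
The interplay of these options can be sketched with a hypothetical sequential evaluator (not the GEMSEO implementation), where a pause is inserted between evaluations, e.g. to throttle calls to a licensed or remote solver:

```python
import time


def evaluate_samples(samples, func, wait_time_between_samples=0.0):
    """Evaluate func at each sample, optionally pausing between calls.

    Hypothetical sequential sketch; the library can also distribute
    the evaluations over several processes.
    """
    results = []
    for x in samples:
        results.append(func(x))
        if wait_time_between_samples > 0.0:
            time.sleep(wait_time_between_samples)
    return results


# Evaluate x -> x**2 at two one-dimensional samples:
values = evaluate_samples([[1.0], [2.0]], lambda x: x[0] ** 2)
# → [1.0, 4.0]
```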

execute(problem, algo_name=None, **options)

Execute the driver.

Parameters
• problem – The problem to be solved.

• algo_name

The name of the algorithm. If None, use self.algo_name, which may have been set by the factory.

By default it is set to None.

• **options – The options of the algorithm.

export_samples(doe_output_file)

Export the samples generated by the DOE library to a CSV file.

Parameters

doe_output_file (str) – The name of the export file.

filter_adapted_algorithms(problem)

Filter the algorithms capable of solving the problem.

Parameters

problem (Any) – The opt_problem to be solved.

Returns

The list of adapted algorithms names.

Return type

List[str]

finalize_iter_observer()

Finalize the iteration observer.

Return type

None

get_optimum_from_database(message=None, status=None)

Retrieve the optimum from the database and build an optimization result object from it.

Parameters
• message

The message of the optimization result.

By default it is set to None.

• status

The status of the optimization result.

By default it is set to None.

get_x0_and_bounds_vects(normalize_ds)

Get the initial guess x0 and the bounds, normalized or not depending on the algorithm options, all as NumPy arrays.

Parameters

normalize_ds – If True, normalize all the input variables that are not integers, according to the design space normalization policy.

Returns

The initial guess x0, the lower bounds and the upper bounds.
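
The normalization referred to here is, in essence, a min-max rescaling of each variable to the unit interval. A minimal sketch, assuming a simple bound-based normalization policy (the names below are illustrative, not the GEMSEO API):

```python
def normalize(x, lower, upper):
    """Min-max normalization of a design vector to the unit hypercube.

    Each component is mapped from [lower_i, upper_i] to [0, 1].
    Illustrative sketch; GEMSEO leaves integer variables unnormalized.
    """
    return [(xi - lo) / (hi - lo) for xi, lo, hi in zip(x, lower, upper)]


# Midpoints of [0, 10] and [-2, 2] both map to 0.5:
normalize([5.0, 0.0], [0.0, -2.0], [10.0, 2.0])
# → [0.5, 0.5]
```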

init_iter_observer(max_iter, message)

Initialize the iteration observer.

It will handle the stopping criterion and the logging of the progress bar.

Parameters
• max_iter (int) – The maximum number of iterations.

• message (str) – The message to display at the beginning.

Raises

ValueError – If max_iter is lower than one.

Return type

None

init_options_grammar(algo_name)

Initialize the options grammar.

Parameters

algo_name (str) – The name of the algorithm.

Return type

gemseo.core.grammars.json_grammar.JSONGrammar

is_algo_requires_grad(algo_name)

Return True if the algorithm requires a gradient evaluation.

Parameters

algo_name – The name of the algorithm.

static is_algorithm_suited(algo_dict, problem)

Check whether the algorithm is suited to the problem according to its algorithm dictionary.

Parameters
• algo_dict – The characteristics of the algorithm.

• problem – The problem to be solved.

new_iteration_callback(x_vect=None)

Callback called at each new iteration, i.e. every time the optimizer proposes a design vector that is not already in the database.

Iterate the progress bar and check the stopping criteria.

Parameters

x_vect (Optional[numpy.ndarray]) –

The design variables values. If None, use the values of the last iteration.

By default it is set to None.

Raises

MaxTimeReached – If the elapsed time is greater than the maximum execution time.

Return type

None