rastrigin module¶
The Rastrigin analytic problem¶
Classes:
Rastrigin: The Rastrigin analytic optimization problem.
- class gemseo.problems.analytical.rastrigin.Rastrigin[source]¶
Bases: gemseo.algos.opt_problem.OptimizationProblem
The Rastrigin OptimizationProblem uses the Rastrigin objective function with the DesignSpace \([-0.1, 0.1]^2\).
From http://en.wikipedia.org/wiki/Rastrigin_function:
the Rastrigin function is a non-convex function used as a performance test problem for optimization algorithms. It is a typical example of a non-linear multimodal function. It was first proposed by [Rastrigin] as a 2-dimensional function and has been generalized by [MuhlenbeinEtAl]. Finding the minimum of this function is a fairly difficult problem due to its large search space and its large number of local minima. It has a global minimum at \(x=0\) where \(f(x)=0\). It can be extended to \(n>2\) dimensions:
\[f(x) = 10n + \sum_{i=1}^n [x_i^2 - 10\cos(2\pi x_i)]\]
[Rastrigin] Rastrigin, L. A. “Systems of extremal control.” Mir, Moscow (1974).
[MuhlenbeinEtAl] H. Mühlenbein, D. Schomisch and J. Born. “The Parallel Genetic Algorithm as Function Optimizer”. Parallel Computing, 17, pages 619–632, 1991.
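As a purely illustrative, standalone sketch (plain NumPy, not the gemseo API), the generalized formula above can be written as:

import numpy as np

def rastrigin_nd(x):
    """Generalized n-dimensional Rastrigin function: 10*n + sum(x_i^2 - 10*cos(2*pi*x_i))."""
    x = np.asarray(x, dtype=float)
    return 10 * x.size + np.sum(x ** 2 - 10 * np.cos(2 * np.pi * x))

# The global minimum is at x = 0, where f(x) = 0.
print(rastrigin_nd(np.zeros(5)))              # 0.0
print(rastrigin_nd(np.array([0.05, -0.05])))  # small positive value near the optimum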
The constructor initializes the Rastrigin OptimizationProblem by defining the DesignSpace and the objective function.
Attributes:
differentiation_method: The differentiation method.
dimension: The dimension of the design space.
is_mono_objective: Whether the optimization problem is mono-objective.
objective: The objective function.
parallel_differentiation: Whether to approximate the derivatives in parallel.
parallel_differentiation_options: The options to approximate the derivatives in parallel.
Methods:
add_callback(callback_func[, each_new_iter, ...]): Add a callback function after each store operation or new iteration.
add_constraint(cstr_func[, value, ...]): Add a constraint (equality and inequality) to the optimization problem.
add_eq_constraint(cstr_func[, value]): Add an equality constraint to the optimization problem.
add_ineq_constraint(cstr_func[, value, positive]): Add an inequality constraint to the optimization problem.
add_observable(obs_func[, new_iter]): Add a function to be observed.
aggregate_constraint(constr_id[, method, groups]): Aggregate a constraint to generate a reduced-dimension constraint.
change_objective_sign(): Change the objective function sign in order to minimize its opposite.
check(): Check if the optimization problem is ready for run.
check_format(input_function): Check that a function is an instance of MDOFunction.
clear_listeners(): Clear all the listeners.
evaluate_functions([x_vect, eval_jac, ...]): Compute the objective and the constraints.
execute_observables_callback(last_x): The callback function to be passed to the database.
export_hdf(file_path[, append]): Export the optimization problem to an HDF file.
export_to_dataset([name, by_group, ...]): Export the database of the optimization problem to a Dataset.
get_active_ineq_constraints(x_vect[, tol]): For each constraint, indicate if its different components are active.
get_all_functions(): Retrieve all the functions of the optimization problem.
get_all_functions_names(): Retrieve the names of all the functions of the optimization problem.
get_best_infeasible_point(): Retrieve the best infeasible point within a given tolerance.
get_constraints_names(): Retrieve the names of the constraints.
get_constraints_number(): Retrieve the number of constraints.
get_data_by_names(names[, as_dict, ...]): Return the data for specific names of variables.
get_design_variable_names(): Retrieve the names of the design variables.
get_dimension(): Retrieve the total number of design variables.
get_eq_constraints(): Retrieve all the equality constraints.
get_eq_constraints_number(): Retrieve the number of equality constraints.
get_eq_cstr_total_dim(): Retrieve the total dimension of the equality constraints.
get_feasible_points(): Retrieve the feasible points within a given tolerance.
get_functions_dimensions(): Return the dimensions of the outputs of the problem functions.
get_ineq_constraints(): Retrieve all the inequality constraints.
get_ineq_constraints_number(): Retrieve the number of inequality constraints.
get_ineq_cstr_total_dim(): Retrieve the total dimension of the inequality constraints.
get_nonproc_constraints(): Retrieve the non-processed constraints.
get_nonproc_objective(): Retrieve the non-processed objective function.
get_number_of_unsatisfied_constraints(design_variables): Return the number of scalar constraints not satisfied by design variables.
get_objective_name(): Retrieve the name of the objective function.
get_observable(name): Retrieve an observable from its name.
get_optimum(): Return the optimum solution within given feasibility tolerances.
get_scalar_constraints_names(): Return the names of the scalar constraints.
get_solution(): Return the theoretical optimal value of the Rastrigin function.
get_violation_criteria(x_vect): Compute a violation measure associated to an iteration.
get_x0_normalized(): Return the current values of the design variables after normalization.
has_constraints(): Check if the problem has equality or inequality constraints.
has_eq_constraints(): Check if the problem has equality constraints.
has_ineq_constraints(): Check if the problem has inequality constraints.
has_nonlinear_constraints(): Check if the problem has non-linear constraints.
import_hdf(file_path[, x_tolerance]): Import an optimization history from an HDF file.
is_max_iter_reached(): Check if the maximum number of iterations has been reached.
is_point_feasible(out_val[, constraints]): Check if a point is feasible.
preprocess_functions([normalize, ...]): Pre-process all the functions and possibly the gradients.
rastrigin(x_dv): Compute the order n=2 Rastrigin function.
rastrigin_jac(x_dv): Compute the analytical gradient of the second-order Rastrigin function.
repr_constraint(func, ctype[, value, positive]): Express a constraint as a string expression.
- AVAILABLE_PB_TYPES = ['linear', 'non-linear']¶
- COMPLEX_STEP = 'complex_step'¶
- CONSTRAINTS_GROUP = 'constraints'¶
- DESIGN_SPACE_ATTRS = ['u_bounds', 'l_bounds', 'x_0', 'x_names', 'dimension']¶
- DESIGN_SPACE_GROUP = 'design_space'¶
- DESIGN_VAR_NAMES = 'x_names'¶
- DESIGN_VAR_SIZE = 'x_size'¶
- DIFFERENTIATION_METHODS = ['user', 'complex_step', 'finite_differences', 'no_derivatives']¶
- FINITE_DIFFERENCES = 'finite_differences'¶
- FUNCTIONS_ATTRS = ['objective', 'constraints']¶
- GGOBI_FORMAT = 'ggobi'¶
- HDF5_FORMAT = 'hdf5'¶
- LINEAR_PB = 'linear'¶
- NON_LINEAR_PB = 'non-linear'¶
- NO_DERIVATIVES = 'no_derivatives'¶
- OBJECTIVE_GROUP = 'objective'¶
- OPTIM_DESCRIPTION = ['minimize_objective', 'fd_step', 'differentiation_method', 'pb_type', 'ineq_tolerance', 'eq_tolerance']¶
- OPT_DESCR_GROUP = 'opt_description'¶
- SOLUTION_GROUP = 'solution'¶
- USER_GRAD = 'user'¶
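Before the detailed method documentation, here is a minimal usage sketch based only on the constructor and the static helpers documented on this page (rastrigin(), rastrigin_jac() and get_solution()); it assumes the constructor takes no arguments and that get_solution() returns the pair (optimal design variables, optimal objective value), and the printed values are indicative.

import numpy as np
from gemseo.problems.analytical.rastrigin import Rastrigin

# Build the problem: design space [-0.1, 0.1]^2 and the Rastrigin objective.
problem = Rastrigin()

# Evaluate the 2-dimensional objective and its analytical gradient at a point.
x = np.array([0.01, -0.02])
print(Rastrigin.rastrigin(x))      # objective value at x
print(Rastrigin.rastrigin_jac(x))  # gradient vector at x

# Theoretical optimum: design variables and objective value.
x_opt, f_opt = Rastrigin.get_solution()
print(x_opt, f_opt)  # expected: [0. 0.] and 0.0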
- add_callback(callback_func, each_new_iter=True, each_store=False)¶
Add a callback function after each store operation or new iteration.
- Parameters
callback_func (Callable) – A function to be called after some event.
each_new_iter (bool) –
If True, then the callback is called at every new iteration.
By default it is set to True.
each_store (bool) –
If True, then the callback is called at every call to Database.store.
By default it is set to False.
- Return type
None
- add_constraint(cstr_func, value=None, cstr_type=None, positive=False)¶
Add a constraint (equality and inequality) to the optimization problem.
- Parameters
cstr_func (MDOFunction) – The constraint.
value (Optional[float]) –
The value for which the constraint is active. If None, this value is 0.
By default it is set to None.
cstr_type (Optional[str]) –
The type of the constraint. Either equality or inequality.
By default it is set to None.
positive (bool) –
If True, then the inequality constraint is positive.
By default it is set to False.
- Raises
TypeError – When the constraint of a linear optimization problem is not an MDOLinearFunction.
ValueError – When the type of the constraint is missing.
- Return type
None
- add_eq_constraint(cstr_func, value=None)¶
Add an equality constraint to the optimization problem.
- Parameters
cstr_func (gemseo.core.mdofunctions.mdo_function.MDOFunction) – The constraint.
value (Optional[float]) –
The value for which the constraint is active. If None, this value is 0.
By default it is set to None.
- Return type
None
- add_ineq_constraint(cstr_func, value=None, positive=False)¶
Add an inequality constraint to the optimization problem.
- Parameters
cstr_func (MDOFunction) – The constraint.
value (Optional[float]) –
The value for which the constraint is active. If None, this value is 0.
By default it is set to None.
positive (bool) –
If True, then the inequality constraint is positive.
By default it is set to False.
- Return type
None
- add_observable(obs_func, new_iter=True)¶
Add a function to be observed.
- Parameters
obs_func (gemseo.core.mdofunctions.mdo_function.MDOFunction) – An observable to be observed.
new_iter (bool) –
If True, then the observable will be called at each new iterate.
By default it is set to True.
- Return type
None
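As a hedged sketch of how an observable might be attached to the Rastrigin problem, assuming the MDOFunction constructor accepts a callable and a name (the exact signature may differ between gemseo versions):

import numpy as np
from gemseo.core.mdofunctions.mdo_function import MDOFunction
from gemseo.problems.analytical.rastrigin import Rastrigin

problem = Rastrigin()

# Observe the Euclidean norm of the design vector; it is recorded at each new iterate.
# Assumed MDOFunction(func, name) signature.
norm_observable = MDOFunction(lambda x: np.array([np.linalg.norm(x)]), "design_norm")
problem.add_observable(norm_observable, new_iter=True)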
- aggregate_constraint(constr_id, method='max', groups=None, **options)¶
Aggregate a constraint to generate a reduced-dimension constraint.
- Parameters
constr_id (int) – The index of the constraint in self.constraints.
method (str or callable that takes a function and returns a function) –
The aggregation method, among ('max', 'KS', 'IKS').
By default it is set to max.
groups (tuple of ndarray) –
If None, a single output constraint is produced; otherwise, one output per group is produced.
By default it is set to None.
- change_objective_sign()¶
Change the objective function sign in order to minimize its opposite.
The OptimizationProblem expresses any optimization problem as a minimization problem. Then, an objective function originally expressed as a performance function to maximize must be converted into a cost function to minimize, by means of this method.
- Return type
None
- check()¶
Check if the optimization problem is ready for run.
- Raises
ValueError – If the objective function is missing.
- Return type
None
- static check_format(input_function)¶
Check that a function is an instance of MDOFunction.
- Parameters
input_function – The function to be tested.
- Raises
TypeError – If the function is not an MDOFunction.
- Return type
None
- clear_listeners()¶
Clear all the listeners.
- Return type
None
- property differentiation_method¶
The differentiation method.
- property dimension¶
The dimension of the design space.
- evaluate_functions(x_vect=None, eval_jac=False, eval_obj=True, normalize=True, no_db_no_norm=False)¶
Compute the objective and the constraints.
Some optimization libraries require the number of constraints as an input parameter which is unknown by the formulation or the scenario. Evaluating the functions at the initial point provides this mandatory information. This is also used for design of experiments to evaluate samples.
- Parameters
x_vect (Optional[numpy.ndarray]) –
The input vector at which the functions must be evaluated; if None, x_0 is used.
By default it is set to None.
eval_jac (bool) –
If True, then the Jacobian is evaluated.
By default it is set to False.
eval_obj (bool) –
If True, then the objective function is evaluated.
By default it is set to True.
normalize (bool) –
If True, then the input vector is considered normalized.
By default it is set to True.
no_db_no_norm (bool) –
If True, then do not use the pre-processed functions, so we have no database, nor normalization.
By default it is set to False.
- Returns
The functions values and/or the Jacobian values according to the passed arguments.
- Raises
ValueError – If both no_db_no_norm and normalize are True.
- Return type
Tuple[Dict[str, Union[float, numpy.ndarray]], Dict[str, numpy.ndarray]]
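A sketch of a direct call on the Rastrigin problem, passing an unnormalized point explicitly; the dictionary keys are the function names (here presumably the Rastrigin objective only, since the problem is unconstrained):

import numpy as np
from gemseo.problems.analytical.rastrigin import Rastrigin

problem = Rastrigin()

# Evaluate the objective and its Jacobian at a physical (unnormalized) point.
values, jacobians = problem.evaluate_functions(
    x_vect=np.array([0.05, -0.05]),
    eval_jac=True,
    normalize=False,
)
print(values)     # {<objective name>: objective value at x_vect}
print(jacobians)  # {<objective name>: gradient array at x_vect}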
- execute_observables_callback(last_x)¶
The callback function to be passed to the database.
Call all the observables with the last design variables values as argument.
- Parameters
last_x (numpy.ndarray) – The design variables values from the last evaluation.
- Return type
None
- export_hdf(file_path, append=False)¶
Export the optimization problem to an HDF file.
- Parameters
file_path (str) – The file to store the data.
append (bool) –
If True, then the data are appended to the file if not empty.
By default it is set to False.
- Return type
None
- export_to_dataset(name=None, by_group=True, categorize=True, opt_naming=True, export_gradients=False)¶
Export the database of the optimization problem to a Dataset.
The variables can be classified into groups, separating the design variables and functions (objective function and constraints). This classification can use either an optimization naming, with Database.DESIGN_GROUP and Database.FUNCTION_GROUP, or an input-output naming, with Database.INPUT_GROUP and Database.OUTPUT_GROUP.
- Parameters
name (Optional[str]) –
A name to be given to the dataset. If None, use the name of the database.
By default it is set to None.
by_group (bool) –
If True, then store the data by group. Otherwise, store them by variables.
By default it is set to True.
categorize (bool) –
If True, then distinguish between the different groups of variables.
By default it is set to True.
opt_naming (bool) –
If True, then use an optimization naming.
By default it is set to True.
export_gradients (bool) –
If True, then export also the gradients of the functions (objective function, constraints and observables) if the latter are available in the database of the optimization problem.
By default it is set to False.
- Returns
A dataset built from the database of the optimization problem.
- Return type
Dataset
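A sketch of exporting an optimization history, assuming the database has first been filled by a driver such as gemseo's execute_algo API (the dataset name and the algorithm below are illustrative):

from gemseo.api import execute_algo  # assumed driver API; adapt to your gemseo version
from gemseo.problems.analytical.rastrigin import Rastrigin

problem = Rastrigin()
execute_algo(problem, "L-BFGS-B", max_iter=50)  # fill the database with an optimization history

# Export the history to a Dataset, keeping the optimization naming (design/function groups).
dataset = problem.export_to_dataset(name="rastrigin_history", opt_naming=True)
print(dataset)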
- get_active_ineq_constraints(x_vect, tol=1e-06)¶
For each constraint, indicate if its different components are active.
- Parameters
x_vect (numpy.ndarray) – The vector of design variables.
tol (float) –
The tolerance for deciding whether a constraint is active.
By default it is set to 1e-06.
- Returns
For each constraint, a boolean indicator of activation of its different components.
- Return type
Dict[str, numpy.ndarray]
- get_all_functions()¶
Retrieve all the functions of the optimization problem.
These functions are the constraints, the objective function and the observables.
- Returns
All the functions of the optimization problem.
- Return type
- get_all_functions_names()¶
Retrieve the names of all the function of the optimization problem.
These functions are the constraints, the objective function and the observables.
- Returns
The names of all the functions of the optimization problem.
- Return type
List[str]
- get_best_infeasible_point()¶
Retrieve the best infeasible point within a given tolerance.
- Returns
The best infeasible point expressed as the design variables values, the objective function value, the feasibility of the point and the functions values.
- Return type
Tuple[Optional[numpy.ndarray], Optional[numpy.ndarray], bool, Dict[str, numpy.ndarray]]
- get_constraints_names()¶
Retrieve the names of the constraints.
- Returns
The names of the constraints.
- Return type
List[str]
- get_constraints_number()¶
Retrieve the number of constraints.
- Returns
The number of constraints.
- Return type
int
- get_data_by_names(names, as_dict=True, filter_non_feasible=False)¶
Return the data for specific names of variables.
- Parameters
names (Union[str, Iterable[str]]) – The names of the variables.
as_dict (bool) –
If True, return values as dictionary.
By default it is set to True.
filter_non_feasible (bool) –
If True, remove the non-feasible points from the data.
By default it is set to False.
- Returns
The data related to the variables.
- Return type
Union[numpy.ndarray, Dict[str, numpy.ndarray]]
- get_design_variable_names()¶
Retrieve the names of the design variables.
- Returns
The names of the design variables.
- Return type
List[str]
- get_dimension()¶
Retrieve the total number of design variables.
- Returns
The dimension of the design space.
- Return type
int
- get_eq_constraints()¶
Retrieve all the equality constraints.
- Returns
The equality constraints.
- Return type
- get_eq_constraints_number()¶
Retrieve the number of equality constraints.
- Returns
The number of equality constraints.
- Return type
int
- get_eq_cstr_total_dim()¶
Retrieve the total dimension of the equality constraints.
This dimension is the sum of all the outputs dimensions of all the equality constraints.
- Returns
The total dimension of the equality constraints.
- Return type
int
- get_feasible_points()¶
Retrieve the feasible points within a given tolerance.
This tolerance is defined by OptimizationProblem.eq_tolerance for equality constraints and OptimizationProblem.ineq_tolerance for inequality ones.
- Returns
The values of the design variables and objective function for the feasible points.
- Return type
Tuple[List[numpy.ndarray], List[Dict[str, Union[float, List[int]]]]]
- get_functions_dimensions()¶
Return the dimensions of the outputs of the problem functions.
- Returns
The dimensions of the outputs of the problem functions. The dictionary keys are the functions names and the values are the functions dimensions.
- Return type
Dict[str, int]
- get_ineq_constraints()¶
Retrieve all the inequality constraints.
- Returns
The inequality constraints.
- Return type
- get_ineq_constraints_number()¶
Retrieve the number of inequality constraints.
- Returns
The number of inequality constraints.
- Return type
int
- get_ineq_cstr_total_dim()¶
Retrieve the total dimension of the inequality constraints.
This dimension is the sum of all the outputs dimensions of all the inequality constraints.
- Returns
The total dimension of the inequality constraints.
- Return type
int
- get_nonproc_constraints()¶
Retrieve the non-processed constraints.
- Returns
The non-processed constraints.
- Return type
- get_nonproc_objective()¶
Retrieve the non-processed objective function.
- get_number_of_unsatisfied_constraints(design_variables)¶
Return the number of scalar constraints not satisfied by design variables.
- Parameters
design_variables (numpy.ndarray) – The design variables.
- Returns
The number of unsatisfied scalar constraints.
- Return type
int
- get_objective_name()¶
Retrieve the name of the objective function.
- Returns
The name of the objective function.
- Return type
str
- get_observable(name)¶
Retrieve an observable from its name.
- Parameters
name (str) – The name of the observable.
- Returns
The observable.
- Raises
ValueError – If the observable cannot be found.
- Return type
MDOFunction
- get_optimum()¶
Return the optimum solution within given feasibility tolerances.
- Returns
The optimum result, defined by:
the value of the objective function,
the value of the design variables,
the indicator of feasibility of the optimal solution,
the value of the constraints,
the value of the gradients of the constraints.
- Return type
Tuple[numpy.ndarray, numpy.ndarray, bool, Dict[str, numpy.ndarray], Dict[str, numpy.ndarray]]
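A sketch of retrieving the optimum after a run, assuming gemseo's execute_algo driver and the L-BFGS-B algorithm are available (a gradient-based algorithm may stop in one of the many local minima of the Rastrigin function):

from gemseo.api import execute_algo  # assumed driver API; adapt to your gemseo version
from gemseo.problems.analytical.rastrigin import Rastrigin

problem = Rastrigin()
execute_algo(problem, "L-BFGS-B", max_iter=100)

# Unpack the tuple in the documented order.
f_opt, x_opt, is_feasible, constraints, constraint_gradients = problem.get_optimum()
print(f_opt, x_opt, is_feasible)  # the constraint dictionaries are empty: the problem is unconstrained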
- get_scalar_constraints_names()¶
Return the names of the scalar constraints.
- Returns
The names of the scalar constraints.
- Return type
List[str]
- static get_solution()[source]¶
Return the theoretical optimal value of the Rastrigin function.
- Returns
The design variable values at the optimum and the objective function value at the optimum.
- Return type
numpy array
- get_violation_criteria(x_vect)¶
Compute a violation measure associated to an iteration.
For each constraint, when it is violated, add the absolute distance to zero, in L2 norm.
If the returned measure is 0, all the constraints are satisfied.
- Parameters
x_vect (numpy.ndarray) – The vector of the design variables values.
- Returns
The feasibility of the point and the violation measure.
- Return type
Tuple[bool, float]
- get_x0_normalized()¶
Return the current values of the design variables after normalization.
- Returns
The current values of the design variables normalized between 0 and 1 from their lower and upper bounds.
- Return type
numpy.ndarray
- has_constraints()¶
Check if the problem has equality or inequality constraints.
- Returns
True if the problem has equality or inequality constraints.
- has_eq_constraints()¶
Check if the problem has equality constraints.
- Returns
True if the problem has equality constraints.
- Return type
bool
- has_ineq_constraints()¶
Check if the problem has inequality constraints.
- Returns
True if the problem has inequality constraints.
- Return type
bool
- has_nonlinear_constraints()¶
Check if the problem has non-linear constraints.
- Returns
True if the problem has non-linear constraints.
- Return type
bool
- classmethod import_hdf(file_path, x_tolerance=0.0)¶
Import an optimization history from an HDF file.
- Parameters
file_path (str) – The file containing the optimization history.
x_tolerance (float) –
The tolerance on the design variables when reading the file.
By default it is set to 0.0.
- Returns
The read optimization problem.
- Return type
OptimizationProblem
- is_max_iter_reached()¶
Check if the maximum number of iterations has been reached.
- Returns
Whether the maximum number of iterations has been reached.
- Return type
bool
- property is_mono_objective¶
Whether the optimization problem is mono-objective.
- is_point_feasible(out_val, constraints=None)¶
Check if a point is feasible.
Note
If the value of a constraint is absent from this point, then this constraint will be considered satisfied.
- Parameters
out_val (Dict[str, numpy.ndarray]) – The values of the objective function, and possibly of the constraints.
constraints (Optional[Iterable[gemseo.core.mdofunctions.mdo_function.MDOFunction]]) –
The constraints whose values are to be tested. If None, then take all constraints of the problem.
By default it is set to None.
- Returns
The feasibility of the point.
- Return type
bool
- property objective¶
The objective function.
- property parallel_differentiation¶
Whether to approximate the derivatives in parallel.
- property parallel_differentiation_options¶
The options to approximate the derivatives in parallel.
- preprocess_functions(normalize=True, use_database=True, round_ints=True)¶
Pre-process all the functions and possibly the gradients.
Required to wrap the objective function and constraints with the database, and possibly to approximate the gradients by complex step or finite differences.
- Parameters
normalize (bool) –
Whether to unnormalize the input vector of the function before evaluating it.
By default it is set to True.
use_database (bool) –
If True, then the functions are wrapped in the database.
By default it is set to True.
round_ints (bool) –
If True, then round the integer variables.
By default it is set to True.
- Return type
None
- static rastrigin(x_dv)[source]¶
Compute the order n=2 Rastrigin function.
- Parameters
x_dv – The design variable vector of size 2.
- Returns
The value of the Rastrigin function at x_dv.
- static rastrigin_jac(x_dv)[source]¶
Compute the analytical gradient of the second-order Rastrigin function.
- Parameters
x_dv (numpy array) – The design variable vector.
- Returns
The analytical gradient vector of the Rastrigin function.
- Return type
numpy array
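As a quick consistency sketch, the analytical gradient can be compared against a central finite-difference approximation of rastrigin(); the step and tolerance below are indicative:

import numpy as np
from gemseo.problems.analytical.rastrigin import Rastrigin

x = np.array([0.03, -0.07])
analytic = Rastrigin.rastrigin_jac(x)

# Central finite differences on the static objective, component by component.
step = 1e-7
approx = np.array([
    (Rastrigin.rastrigin(x + step * e) - Rastrigin.rastrigin(x - step * e)) / (2 * step)
    for e in np.eye(2)
])
print(np.allclose(analytic, approx, atol=1e-4))  # expected: True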
- static repr_constraint(func, ctype, value=None, positive=False)¶
Express a constraint as a string expression.
- Parameters
func (gemseo.core.mdofunctions.mdo_function.MDOFunction) – The constraint function.
ctype (str) – The type of the constraint. Either equality or inequality.
value (Optional[float]) –
The value for which the constraint is active. If None, this value is 0.
By default it is set to None.
positive (bool) –
If True, then the inequality constraint is positive.
By default it is set to False.
- Returns
A string representation of the constraint.
- Return type
str