opt_problem module¶
Optimization problem¶
The OptimizationProblem class is used to define the optimization problem from a DesignSpace defining:
the initial guess \(x_0\),
the bounds \(l_b \leq x \leq u_b\).
The (possibly vector) objective function is defined as an MDOFunction and set using the objective attribute. If the optimization problem looks for the maximum of this objective function, OptimizationProblem.change_objective_sign() changes the sign of the objective function, because the optimization drivers always seek to minimize it.
Equality and inequality constraints are also MDOFunctions, provided to the OptimizationProblem by means of its OptimizationProblem.add_constraint() method.
The OptimizationProblem makes it possible to evaluate the different functions for a given design parameters vector (see OptimizationProblem.evaluate_functions()). Note that this evaluation step relies on an automated scaling of the functions with respect to the bounds, so that optimizers and DOE algorithms work with scaled inputs between 0 and 1 for all variables.
The OptimizationProblem also has a Database storing the calls to all functions, so that no function is called twice with the same inputs. Concerning the computation of derivatives, the OptimizationProblem automates the generation of finite-difference or complex-step wrappers around the functions when analytical gradients are not available.
Lastly, various getters and setters are available, as well as methods to export the Database to an HDF file or to a Dataset for future post-processing.
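The following is a minimal usage sketch tying these pieces together: a DesignSpace, an objective and a constraint as MDOFunctions, and a driver run through OptimizersFactory. The import path of MDOFunction and the algorithm name ("SLSQP") depend on the installed GEMSEO version and are assumptions here.

from numpy import array, ones

from gemseo.algos.design_space import DesignSpace
from gemseo.algos.opt.opt_factory import OptimizersFactory
from gemseo.algos.opt_problem import OptimizationProblem
from gemseo.core.function import MDOFunction  # import path varies across versions

# Design space: two variables in [-1, 1], starting at (0.5, 0.5).
design_space = DesignSpace()
design_space.add_variable("x", 2, l_b=-1.0, u_b=1.0, value=array([0.5, 0.5]))

# Objective f(x) = x1^2 + x2^2 and inequality constraint g(x) = 1 - x1 - x2 <= 0,
# both with analytical gradients so the default 'user' differentiation applies.
objective = MDOFunction(lambda x: (x ** 2).sum(), "f", f_type="obj", jac=lambda x: 2 * x)
constraint = MDOFunction(lambda x: 1.0 - x.sum(), "g", f_type="ineq", jac=lambda x: -ones(2))

problem = OptimizationProblem(design_space)
problem.objective = objective
problem.add_constraint(constraint, cstr_type="ineq")

# Solve with a gradient-based optimizer provided by the factory.
OptimizersFactory().execute(problem, "SLSQP", max_iter=50)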
-
class gemseo.algos.opt_problem.OptimizationProblem(design_space, pb_type='non-linear', input_database=None, differentiation_method='user', fd_step=1e-07)[source]¶
Bases: object
An optimization problem class storing:
a (possibly vector) objective function,
equality and inequality constraints,
an initial guess x_0,
bounds l_bounds <= x <= u_bounds,
the database of calls to all functions, so that no function is called twice with the same inputs.
It also provides an automated scaling of the functions with respect to the bounds, so that optimizers and DOE algorithms work with scaled inputs between 0 and 1 for all variables.
It automates the generation of finite-difference or complex-step wrappers around the functions when analytical gradients are not available.
Constructor for the optimization problem.
- Parameters
design_space – the DesignSpace on which the problem is defined (design variables, bounds and initial guess)
pb_type – the type of optimization problem, among OptimizationProblem.AVAILABLE_PB_TYPES
input_database – an optional file from which to load an existing database
differentiation_method – the default differentiation method for the functions of the optimization problem
fd_step – the step used for finite-difference or complex-step differentiation
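As a small sketch of these options, the problem below relies on finite differences instead of user-provided gradients (design_space is assumed to be a DesignSpace built beforehand):

problem = OptimizationProblem(
    design_space,                                 # a DesignSpace built beforehand
    pb_type="non-linear",                         # one of AVAILABLE_PB_TYPES
    differentiation_method="finite_differences",  # one of DIFFERENTIATION_METHODS
    fd_step=1e-6,
)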
-
AVAILABLE_PB_TYPES
= ['linear', 'non-linear']¶
-
COMPLEX_STEP
= 'complex_step'¶
-
CONSTRAINTS_GROUP
= 'constraints'¶
-
DESIGN_SPACE_ATTRS
= ['u_bounds', 'l_bounds', 'x_0', 'x_names', 'dimension']¶
-
DESIGN_SPACE_GROUP
= 'design_space'¶
-
DESIGN_VAR_NAMES
= 'x_names'¶
-
DESIGN_VAR_SIZE
= 'x_size'¶
-
DIFFERENTIATION_METHODS
= ['user', 'complex_step', 'finite_differences', 'no_derivatives']¶
-
FINITE_DIFFERENCES
= 'finite_differences'¶
-
FUNCTIONS_ATTRS
= ['objective', 'constraints']¶
-
GGOBI_FORMAT
= 'ggobi'¶
-
HDF5_FORMAT
= 'hdf5'¶
-
LINEAR_PB
= 'linear'¶
-
NON_LINEAR_PB
= 'non-linear'¶
-
NO_DERIVATIVES
= 'no_derivatives'¶
-
OBJECTIVE_GROUP
= 'objective'¶
-
OPTIM_DESCRIPTION
= ['minimize_objective', 'fd_step', 'differentiation_method', 'pb_type', 'ineq_tolerance', 'eq_tolerance']¶
-
OPT_DESCR_GROUP
= 'opt_description'¶
-
SOLUTION_GROUP
= 'solution'¶
-
USER_GRAD
= 'user'¶
-
add_callback
(callback_func, each_new_iter=True, each_store=False)[source]¶ Adds a callback function after each store operation or new iteration
- Parameters
callback_func – the function to call at each new iteration or store operation; if None, nothing is called
each_new_iter – if True, callback at every iteration
each_store – if True, callback at every call to store() in the database
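A possible use, assuming the callback receives the design vector of the new iteration (the log_iteration helper is hypothetical):

def log_iteration(x_vect):
    # Called at each new iteration stored in the database (assumed signature).
    print("New iteration at:", x_vect)

problem.add_callback(log_iteration, each_new_iter=True, each_store=False)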
-
add_constraint
(cstr_func, value=None, cstr_type=None, positive=False)[source]¶ Add constraints (equality and inequality) from MDOFunction
- Parameters
cstr_func – constraints as an MDOFunction
value – the target value of the constraint; by default the constraint is cstr(x) <= 0 (or cstr(x) = 0), otherwise cstr(x) <= value (or cstr(x) = value)
cstr_type – constraint type (equality or inequality) (Default value = None)
positive – positive/negative inequality constraint (Default value = False)
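For illustration, assuming g1, g2 and h are MDOFunctions defined elsewhere:

problem.add_constraint(g1, cstr_type="ineq")                 # g1(x) <= 0
problem.add_constraint(g2, cstr_type="ineq", positive=True)  # g2(x) >= 0
problem.add_constraint(h, cstr_type="eq", value=1.0)         # h(x) = 1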
-
add_eq_constraint
(cstr_func, value=None)[source]¶ Add equality constraints to the optimization problem
- Parameters
cstr_func – the equality constraint as an MDOFunction
value – the target value of the constraint; by default cstr(x) = 0, otherwise cstr(x) = value
-
add_ineq_constraint
(cstr_func, value=None, positive=False)[source]¶ Add inequality constraints to the optimization problem
- Parameters
cstr_func – the inequality constraint as an MDOFunction
value – the target value of the constraint; by default cstr(x) <= 0, otherwise cstr(x) <= value
positive – if True, the constraint should be cstr(x)>= value, by default cstr(x)<= value
-
add_new_iter_listener
(listener_func)[source]¶ When a new iteration is stored to the database, calls the listener function
- Parameters
listener_func – the function to be called
args – optional arguments of the function
-
add_observable
(obs_func, new_iter=True)[source]¶ Adds an observable as an MDOFunction.
- Parameters
obs_func (MDOFunction) – observable as an MDOFunction
new_iter (bool) – if True, the observable will be called at each new iterate
-
add_store_listener
(listener_func)[source]¶ When an item is stored to the database, calls the listener function
- Parameters
listener_func – the function to be called
args – optional arguments of the function
-
change_objective_sign
()[source]¶ Changes the sign of the objective function, for instance when it has to be maximized rather than minimized
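For instance, to maximize a hypothetical efficiency MDOFunction:

problem.objective = efficiency   # MDOFunction to be maximized (assumed defined)
problem.change_objective_sign()  # drivers minimize, so the sign is flipped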
-
static
check_format
(input_function)[source]¶ Checks that input_function is an instance of MDOFunction
- Parameters
input_function – the function to be checked
-
property
dimension
¶ The dimension of the design space
-
evaluate_functions
(x_vect=None, eval_jac=False, eval_obj=True, normalize=True, no_db_no_norm=False)[source]¶ Computes the objective and the constraints at x_vect. Some libraries require the number of constraints as an input parameter, which is unknown to the formulation/scenario; evaluating the initial point makes it possible to get this mandatory information. This method is also used by DOE algorithms to evaluate samples.
- Parameters
x_vect – the vector at which the functions must be evaluated; if None, x_0 is used (Default value = None)
eval_jac – if True, the jacobian is also evaluated (Default value = False)
eval_obj – if True, the objective is evaluated (Default value = True)
no_db_no_norm – if True, do not use the preprocessed functions, hence neither the database nor the normalization
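A possible call, assuming the method returns the function values and, when eval_jac is True, their Jacobians:

# Evaluate objective, constraints and their Jacobians at the initial point.
values, jacobians = problem.evaluate_functions(eval_jac=True, normalize=False)
print(values)     # dictionary of function values, keyed by function name
print(jacobians)  # dictionary of Jacobians, keyed by function name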
-
export_hdf
(file_path, append=False)[source]¶ Export optimization problem to hdf file.
- Parameters
file_path – file to store the data
append – if True, data is appended to the file if not empty (Default value = False)
-
export_to_dataset
(name, by_group=True, categorize=True, opt_naming=True)[source]¶ Export the optimization problem to a Dataset.
- Parameters
name (str) – dataset name.
by_group (bool) – if True, store the data by group; otherwise, store them by variable. Default: True.
categorize (bool) – distinguish between the different groups of variables. Default: True.
opt_naming (bool) – use an optimization naming. Default: True.
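A possible export workflow combining export_hdf() and export_to_dataset() (file and dataset names are arbitrary):

# Persist the problem and its database, then convert the history to a Dataset.
problem.export_hdf("problem_history.h5", append=False)
dataset = problem.export_to_dataset("history", opt_naming=True)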
-
get_active_ineq_constraints
(x_vect, tol=1e-06)[source]¶ Returns the names and indices of the active inequality constraints
- Parameters
x_vect – vector of x values, not normalized
tol – tolerance below which a constraint is considered active
-
get_all_functions
()[source]¶ Returns the list of all functions of the optimization problem: the constraints and the objective
-
get_all_functions_names
()[source]¶ Get all constraints and objective names
- Returns
the list of the names of all functions of the optimization problem: constraints and objective
-
get_constraints_names
()[source]¶ Get all constraints names as a list
- Returns
the list of constraints names
-
get_constraints_number
()[source]¶ Computes the total number of constraints
- Returns
the number of constraints
-
get_dimension
()[source]¶ Get the total number of design variables
- Returns
the dimension of the design space
-
get_eq_constraints
()[source]¶ Accessor for all equality constraints
- Returns
a list of equality constraints
-
get_eq_constraints_number
()[source]¶ Computes the number of equality constraints
- Returns
the number of equality constraints
-
get_eq_cstr_total_dim
()[source]¶ Returns the total dimension of the equality constraints, that is, the sum of the output dimensions of all equality constraints
- Returns
the total dimension of the equality constraints
-
get_feasible_points
()[source]¶ Returns the list of feasible points within a given tolerance; eq_tolerance and ineq_tolerance are taken from the problem attributes
-
get_ineq_constraints
()[source]¶ Accessor for all inequality constraints
- Returns
a list of inequality constraints
-
get_ineq_constraints_number
()[source]¶ Computes the number of inequality constraints
- Returns
the number of inequality constraints
-
get_ineq_cstr_total_dim
()[source]¶ Returns the total dimension of the inequality constraints, that is, the sum of the output dimensions of all inequality constraints
- Returns
the total dimension of the inequality constraints
-
get_objective_name
()[source]¶ Get objective function name
- Returns
the name of the actual objective function
-
get_observable
(name)[source]¶ Returns the required observable.
- Parameters
name (str) – name of the observable
-
get_optimum
()[source]¶ Returns the optimum solution within the given feasibility tolerances
- Returns
a tuple with the best evaluation iteration and the corresponding solution
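After a driver has been executed on the problem, the optimum can be retrieved as sketched below; the exact content and ordering of the returned tuple should be checked against the installed version:

OptimizersFactory().execute(problem, "SLSQP", max_iter=50)
optimum = problem.get_optimum()  # best evaluation and corresponding solution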
-
get_violation_criteria
(x_vect)[source]¶ Computes a violation measure associated with an iteration: for each violated constraint, the absolute distance to zero is added, in L2 norm. If the measure is 0, all constraints are satisfied.
- Parameters
x_vect – vector of design variables
- Returns
True if feasible, and the violation criteria
-
has_constraints
()[source]¶ Checks if the problem has equality or inequality constraints
- Returns
True if the problem has constraints
-
has_eq_constraints
()[source]¶ Checks if the problem has equality constraints
- Returns
True if the problem has equality constraints
-
has_ineq_constraints
()[source]¶ Checks if the problem has inequality constraints
- Returns
True if the problem has inequality constraints
-
has_nonlinear_constraints
()[source]¶ Checks if the problem has constraints
- Returns
True if the problem has equality or inequality constraints
-
static
import_hdf
(file_path, x_tolerance=0.0)[source]¶ Imports optimization history from hdf file
- Parameters
file_path – file to deserialize
- Returns
the read optimization problem
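A round trip with export_hdf(), for post-processing a previously solved problem (file and dataset names are arbitrary):

# Reload an optimization history serialized earlier with export_hdf().
problem = OptimizationProblem.import_hdf("problem_history.h5")
dataset = problem.export_to_dataset("reloaded_history")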
-
is_point_feasible
(out_val, constraints=None)[source]¶ Returns True if the point is feasible
- Parameters
out_val – a dict of values, containing the objective function and possibly the constraints. Warning: if a constraint value is not present, the constraint is considered satisfied
constraints – the list of constraints (MDOFunctions) to check. If None, takes all constraints of the problem
-
log_me
(max_ds_size=40)[source]¶ Logs a representation of the optimization problem characteristics (the self.__repr__ message)
- Parameters
max_ds_size – maximum design space dimension to print
-
preprocess_functions
(normalize=True, use_database=True, round_ints=True)[source]¶ Preprocesses all the functions (objective and constraints) to wrap them with the database and, if needed, with gradient approximation by complex step or finite differences.
- Parameters
normalize (bool) – if True, the functions are normalized
use_database (bool) – if True, the functions are wrapped in the database
round_ints (bool) – if True, rounds integer variables
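A hedged sketch of calling this preprocessing step manually before evaluating the functions outside of a driver:

# Wrap the raw objective and constraints with the database and, if the
# differentiation method requires it, with approximated gradients.
problem.preprocess_functions(normalize=True, use_database=True, round_ints=True)
values, jacobians = problem.evaluate_functions(normalize=True)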