gemseo.algos.opt.scipy_local.settings.lbfgsb module
Settings for the SciPy L-BFGS-B algorithm.
- Settings L_BFGS_B_Settings(*, kkt_tol_abs=inf, kkt_tol_rel=inf, enable_progress_bar=None, eq_tolerance=1e-06, ineq_tolerance=0.0001, log_problem=True, max_time=0.0, normalize_design_space=True, reset_iteration_counters=True, round_ints=True, use_database=True, use_one_line_progress_bar=False, store_jacobian=True, ftol_rel=1e-09, ftol_abs=1e-09, max_iter=1000, scaling_threshold=None, stop_crit_n_x=3, xtol_rel=1e-09, xtol_abs=1e-09, disp=False, maxcor=20, gtol=1e-06, iprint=-1, maxls=20)
Bases: BaseScipyLocalSettings, BaseGradientBasedAlgorithmSettings
Settings for the SciPy L-BFGS-B algorithm.
Create a new model by parsing and validating input data from keyword arguments.
Raises pydantic_core.ValidationError if the input data cannot be validated to form a valid model.
self is explicitly positional-only to allow self as a field name.
- Parameters:
kkt_tol_abs (Annotated[float, Ge(ge=0)]) --
By default it is set to inf.
kkt_tol_rel (Annotated[float, Ge(ge=0)]) --
By default it is set to inf.
enable_progress_bar (bool | None)
eq_tolerance (Annotated[float, Ge(ge=0)]) --
By default it is set to 1e-06.
ineq_tolerance (Annotated[float, Ge(ge=0)]) --
By default it is set to 0.0001.
log_problem (bool) --
By default it is set to True.
max_time (Annotated[float, Ge(ge=0)]) --
By default it is set to 0.0.
normalize_design_space (bool) --
By default it is set to True.
reset_iteration_counters (bool) --
By default it is set to True.
round_ints (bool) --
By default it is set to True.
use_database (bool) --
By default it is set to True.
use_one_line_progress_bar (bool) --
By default it is set to False.
store_jacobian (bool) --
By default it is set to True.
ftol_rel (Annotated[float, Ge(ge=0)]) --
By default it is set to 1e-09.
ftol_abs (Annotated[float, Ge(ge=0)]) --
By default it is set to 1e-09.
max_iter (Annotated[int, Gt(gt=0)]) --
By default it is set to 1000.
scaling_threshold (float | None) --
By default it is set to None.
stop_crit_n_x (Annotated[int, Ge(ge=2)]) --
By default it is set to 3.
xtol_rel (Annotated[float, Ge(ge=0)]) --
By default it is set to 1e-09.
xtol_abs (Annotated[float, Ge(ge=0)]) --
By default it is set to 1e-09.
disp (bool) --
By default it is set to False.
maxcor (Annotated[int, Gt(gt=0)]) --
By default it is set to 20.
gtol (Annotated[float, Ge(ge=0)]) --
By default it is set to 1e-06.
iprint (int) --
By default it is set to -1.
maxls (Annotated[int, Gt(gt=0)]) --
By default it is set to 20.
- Return type:
None
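A minimal usage sketch (assuming GEMSEO is installed; the import path follows this module and the override values are illustrative):

    from gemseo.algos.opt.scipy_local.settings.lbfgsb import L_BFGS_B_Settings

    # Tighten the projected-gradient tolerance and cap the iteration budget;
    # every other field keeps the default value listed above.
    settings = L_BFGS_B_Settings(max_iter=500, gtol=1e-08, maxcor=10)

    print(settings.gtol)                      # 1e-08
    print(settings.model_dump()["max_iter"])  # 500

The validated model gathers the options for the SciPy L-BFGS-B wrapper; it would typically be handed to the GEMSEO optimizer entry point (for example a scenario or driver execution call), whose exact signature depends on your GEMSEO version.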
- gtol: NonNegativeFloat = 1e-06
The precision goal for the projected gradient: the algorithm stops when the largest component of the projected gradient is at most this value.
- Constraints:
ge = 0
- maxcor: PositiveInt = 20
The maximum number of corrections for the limited memory matrix.
- Constraints:
gt = 0
- maxls: PositiveInt = 20
The maximum number of line search steps per iteration.
- Constraints:
gt = 0
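Because these fields carry pydantic constraints (ge=0, gt=0), out-of-range values are rejected at construction time rather than being forwarded to SciPy. A small sketch of that behaviour, assuming only the constraints documented above:

    from pydantic import ValidationError

    from gemseo.algos.opt.scipy_local.settings.lbfgsb import L_BFGS_B_Settings

    try:
        # maxcor is declared PositiveInt (gt=0), so 0 is rejected.
        L_BFGS_B_Settings(maxcor=0)
    except ValidationError as error:
        print(error)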
- model_post_init(context, /)
Initialize private attributes and call the user-defined model_post_init method.
- Parameters:
self (BaseModel)
context (Any)
- Return type:
None