gemseo.mlearning.regression.algos.gpr_settings module#
Settings of the Gaussian process regressor from scikit-learn.
- Settings GaussianProcessRegressor_Settings(*, transformer=None, parameters=None, input_names=(), output_names=(), kernel=None, bounds=(), alpha=1e-10, optimizer='fmin_l_bfgs_b', n_restarts_optimizer=10, random_state=0)[source]#
Bases:
BaseRegressorSettings
The settings of the Gaussian process regressor from scikit-learn.
Create a new model by parsing and validating input data from keyword arguments.
Raises pydantic_core.ValidationError if the input data cannot be validated to form a valid model.
self is explicitly positional-only to allow self as a field name.
- Parameters:
input_names (Sequence[str]) --
By default it is set to ().
output_names (Sequence[str]) --
By default it is set to ().
kernel (Annotated[Kernel, WithJsonSchema(json_schema={}, mode=None)] | None) --
By default it is set to None.
bounds (tuple | tuple[float, float] | Mapping[str, tuple[float, float]]) --
By default it is set to ().
alpha (float | _NDArrayPydantic[Any, dtype[_ScalarType_co]]) --
By default it is set to 1e-10.
optimizer (str | Callable[[], Callable]) --
By default it is set to "fmin_l_bfgs_b".
n_restarts_optimizer (Annotated[int, Ge(ge=0)]) --
By default it is set to 10.
random_state (Annotated[int, Ge(ge=0)] | None) --
By default it is set to 0.
- Return type:
None
- bounds: tuple | tuple[float, float] | Mapping[str, tuple[float, float]] = ()#
The lower and upper bounds of the length scales.
Either a unique lower-upper pair common to all the inputs or lower-upper pairs for some of them. When bounds is empty or when an input has no pair, the lower bound is 0.01 and the upper bound is 100. This argument is ignored when kernel is not None.
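The resolution rule above can be sketched as a small helper. This is a hypothetical illustration, not GEMSEO code; only the documented fallback pair (0.01, 100) is taken from the source:

```python
from collections.abc import Mapping

# Documented fallback used when bounds is empty or an input has no pair.
DEFAULT_BOUNDS = (0.01, 100.0)

def resolve_length_scale_bounds(input_names, bounds=()):
    """Return one (lower, upper) length-scale bounds pair per input.

    Hypothetical sketch of the rule described above: ``bounds`` may be
    empty, a single pair common to all the inputs, or a mapping from
    input names to pairs for some of them.
    """
    if isinstance(bounds, Mapping):
        # Inputs missing from the mapping fall back to the default pair.
        return {name: bounds.get(name, DEFAULT_BOUNDS) for name in input_names}
    if bounds:
        # A single (lower, upper) pair shared by all the inputs.
        return {name: tuple(bounds) for name in input_names}
    return {name: DEFAULT_BOUNDS for name in input_names}
```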
- kernel: Annotated[Kernel, WithJsonSchema({})] | None = None#
The kernel specifying the covariance model.
If None, use a Matérn(2.5) kernel.
- n_restarts_optimizer: NonNegativeInt = 10#
The number of restarts of the optimizer.
- Constraints:
ge = 0
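The role of restarts can be illustrated with a toy multistart scheme. This is a generic sketch, not scikit-learn's fmin_l_bfgs_b machinery: the log-marginal likelihood is often multimodal, so running the local search from several random starting points and keeping the best optimum reduces the risk of a poor local optimum.

```python
import random

def coordinate_descent(f, x, step=0.1, n_iter=200):
    """Crude 1D local search: repeatedly move to a better neighbor."""
    for _ in range(n_iter):
        for candidate in (x - step, x + step):
            if f(candidate) < f(x):
                x = candidate
    return x

def multistart_minimize(f, local_search, lower, upper, n_restarts, seed=0):
    """Toy multistart: run ``local_search`` from random starts, keep the best.

    Generic sketch of why n_restarts_optimizer helps; not the actual
    scikit-learn optimization code.
    """
    rng = random.Random(seed)
    best_x, best_f = None, float("inf")
    for _ in range(n_restarts + 1):  # one initial start plus n_restarts restarts
        x = local_search(f, rng.uniform(lower, upper))
        if f(x) < best_f:
            best_x, best_f = x, f(x)
    return best_x

def f(x):
    # Bimodal objective: local minimum near x = 3, global minimum near x = -2.
    return (x - 3) ** 2 * (x + 2) ** 2 + x

best = multistart_minimize(f, coordinate_descent, -5.0, 5.0, n_restarts=10)
```

A single start may settle in the basin near x = 3; with several restarts the search reliably finds the deeper minimum near x = -2.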
- optimizer: str | Annotated[Callable, WithJsonSchema({})] = 'fmin_l_bfgs_b'#
The optimization algorithm to find the parameter length scales.
- random_state: NonNegativeInt | None = 0#
The random state parameter.
If None, use the global random state instance from numpy.random; creating the model multiple times will produce different results. If an int, use a new random number generator seeded by this integer; creating the model multiple times will produce the same results.
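These seeding semantics can be illustrated with the standard library's random module. This is an analogy only; the actual setting controls numpy.random-based generators:

```python
import random

def draw(seed):
    """Draw three pseudo-random numbers, mimicking the random_state rule.

    With an integer seed, a fresh seeded generator is created, so repeated
    calls are reproducible. With None, the global generator is used, so
    repeated calls give different values.
    """
    rng = random.Random(seed) if seed is not None else random
    return [rng.random() for _ in range(3)]
```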