gemseo.mlearning.regression.algos.gpr_settings module

Settings of the Gaussian process regressor from scikit-learn.

Settings GaussianProcessRegressor_Settings(*, transformer=None, parameters=None, input_names=(), output_names=(), kernel=None, bounds=(), alpha=1e-10, optimizer='fmin_l_bfgs_b', n_restarts_optimizer=10, random_state=0)[source]

Bases: BaseRegressorSettings

The settings of the Gaussian process regressor from scikit-learn.

Create a new model by parsing and validating input data from keyword arguments.

Raises pydantic_core.ValidationError if the input data cannot be validated to form a valid model.

self is explicitly positional-only to allow self as a field name.

Return type:

None
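
As a Pydantic model, the settings are created and validated from keyword arguments. A minimal sketch (the chosen field values are illustrative):

```python
from gemseo.mlearning.regression.algos.gpr_settings import (
    GaussianProcessRegressor_Settings,
)

# Keyword arguments are parsed and validated at construction time;
# invalid values raise pydantic_core.ValidationError.
settings = GaussianProcessRegressor_Settings(
    alpha=1e-8,
    n_restarts_optimizer=20,
    random_state=0,
)
print(settings.optimizer)  # "fmin_l_bfgs_b" (the default)
```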

alpha: float | NDArrayPydantic = 1e-10

The nugget effect to regularize the model.

bounds: tuple | tuple[float, float] | Mapping[str, tuple[float, float]] = ()

The lower and upper bounds of the length scales.

Either a single (lower, upper) pair common to all the inputs or a mapping from input names to (lower, upper) pairs for some of them. When bounds is empty or when an input has no pair, the lower bound is 0.01 and the upper bound is 100.

This argument is ignored when kernel is not None, as shown in the sketch below.
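
For instance (a sketch; the input name "x" is hypothetical):

```python
from gemseo.mlearning.regression.algos.gpr_settings import (
    GaussianProcessRegressor_Settings,
)

# A single (lower, upper) pair shared by all the inputs:
settings = GaussianProcessRegressor_Settings(bounds=(0.1, 10.0))

# Per-input pairs; inputs without a pair fall back to (0.01, 100):
settings = GaussianProcessRegressor_Settings(bounds={"x": (0.1, 10.0)})
```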

kernel: Annotated[Kernel, WithJsonSchema({})] | None = None

The kernel specifying the covariance model.

If None, use a Matérn kernel with nu=2.5.
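
Any scikit-learn kernel can be passed instead; a sketch using an RBF kernel:

```python
from sklearn.gaussian_process.kernels import RBF

from gemseo.mlearning.regression.algos.gpr_settings import (
    GaussianProcessRegressor_Settings,
)

# Replace the default Matérn(2.5) covariance with an RBF kernel;
# with a user-defined kernel, the bounds field is not used.
settings = GaussianProcessRegressor_Settings(kernel=RBF(length_scale=1.0))
```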

n_restarts_optimizer: NonNegativeInt = 10

The number of restarts of the optimizer.

Constraints:
  • ge = 0

optimizer: str | Annotated[Callable, WithJsonSchema({})] = 'fmin_l_bfgs_b'

The optimization algorithm used to find the length scales of the kernel.
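
Besides the string "fmin_l_bfgs_b", scikit-learn accepts a callable that receives the objective function, the initial length scales and their bounds, and returns the optimal length scales with the corresponding objective value. A sketch built on scipy.optimize.minimize (an assumption for illustration, not the library's own implementation):

```python
from scipy.optimize import minimize

from gemseo.mlearning.regression.algos.gpr_settings import (
    GaussianProcessRegressor_Settings,
)


def optimizer(obj_func, initial_theta, bounds):
    # obj_func(theta) returns the objective and its gradient,
    # hence jac=True; scikit-learn expects (theta_opt, func_min).
    result = minimize(
        obj_func, initial_theta, bounds=bounds, jac=True, method="L-BFGS-B"
    )
    return result.x, result.fun


settings = GaussianProcessRegressor_Settings(optimizer=optimizer)
```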

random_state: NonNegativeInt | None = 0

The random state parameter.

If None, use the global random state instance from numpy.random; creating the model multiple times will then produce different results. If an integer, use a new random number generator seeded by this integer, which makes the results reproducible across runs.
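
For instance, to make repeated trainings reproducible (the seed value is illustrative):

```python
from gemseo.mlearning.regression.algos.gpr_settings import (
    GaussianProcessRegressor_Settings,
)

# An integer seed makes the optimizer restarts reproducible;
# random_state=None draws from numpy.random's global state instead.
settings = GaussianProcessRegressor_Settings(random_state=42)
```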