gemseo.mlearning.regression.algos.mlp_settings module#
Settings of the multilayer perceptron (MLP).
- Settings MLPRegressor_Settings(*, transformer=<factory>, parameters=<factory>, input_names=(), output_names=(), hidden_layer_sizes=(100,), random_state=0)[source]#
Bases:
BaseRegressorSettings
The settings of the multilayer perceptron (MLP).
Create a new model by parsing and validating input data from keyword arguments.
Raises pydantic_core.ValidationError if the input data cannot be validated to form a valid model.
self is explicitly positional-only to allow self as a field name.
- Parameters:
transformer (Mapping[str, Any]) --
By default it is set to <factory>.
parameters (Mapping[str, Any]) --
By default it is set to <factory>.
input_names (Sequence[str]) --
By default it is set to ().
output_names (Sequence[str]) --
By default it is set to ().
hidden_layer_sizes (tuple[Annotated[int, Gt(gt=0)], ...]) --
By default it is set to (100,).
random_state (Annotated[int, Ge(ge=0)] | None) --
By default it is set to 0.
- Return type:
None
- hidden_layer_sizes: tuple[PositiveInt, ...] = (100,)#
The number of neurons per hidden layer.
- random_state: NonNegativeInt | None = 0#
The random state parameter.
If None, use the global random state instance from numpy.random. Creating the model multiple times will produce different results. If int, use a new random number generator seeded by this integer. This will produce the same results.
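A minimal usage sketch, assuming the class is importable from this module as documented above; the layer sizes and seed below are illustrative values rather than recommended defaults.

```python
from gemseo.mlearning.regression.algos.mlp_settings import MLPRegressor_Settings

# Two hidden layers of 64 and 32 neurons; a fixed integer seed makes
# repeated model creations reproducible (see random_state above).
settings = MLPRegressor_Settings(
    hidden_layer_sizes=(64, 32),
    random_state=0,
)

print(settings.hidden_layer_sizes)  # (64, 32)
print(settings.random_state)        # 0
```

Because the settings are a Pydantic model, invalid values (for example a non-positive layer size or a negative random_state) raise a pydantic_core.ValidationError at construction time.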