.. DO NOT EDIT.
.. THIS FILE WAS AUTOMATICALLY GENERATED BY SPHINX-GALLERY.
.. TO MAKE CHANGES, EDIT THE SOURCE PYTHON FILE:
.. "examples/mlearning/regression_model/plot_moe.py"
.. LINE NUMBERS ARE GIVEN BELOW.

.. only:: html

    .. note::
        :class: sphx-glr-download-link-note

        :ref:`Go to the end <sphx_glr_download_examples_mlearning_regression_model_plot_moe.py>`
        to download the full example code.

.. rst-class:: sphx-glr-example-title

.. _sphx_glr_examples_mlearning_regression_model_plot_moe.py:


Mixture of experts
==================

In this demo, we load a dataset (the Rosenbrock function in 2D) and apply
a mixture of experts regression model to approximate it.

.. GENERATED FROM PYTHON SOURCE LINES 28-47

.. code-block:: Python

    from __future__ import annotations

    import matplotlib.pyplot as plt
    from numpy import array
    from numpy import hstack
    from numpy import linspace
    from numpy import meshgrid
    from numpy import nonzero
    from numpy import sqrt
    from numpy import zeros

    from gemseo import configure_logger
    from gemseo import create_benchmark_dataset
    from gemseo.mlearning import create_regression_model

    configure_logger()

.. GENERATED FROM PYTHON SOURCE LINES 48-52

Dataset (Rosenbrock)
--------------------

Here we consider the Rosenbrock function with two inputs, on the domain
:math:`[-2, 2] \times [-2, 2]`.

.. GENERATED FROM PYTHON SOURCE LINES 54-59

Load dataset
~~~~~~~~~~~~

A prebuilt dataset for the Rosenbrock function with two inputs is available,
based on a full-factorial DOE of the input space with 100 points.

.. GENERATED FROM PYTHON SOURCE LINES 59-61

.. code-block:: Python

    dataset = create_benchmark_dataset("RosenbrockDataset", opt_naming=False)

.. GENERATED FROM PYTHON SOURCE LINES 62-66

Print information
~~~~~~~~~~~~~~~~~

Information about the dataset can easily be displayed by printing the
dataset directly.

.. GENERATED FROM PYTHON SOURCE LINES 66-68

.. code-block:: Python

    dataset

.. rst-class:: sphx-glr-script-out

.. code-block:: none
    GROUP         inputs       outputs
    VARIABLE           x         rosen
    COMPONENT          0    1         0
    0          -2.000000 -2.0  3609.000000
    1          -1.555556 -2.0  1959.952599
    2          -1.111111 -2.0  1050.699741
    3          -0.666667 -2.0   600.308642
    4          -0.222222 -2.0   421.490779
    ...              ...  ...          ...
    95          0.222222  2.0   381.095717
    96          0.666667  2.0   242.086420
    97          1.111111  2.0    58.600975
    98          1.555556  2.0    17.927907
    99          2.000000  2.0   401.000000

    [100 rows x 3 columns]
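The ``rosen`` values follow the classical two-dimensional Rosenbrock formula
:math:`f(x_1, x_2) = (1 - x_1)^2 + 100 (x_2 - x_1^2)^2`, evaluated on a
:math:`10 \times 10` full-factorial grid. As a quick sanity check (a
standalone sketch, not part of the generated example; the variable names are
illustrative), both the grid and the corner values of the table can be
reproduced with NumPy alone:

.. code-block:: Python

    from numpy import allclose
    from numpy import hstack
    from numpy import linspace
    from numpy import meshgrid


    def rosenbrock(x_1, x_2):
        """Two-dimensional Rosenbrock function."""
        return (1 - x_1) ** 2 + 100 * (x_2 - x_1**2) ** 2


    # 10 levels per input give the 100-point full-factorial grid;
    # the first input varies fastest, as in the table above.
    x_1, x_2 = meshgrid(linspace(-2, 2, 10), linspace(-2, 2, 10))
    grid = hstack([x_1.flatten()[:, None], x_2.flatten()[:, None]])  # (100, 2)

    values = rosenbrock(grid[:, 0], grid[:, 1])
    assert allclose(values[0], 3609.0)  # row 0 of the table above
    assert allclose(values[-1], 401.0)  # row 99 of the table above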
.. GENERATED FROM PYTHON SOURCE LINES 69-72

Show dataset
~~~~~~~~~~~~

The dataset object can also present the data in tabular form.

.. GENERATED FROM PYTHON SOURCE LINES 72-74

.. code-block:: Python

    dataset

.. rst-class:: sphx-glr-script-out

.. code-block:: none
    GROUP         inputs       outputs
    VARIABLE           x         rosen
    COMPONENT          0    1         0
    0          -2.000000 -2.0  3609.000000
    1          -1.555556 -2.0  1959.952599
    2          -1.111111 -2.0  1050.699741
    3          -0.666667 -2.0   600.308642
    4          -0.222222 -2.0   421.490779
    ...              ...  ...          ...
    95          0.222222  2.0   381.095717
    96          0.666667  2.0   242.086420
    97          1.111111  2.0    58.600975
    98          1.555556  2.0    17.927907
    99          2.000000  2.0   401.000000

    [100 rows x 3 columns]
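Beyond this tabular view, the underlying values can be retrieved as plain
NumPy arrays; the plotting code further below relies on exactly these
accessors (the shapes in the comments are inferred from the 100-sample,
two-input, one-output dataset shown above):

.. code-block:: Python

    inputs = dataset.input_dataset.to_numpy()  # shape (100, 2)
    outputs = dataset.output_dataset.to_numpy()  # shape (100, 1)
    n_samples = dataset.n_samples  # 100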
.. GENERATED FROM PYTHON SOURCE LINES 75-79

Mixture of experts (MoE)
------------------------

In this section, we build a mixture of experts regression model through the
machine learning API, combining clustering, classification and regression
models.

.. GENERATED FROM PYTHON SOURCE LINES 81-85

Mixture of experts model
~~~~~~~~~~~~~~~~~~~~~~~~

We construct the MoE model using predefined parameters and fit it to the
dataset through the :meth:`~.MOERegressor.learn` method.

.. GENERATED FROM PYTHON SOURCE LINES 85-91

.. code-block:: Python

    model = create_regression_model("MOERegressor", dataset)
    model.set_clusterer("KMeans", n_clusters=3)
    model.set_classifier("KNNClassifier", n_neighbors=5)
    model.set_regressor("GaussianProcessRegressor")

    model.learn()

.. GENERATED FROM PYTHON SOURCE LINES 92-98

Tests
~~~~~

Here, we test the mixture of experts model at two points: (1, 1), the global
minimum, where the function is zero, and (-2, -2), the point where the
function reaches its maximum over the domain. The two points are expected to
belong to different classes.

.. GENERATED FROM PYTHON SOURCE LINES 98-110

.. code-block:: Python

    input_value = {"x": array([1, 1])}
    another_input_value = {"x": array([[1, 1], [-2, -2]])}

    for value in [input_value, another_input_value]:
        print("Input value:", value)
        print("Class:", model.predict_class(value))
        print("Prediction:", model.predict(value))
        print("Local model predictions:")
        for cls in range(model.n_clusters):
            print(f"Local model {cls}: {model.predict_local_model(value, cls)}")
        print()

.. rst-class:: sphx-glr-script-out

.. code-block:: none

    Input value: {'x': array([1, 1])}
    Class: {'labels': array([0])}
    Prediction: {'rosen': array([3.17328633])}
    Local model predictions:
    Local model 0: {'rosen': array([3.17328633])}
    Local model 1: {'rosen': array([-36.28112663])}
    Local model 2: {'rosen': array([129.41011927])}

    Input value: {'x': array([[ 1,  1],
           [-2, -2]])}
    Class: {'labels': array([[0],
           [2]])}
    Prediction: {'rosen': array([[3.17328633e+00],
           [3.60899957e+03]])}
    Local model predictions:
    Local model 0: {'rosen': array([[   3.17328633],
           [2966.75224828]])}
    Local model 1: {'rosen': array([[-36.28112663],
           [296.24577068]])}
    Local model 2: {'rosen': array([[ 129.41011927],
           [3608.99957458]])}

.. GENERATED FROM PYTHON SOURCE LINES 111-116

Plot clusters
~~~~~~~~~~~~~

Here, we plot the 10 x 10 = 100 data points of the Rosenbrock function, with
colors representing the obtained clusters. The Rosenbrock function is
represented by a contour plot in the background.

.. GENERATED FROM PYTHON SOURCE LINES 116-139

.. code-block:: Python

    n_samples = dataset.n_samples

    # The dataset is based on a full-factorial DOE with 100 = 10^2 points.
    input_dim = int(sqrt(n_samples))
    assert input_dim**2 == n_samples  # Check that n_samples is a square number

    colors = ["b", "r", "g", "c", "y"]

    inputs = dataset.input_dataset.to_numpy()
    outputs = dataset.output_dataset.to_numpy()

    # Unique grid values of each input (the first input varies fastest).
    x = inputs[:input_dim, 0]
    y = inputs[::input_dim, 1]
    Z = zeros((input_dim, input_dim))
    for i in range(input_dim):
        Z[i, :] = outputs[input_dim * i : input_dim * (i + 1), 0]

    fig = plt.figure()
    cnt = plt.contour(x, y, Z, 50)
    fig.colorbar(cnt)
    for index in range(model.n_clusters):
        samples = nonzero(model.labels == index)[0]
        plt.scatter(inputs[samples, 0], inputs[samples, 1], color=colors[index])
    plt.scatter(1, 1, marker="x")  # Global minimum of the Rosenbrock function
    plt.show()

.. image-sg:: /examples/mlearning/regression_model/images/sphx_glr_plot_moe_001.png
   :alt: plot moe
   :srcset: /examples/mlearning/regression_model/images/sphx_glr_plot_moe_001.png
   :class: sphx-glr-single-img
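The cluster assignments used for the colors above are exposed through
``model.labels``, with one label per learning sample. A quick way to inspect
the cluster sizes is sketched below (only attributes already used in this
example are assumed; the flattening guards against a possibly 2D label
array):

.. code-block:: Python

    from collections import Counter

    from numpy import asarray

    # One cluster index per learning sample, as used in the scatter plot.
    labels = asarray(model.labels).ravel()
    print(Counter(labels.tolist()))  # number of samples per cluster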
.. GENERATED FROM PYTHON SOURCE LINES 140-143

Plot data and predictions from final model
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

We construct a refined input grid and compute the model predictions on it.

.. GENERATED FROM PYTHON SOURCE LINES 143-166

.. code-block:: Python

    refinement = 200

    fine_x = linspace(x[0], x[-1], refinement)
    fine_y = linspace(y[0], y[-1], refinement)
    fine_x, fine_y = meshgrid(fine_x, fine_y)
    fine_input = {"x": hstack([fine_x.flatten()[:, None], fine_y.flatten()[:, None]])}
    fine_z = model.predict(fine_input)

    # Reshape the flat predictions into a 2D grid for plotting.
    fine_z = fine_z["rosen"].reshape((refinement, refinement))

    plt.figure()
    plt.imshow(Z)
    plt.colorbar()
    plt.title("Original data")
    plt.show()

    plt.figure()
    plt.imshow(fine_z)
    plt.colorbar()
    plt.title("Predictions")
    plt.show()

.. rst-class:: sphx-glr-horizontal


    *

      .. image-sg:: /examples/mlearning/regression_model/images/sphx_glr_plot_moe_002.png
         :alt: Original data
         :srcset: /examples/mlearning/regression_model/images/sphx_glr_plot_moe_002.png
         :class: sphx-glr-multi-img

    *

      .. image-sg:: /examples/mlearning/regression_model/images/sphx_glr_plot_moe_003.png
         :alt: Predictions
         :srcset: /examples/mlearning/regression_model/images/sphx_glr_plot_moe_003.png
         :class: sphx-glr-multi-img

.. GENERATED FROM PYTHON SOURCE LINES 167-169

Plot local models
~~~~~~~~~~~~~~~~~

Finally, we plot the predictions of each local model over the refined
input space.

.. GENERATED FROM PYTHON SOURCE LINES 169-180

.. code-block:: Python

    for i in range(model.n_clusters):
        plt.figure()
        plt.imshow(
            model.predict_local_model(fine_input, i)["rosen"].reshape((
                refinement,
                refinement,
            ))
        )
        plt.colorbar()
        plt.title(f"Local model {i}")
        plt.show()

.. rst-class:: sphx-glr-horizontal


    *

      .. image-sg:: /examples/mlearning/regression_model/images/sphx_glr_plot_moe_004.png
         :alt: Local model 0
         :srcset: /examples/mlearning/regression_model/images/sphx_glr_plot_moe_004.png
         :class: sphx-glr-multi-img

    *

      .. image-sg:: /examples/mlearning/regression_model/images/sphx_glr_plot_moe_005.png
         :alt: Local model 1
         :srcset: /examples/mlearning/regression_model/images/sphx_glr_plot_moe_005.png
         :class: sphx-glr-multi-img

    *

      .. image-sg:: /examples/mlearning/regression_model/images/sphx_glr_plot_moe_006.png
         :alt: Local model 2
         :srcset: /examples/mlearning/regression_model/images/sphx_glr_plot_moe_006.png
         :class: sphx-glr-multi-img

.. rst-class:: sphx-glr-timing

   **Total running time of the script:** (0 minutes 4.408 seconds)


.. _sphx_glr_download_examples_mlearning_regression_model_plot_moe.py:

.. only:: html

  .. container:: sphx-glr-footer sphx-glr-footer-example

    .. container:: sphx-glr-download sphx-glr-download-jupyter

      :download:`Download Jupyter notebook: plot_moe.ipynb <plot_moe.ipynb>`

    .. container:: sphx-glr-download sphx-glr-download-python

      :download:`Download Python source code: plot_moe.py <plot_moe.py>`

.. only:: html

 .. rst-class:: sphx-glr-signature

    `Gallery generated by Sphinx-Gallery <https://sphinx-gallery.github.io>`_