.. DO NOT EDIT.
.. THIS FILE WAS AUTOMATICALLY GENERATED BY SPHINX-GALLERY.
.. TO MAKE CHANGES, EDIT THE SOURCE PYTHON FILE:
.. "examples/mlearning/regression_model/plot_moe.py"
.. LINE NUMBERS ARE GIVEN BELOW.

.. only:: html

    .. note::
        :class: sphx-glr-download-link-note

        :ref:`Go to the end <sphx_glr_download_examples_mlearning_regression_model_plot_moe.py>`
        to download the full example code

.. rst-class:: sphx-glr-example-title

.. _sphx_glr_examples_mlearning_regression_model_plot_moe.py:

Mixture of experts
==================

In this demo, we load a dataset (the Rosenbrock function in 2D) and apply a
mixture of experts regression model to obtain an approximation.

.. GENERATED FROM PYTHON SOURCE LINES 28-45

.. code-block:: default

    from __future__ import annotations

    import matplotlib.pyplot as plt
    from gemseo import configure_logger
    from gemseo import create_benchmark_dataset
    from gemseo.mlearning import create_regression_model
    from numpy import array
    from numpy import hstack
    from numpy import linspace
    from numpy import meshgrid
    from numpy import nonzero
    from numpy import sqrt
    from numpy import zeros

    configure_logger()

.. GENERATED FROM PYTHON SOURCE LINES 46-50

Dataset (Rosenbrock)
--------------------
We here consider the Rosenbrock function with two inputs, on the interval
:math:`[-2, 2] \times [-2, 2]`.
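For reference, the two-dimensional Rosenbrock function is defined by

.. math::

    f(x_1, x_2) = (1 - x_1)^2 + 100 \left(x_2 - x_1^2\right)^2,

whose global minimum is :math:`f(1, 1) = 0`. For instance,
:math:`f(-2, -2) = 3^2 + 100 \times 36 = 3609`, which matches the first
sample of the dataset below.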
.. GENERATED FROM PYTHON SOURCE LINES 52-57

Load dataset
~~~~~~~~~~~~
A prebuilt dataset for the Rosenbrock function with two inputs is provided as
a dataset parametrization, based on a full-factorial DOE of the input space
with 100 points.

.. GENERATED FROM PYTHON SOURCE LINES 57-59

.. code-block:: default

    dataset = create_benchmark_dataset("RosenbrockDataset", opt_naming=False)

.. GENERATED FROM PYTHON SOURCE LINES 60-66

Print information
~~~~~~~~~~~~~~~~~
Information about the dataset can easily be displayed by printing the dataset
directly, which presents the data in tabular form.

.. code-block:: default

    print(dataset)

.. rst-class:: sphx-glr-script-out

 .. code-block:: none

    GROUP        inputs       outputs
    VARIABLE          x         rosen
    COMPONENT         0    1         0
    0         -2.000000 -2.0  3609.000000
    1         -1.555556 -2.0  1959.952599
    2         -1.111111 -2.0  1050.699741
    3         -0.666667 -2.0   600.308642
    4         -0.222222 -2.0   421.490779
    ..              ...  ...          ...
    95         0.222222  2.0   381.095717
    96         0.666667  2.0   242.086420
    97         1.111111  2.0    58.600975
    98         1.555556  2.0    17.927907
    99         2.000000  2.0   401.000000

    [100 rows x 3 columns]

.. GENERATED FROM PYTHON SOURCE LINES 73-77

Mixture of experts (MoE)
------------------------
In this section, we build a mixture of experts regression model through the
machine learning API, combining clustering, classification and regression
models.
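The principle is the following: a clustering algorithm splits the learning
data into :math:`K` clusters, a local regression model :math:`\hat{f}_k` is
fitted on each cluster, and a classifier :math:`\kappa` learns to assign any
new input point to one of the clusters. With a hard assignment, which the
outputs below confirm (the MoE prediction coincides with the prediction of
the local model of the predicted class), the mixture of experts predicts

.. math::

    \hat{f}(x) = \hat{f}_{\kappa(x)}(x),

that is, the prediction of the local model associated with the class
predicted for :math:`x`.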
.. GENERATED FROM PYTHON SOURCE LINES 79-83

Mixture of experts model
~~~~~~~~~~~~~~~~~~~~~~~~
We construct the MoE model using the predefined parameters, and fit the model
to the dataset through the :meth:`~.MOERegressor.learn` method.

.. GENERATED FROM PYTHON SOURCE LINES 83-89

.. code-block:: default

    model = create_regression_model("MOERegressor", dataset)
    model.set_clusterer("KMeans", n_clusters=3)
    model.set_classifier("KNNClassifier", n_neighbors=5)
    model.set_regressor("GaussianProcessRegressor")

    model.learn()

.. rst-class:: sphx-glr-script-out

 .. code-block:: none

    /home/docs/checkouts/readthedocs.org/user_builds/gemseo/envs/5.0.0/lib/python3.9/site-packages/sklearn/cluster/_kmeans.py:870: FutureWarning: The default value of `n_init` will change from 10 to 'auto' in 1.4. Set the value of `n_init` explicitly to suppress the warning
      warnings.warn(
    /home/docs/checkouts/readthedocs.org/user_builds/gemseo/envs/5.0.0/lib/python3.9/site-packages/sklearn/gaussian_process/_gpr.py:629: ConvergenceWarning: lbfgs failed to converge (status=2): ABNORMAL_TERMINATION_IN_LNSRCH.

    Increase the number of iterations (max_iter) or scale the data as shown in:
        https://scikit-learn.org/stable/modules/preprocessing.html
      _check_optimize_result("lbfgs", opt_res)

.. GENERATED FROM PYTHON SOURCE LINES 90-96

Tests
~~~~~
Here, we test the mixture of experts method at two points: (1, 1), the global
minimum, where the function is zero, and (-2, -2), an extreme point where the
function takes its maximum over the domain. The two points are expected to
belong to different classes.

.. GENERATED FROM PYTHON SOURCE LINES 96-108

.. code-block:: default

    input_value = {"x": array([1, 1])}
    another_input_value = {"x": array([[1, 1], [-2, -2]])}

    for value in [input_value, another_input_value]:
        print("Input value:", value)
        print("Class:", model.predict_class(value))
        print("Prediction:", model.predict(value))
        print("Local model predictions:")
        for cls in range(model.n_clusters):
            print(f"Local model {cls}: {model.predict_local_model(value, cls)}")
        print()

.. rst-class:: sphx-glr-script-out

 .. code-block:: none

    Input value: {'x': array([1, 1])}
    Class: {'labels': array([0])}
    Prediction: {'rosen': array([3.30755365])}
    Local model predictions:
    Local model 0: {'rosen': array([3.30755365])}
    Local model 1: {'rosen': array([-10.21814956])}
    Local model 2: {'rosen': array([336.33838238])}

    Input value: {'x': array([[ 1,  1], [-2, -2]])}
    Class: {'labels': array([[0], [2]])}
    Prediction: {'rosen': array([[3.30755365e+00], [3.60899961e+03]])}
    Local model predictions:
    Local model 0: {'rosen': array([[   3.30755365], [2945.14898786]])}
    Local model 1: {'rosen': array([[-10.21814956], [ 698.64243782]])}
    Local model 2: {'rosen': array([[ 336.33838238], [3608.99960672]])}
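To put these predictions in perspective, we can compare them with the exact
Rosenbrock values at the same points. Here is a minimal sketch, assuming
SciPy is available in the environment and using its analytic
:func:`scipy.optimize.rosen` function (this check is not part of the original
example):

.. code-block:: default

    from scipy.optimize import rosen  # analytic Rosenbrock function

    points = array([[1.0, 1.0], [-2.0, -2.0]])
    # predict() returns a (2, 1) array for a batch of two points; flatten it.
    predictions = model.predict({"x": points})["rosen"].ravel()
    exact = array([rosen(point) for point in points])  # [0.0, 3609.0]

    for point, prediction, truth in zip(points, predictions, exact):
        print(f"x={point}: predicted={prediction:.2f}, exact={truth:.2f}")

Consistently with the outputs above, the surrogate nearly reproduces the
exact value 3609 at (-2, -2), while it is off by a few units at the global
minimum (about 3.31 instead of 0).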
.. GENERATED FROM PYTHON SOURCE LINES 109-114

Plot clusters
~~~~~~~~~~~~~
Here, we plot the 10x10 = 100 Rosenbrock function data points, with colors
representing the obtained clusters. The Rosenbrock function is represented by
a contour plot in the background.

.. GENERATED FROM PYTHON SOURCE LINES 114-137

.. code-block:: default

    n_samples = dataset.n_samples

    # The dataset is based on a full-factorial DOE of 100 = 10^2 points.
    input_dim = int(sqrt(n_samples))
    assert input_dim**2 == n_samples  # Check that n_samples is a square number

    # Matplotlib single-letter color codes (the original "o" is a marker, not
    # a color, and would fail if more than three clusters were used).
    colors = ["b", "r", "g", "m", "y"]

    inputs = dataset.input_dataset.to_numpy()
    outputs = dataset.output_dataset.to_numpy()

    # The full-factorial grid uses the same levels along both dimensions, so
    # the first input_dim values of the first input give both axes.
    x = inputs[:input_dim, 0]
    y = inputs[:input_dim, 0]
    Z = zeros((input_dim, input_dim))
    for i in range(input_dim):
        Z[i, :] = outputs[input_dim * i : input_dim * (i + 1), 0]

    fig = plt.figure()
    cnt = plt.contour(x, y, Z, 50)
    fig.colorbar(cnt)
    for index in range(model.n_clusters):
        samples = nonzero(model.labels == index)[0]
        plt.scatter(inputs[samples, 0], inputs[samples, 1], color=colors[index])
    plt.scatter(1, 1, marker="x")
    plt.show()

.. image-sg:: /examples/mlearning/regression_model/images/sphx_glr_plot_moe_001.png
   :alt: plot moe
   :srcset: /examples/mlearning/regression_model/images/sphx_glr_plot_moe_001.png
   :class: sphx-glr-single-img

.. GENERATED FROM PYTHON SOURCE LINES 138-141

Plot data and predictions from final model
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
We construct a refined input space and compute the model predictions.

.. GENERATED FROM PYTHON SOURCE LINES 141-164

.. code-block:: default

    refinement = 200

    fine_x = linspace(x[0], x[-1], refinement)
    fine_y = linspace(y[0], y[-1], refinement)
    fine_x, fine_y = meshgrid(fine_x, fine_y)

    fine_input = {"x": hstack([fine_x.flatten()[:, None], fine_y.flatten()[:, None]])}
    fine_z = model.predict(fine_input)

    # Reshape the flat predictions back onto the refined grid
    fine_z = fine_z["rosen"].reshape((refinement, refinement))

    plt.figure()
    plt.imshow(Z)
    plt.colorbar()
    plt.title("Original data")
    plt.show()

    plt.figure()
    plt.imshow(fine_z)
    plt.colorbar()
    plt.title("Predictions")
    plt.show()

.. rst-class:: sphx-glr-horizontal

    *

      .. image-sg:: /examples/mlearning/regression_model/images/sphx_glr_plot_moe_002.png
         :alt: Original data
         :srcset: /examples/mlearning/regression_model/images/sphx_glr_plot_moe_002.png
         :class: sphx-glr-multi-img

    *

      .. image-sg:: /examples/mlearning/regression_model/images/sphx_glr_plot_moe_003.png
         :alt: Predictions
         :srcset: /examples/mlearning/regression_model/images/sphx_glr_plot_moe_003.png
         :class: sphx-glr-multi-img

.. GENERATED FROM PYTHON SOURCE LINES 165-167

Plot local models
~~~~~~~~~~~~~~~~~

.. GENERATED FROM PYTHON SOURCE LINES 167-177

.. code-block:: default

    for i in range(model.n_clusters):
        plt.figure()
        plt.imshow(
            model.predict_local_model(fine_input, i)["rosen"].reshape(
                (refinement, refinement)
            )
        )
        plt.colorbar()
        plt.title(f"Local model {i}")
        plt.show()

.. rst-class:: sphx-glr-horizontal

    *

      .. image-sg:: /examples/mlearning/regression_model/images/sphx_glr_plot_moe_004.png
         :alt: Local model 0
         :srcset: /examples/mlearning/regression_model/images/sphx_glr_plot_moe_004.png
         :class: sphx-glr-multi-img

    *

      .. image-sg:: /examples/mlearning/regression_model/images/sphx_glr_plot_moe_005.png
         :alt: Local model 1
         :srcset: /examples/mlearning/regression_model/images/sphx_glr_plot_moe_005.png
         :class: sphx-glr-multi-img

    *

      .. image-sg:: /examples/mlearning/regression_model/images/sphx_glr_plot_moe_006.png
         :alt: Local model 2
         :srcset: /examples/mlearning/regression_model/images/sphx_glr_plot_moe_006.png
         :class: sphx-glr-multi-img

.. rst-class:: sphx-glr-timing

   **Total running time of the script:** ( 0 minutes 3.907 seconds)

.. _sphx_glr_download_examples_mlearning_regression_model_plot_moe.py:

.. only:: html

  .. container:: sphx-glr-footer sphx-glr-footer-example

    .. container:: sphx-glr-download sphx-glr-download-python

      :download:`Download Python source code: plot_moe.py <plot_moe.py>`

    .. container:: sphx-glr-download sphx-glr-download-jupyter

      :download:`Download Jupyter notebook: plot_moe.ipynb <plot_moe.ipynb>`

.. only:: html

 .. rst-class:: sphx-glr-signature

    `Gallery generated by Sphinx-Gallery <https://sphinx-gallery.github.io>`_