.. DO NOT EDIT.
.. THIS FILE WAS AUTOMATICALLY GENERATED BY SPHINX-GALLERY.
.. TO MAKE CHANGES, EDIT THE SOURCE PYTHON FILE:
.. "examples/mlearning/regression_model/plot_advanced_moe.py"
.. LINE NUMBERS ARE GIVEN BELOW.

.. only:: html

    .. note::
        :class: sphx-glr-download-link-note

        Click :ref:`here <sphx_glr_download_examples_mlearning_regression_model_plot_advanced_moe.py>`
        to download the full example code

.. rst-class:: sphx-glr-example-title

.. _sphx_glr_examples_mlearning_regression_model_plot_advanced_moe.py:


Advanced mixture of experts
===========================

.. GENERATED FROM PYTHON SOURCE LINES 27-35

.. code-block:: default


    from __future__ import absolute_import, division, print_function, unicode_literals

    from gemseo.api import load_dataset
    from gemseo.mlearning.api import create_regression_model
    from gemseo.mlearning.qual_measure.f1_measure import F1Measure
    from gemseo.mlearning.qual_measure.mse_measure import MSEMeasure
    from gemseo.mlearning.qual_measure.silhouette import SilhouetteMeasure

.. GENERATED FROM PYTHON SOURCE LINES 36-38

In this example, we seek to estimate the Rosenbrock function
from the :class:`.RosenbrockDataset`.

.. GENERATED FROM PYTHON SOURCE LINES 38-40

.. code-block:: default

    dataset = load_dataset("RosenbrockDataset", opt_naming=False)

.. GENERATED FROM PYTHON SOURCE LINES 41-58

For that purpose, we will use a :class:`.MixtureOfExperts` in an advanced way:
rather than setting the clustering, classification and regression algorithms explicitly,
we will select them from several candidates that we will provide,
according to their performance.
Moreover, for a given candidate, we will propose several settings,
compare their performances and select the best one.

Initialization
--------------
First, we initialize a :class:`.MixtureOfExperts` with soft classification
by means of the machine learning API function
:meth:`~gemseo.mlearning.api.create_regression_model`.

.. GENERATED FROM PYTHON SOURCE LINES 58-60

.. code-block:: default

    model = create_regression_model("MixtureOfExperts", dataset, hard=False)

.. GENERATED FROM PYTHON SOURCE LINES 61-72

Clustering
----------
Then, we add two clustering algorithms with different numbers of clusters
(called *components* in the case of the Gaussian mixture)
and set the :class:`.SilhouetteMeasure` as the clustering measure
to be evaluated from the learning set.
During the learning stage, the mixture of experts will select
the clustering algorithm and the number of clusters maximizing this measure.

.. GENERATED FROM PYTHON SOURCE LINES 72-76

.. code-block:: default

    model.set_clustering_measure(SilhouetteMeasure)
    model.add_clusterer_candidate("KMeans", n_clusters=[2, 3, 4])
    model.add_clusterer_candidate("GaussianMixture", n_components=[3, 4, 5])

.. GENERATED FROM PYTHON SOURCE LINES 77-86

Classification
--------------
We also add classification algorithms with different settings
and set the :class:`.F1Measure` as the classification measure
to be evaluated from the learning set.
During the learning stage, the mixture of experts will select
the classification algorithm and the settings maximizing this measure.

.. GENERATED FROM PYTHON SOURCE LINES 86-90

.. code-block:: default

    model.set_classification_measure(F1Measure)
    model.add_classifier_candidate("KNNClassifier", n_neighbors=[3, 4, 5])
    model.add_classifier_candidate("RandomForestClassifier", n_estimators=[100])

.. GENERATED FROM PYTHON SOURCE LINES 91-98

Regression
----------
We also add regression algorithms
and set the :class:`.MSEMeasure` as the regression measure
to be evaluated from the learning set.
During the learning stage, for each cluster, the mixture of experts will select
the regression algorithm minimizing this measure.

.. GENERATED FROM PYTHON SOURCE LINES 98-102

.. code-block:: default

    model.set_regression_measure(MSEMeasure)
    model.add_regressor_candidate("LinearRegression")
    model.add_regressor_candidate("RBFRegression")

.. GENERATED FROM PYTHON SOURCE LINES 103-116

.. note::

   We could also add candidates for some learning stages,
   e.g. clustering and regression,
   and directly set the machine learning algorithms for the remaining ones,
   e.g. classification, as sketched below.
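For instance, a variant along the following lines would fix the classification
algorithm while keeping candidate-based selection for the clustering and regression stages.
This is only a sketch: the ``set_classifier`` setter used here is an assumption
about the :class:`.MixtureOfExperts` API
(a counterpart of :meth:`~gemseo.mlearning.regression.moe.MixtureOfExperts.add_classifier_candidate`)
and should be checked against the class reference before use.

.. code-block:: default

    # Hypothetical variant, not executed in this example.
    # ``set_classifier`` is assumed to exist alongside the ``add_*_candidate``
    # methods; check the MixtureOfExperts API reference for the exact name.
    other_model = create_regression_model("MixtureOfExperts", dataset, hard=False)

    # Fix the classification algorithm and its settings explicitly.
    other_model.set_classifier("KNNClassifier", n_neighbors=5)

    # Keep the candidate-based selection for clustering and regression.
    other_model.set_clustering_measure(SilhouetteMeasure)
    other_model.add_clusterer_candidate("KMeans", n_clusters=[2, 3, 4])
    other_model.set_regression_measure(MSEMeasure)
    other_model.add_regressor_candidate("LinearRegression")
    other_model.add_regressor_candidate("RBFRegression")

    # Train the variant as usual.
    other_model.learn()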
Training
--------
Lastly, we train the model from the learning set;
during this stage, the best machine learning algorithm is selected
for each of the clustering, classification and regression steps.

.. GENERATED FROM PYTHON SOURCE LINES 116-118

.. code-block:: default

    model.learn()

.. rst-class:: sphx-glr-script-out

 Out:

 .. code-block:: none

    /home/docs/checkouts/readthedocs.org/user_builds/gemseo/conda/3.2.1/lib/python3.8/site-packages/sklearn/linear_model/_base.py:148: FutureWarning: 'normalize' was deprecated in version 1.0 and will be removed in 1.2. Please leave the normalize parameter to its default value to silence this warning. The default behavior of this estimator is to not do any normalization. If normalization is needed please use sklearn.preprocessing.StandardScaler instead.
      warnings.warn(
    /home/docs/checkouts/readthedocs.org/user_builds/gemseo/conda/3.2.1/lib/python3.8/site-packages/sklearn/linear_model/_base.py:148: FutureWarning: 'normalize' was deprecated in version 1.0 and will be removed in 1.2. Please leave the normalize parameter to its default value to silence this warning. The default behavior of this estimator is to not do any normalization. If normalization is needed please use sklearn.preprocessing.StandardScaler instead.
      warnings.warn(
    /home/docs/checkouts/readthedocs.org/user_builds/gemseo/conda/3.2.1/lib/python3.8/site-packages/sklearn/linear_model/_base.py:148: FutureWarning: 'normalize' was deprecated in version 1.0 and will be removed in 1.2. Please leave the normalize parameter to its default value to silence this warning. The default behavior of this estimator is to not do any normalization. If normalization is needed please use sklearn.preprocessing.StandardScaler instead.
      warnings.warn(
    /home/docs/checkouts/readthedocs.org/user_builds/gemseo/conda/3.2.1/lib/python3.8/site-packages/sklearn/linear_model/_base.py:148: FutureWarning: 'normalize' was deprecated in version 1.0 and will be removed in 1.2. Please leave the normalize parameter to its default value to silence this warning. The default behavior of this estimator is to not do any normalization. If normalization is needed please use sklearn.preprocessing.StandardScaler instead.
      warnings.warn(

.. GENERATED FROM PYTHON SOURCE LINES 119-128

Result
------
We can get information on this model,
on the machine learning sub-models selected among the candidates
and on their settings.
We can see that a :class:`.KMeans` with four clusters
has been selected for the clustering stage,
as well as a :class:`.RandomForestClassifier` for the classification stage
and an :class:`.RBFRegression` for each cluster.

.. GENERATED FROM PYTHON SOURCE LINES 128-130

.. code-block:: default

    print(model)

.. rst-class:: sphx-glr-script-out

 Out:

 .. code-block:: none

    MixtureOfExperts(hard=False)
       built from 100 learning samples
       Clustering
          KMeans(n_clusters=4, random_state=0, var_names=None)
       Classification
          RandomForestClassifier(n_estimators=100)
       Regression
          Local model 0
             RBFRegression(epsilon=None, function='multiquadric', norm='euclidean', smooth=0.0)
          Local model 1
             RBFRegression(epsilon=None, function='multiquadric', norm='euclidean', smooth=0.0)
          Local model 2
             RBFRegression(epsilon=None, function='multiquadric', norm='euclidean', smooth=0.0)
          Local model 3
             RBFRegression(epsilon=None, function='multiquadric', norm='euclidean', smooth=0.0)

.. GENERATED FROM PYTHON SOURCE LINES 131-149

.. note::

   By adding candidates,
   and depending on the complexity of the function to be approximated,
   one could obtain different regression models from one cluster to another.
   For example, one could use a :class:`.PolynomialRegression` with order 2
   on one sub-part of the input space
   and a :class:`.GaussianProcessRegression` on another one.

Once built, this mixture of experts can be used
like any other :class:`.MLRegressionAlgo`.
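For instance, it can predict the output value at a new input point.
The sketch below assumes that the input variable of the :class:`.RosenbrockDataset`
is named ``"x"`` with two components;
check the variable names of the dataset in use before reusing it.

.. code-block:: default

    from numpy import array

    # Predict the Rosenbrock value at a new point;
    # the input name "x" is an assumption taken from the RosenbrockDataset.
    prediction = model.predict({"x": array([1.0, 1.0])})
    print(prediction)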
.. seealso::

   :ref:`Another example ` proposes a standard use of :class:`.MixtureOfExperts`.

.. rst-class:: sphx-glr-timing

   **Total running time of the script:** ( 0 minutes 0.512 seconds)


.. _sphx_glr_download_examples_mlearning_regression_model_plot_advanced_moe.py:


.. only:: html

  .. container:: sphx-glr-footer
     :class: sphx-glr-footer-example


    .. container:: sphx-glr-download sphx-glr-download-python

       :download:`Download Python source code: plot_advanced_moe.py <plot_advanced_moe.py>`


    .. container:: sphx-glr-download sphx-glr-download-jupyter

       :download:`Download Jupyter notebook: plot_advanced_moe.ipynb <plot_advanced_moe.ipynb>`


.. only:: html

  .. rst-class:: sphx-glr-signature

     `Gallery generated by Sphinx-Gallery <https://sphinx-gallery.github.io>`_