.. DO NOT EDIT.
.. THIS FILE WAS AUTOMATICALLY GENERATED BY SPHINX-GALLERY.
.. TO MAKE CHANGES, EDIT THE SOURCE PYTHON FILE:
.. "examples/mlearning/regression_model/plot_random_forest_regression.py"
.. LINE NUMBERS ARE GIVEN BELOW.

.. only:: html

    .. note::
        :class: sphx-glr-download-link-note

        :ref:`Go to the end ` to download the full example code.

.. rst-class:: sphx-glr-example-title

.. _sphx_glr_examples_mlearning_regression_model_plot_random_forest_regression.py:


Random forest
=============

A :class:`.RandomForestRegressor` is a random forest model based on
`scikit-learn `__.

.. GENERATED FROM PYTHON SOURCE LINES 28-42

.. code-block:: Python

    from __future__ import annotations

    from matplotlib import pyplot as plt
    from numpy import array

    from gemseo import configure_logger
    from gemseo import create_design_space
    from gemseo import create_discipline
    from gemseo import sample_disciplines
    from gemseo.mlearning import create_regression_model

    configure_logger()

.. rst-class:: sphx-glr-script-out

.. code-block:: none

.. GENERATED FROM PYTHON SOURCE LINES 43-48

Problem
-------

In this example,
we represent the function :math:`f(x)=(6x-2)^2\sin(12x-4)` :cite:`forrester2008`
by the :class:`.AnalyticDiscipline`

.. GENERATED FROM PYTHON SOURCE LINES 48-53

.. code-block:: Python

    discipline = create_discipline(
        "AnalyticDiscipline",
        name="f",
        expressions={"y": "(6*x-2)**2*sin(12*x-4)"},
    )

.. GENERATED FROM PYTHON SOURCE LINES 54-55

and seek to approximate it over the input space

.. GENERATED FROM PYTHON SOURCE LINES 55-58

.. code-block:: Python

    input_space = create_design_space()
    input_space.add_variable("x", lower_bound=0.0, upper_bound=1.0)

.. GENERATED FROM PYTHON SOURCE LINES 59-61

To do this, we create a training dataset with 6 equispaced points:

.. GENERATED FROM PYTHON SOURCE LINES 61-65

.. code-block:: Python

    training_dataset = sample_disciplines(
        [discipline], input_space, "y", algo_name="PYDOE_FULLFACT", n_samples=6
    )

.. rst-class:: sphx-glr-script-out
.. code-block:: none

    WARNING - 15:34:25: No coupling in MDA, switching chain_linearize to True.
    INFO - 15:34:25: *** Start Sampling execution ***
    INFO - 15:34:25: Sampling
    INFO - 15:34:25: Disciplines: f
    INFO - 15:34:25: MDO formulation: MDF
    INFO - 15:34:25: Running the algorithm PYDOE_FULLFACT:
    INFO - 15:34:25: 17%|█▋        | 1/6 [00:00<00:00, 594.77 it/sec]
    INFO - 15:34:25: 33%|███▎      | 2/6 [00:00<00:00, 952.60 it/sec]
    INFO - 15:34:25: 50%|█████     | 3/6 [00:00<00:00, 1194.84 it/sec]
    INFO - 15:34:25: 67%|██████▋   | 4/6 [00:00<00:00, 1390.91 it/sec]
    INFO - 15:34:25: 83%|████████▎ | 5/6 [00:00<00:00, 1565.86 it/sec]
    INFO - 15:34:25: 100%|██████████| 6/6 [00:00<00:00, 1699.82 it/sec]
    INFO - 15:34:25: *** End Sampling execution (time: 0:00:00.004607) ***

.. GENERATED FROM PYTHON SOURCE LINES 66-72

Basics
------

Training
~~~~~~~~

Then, we train a random forest regression model from these samples:

.. GENERATED FROM PYTHON SOURCE LINES 72-75

.. code-block:: Python

    model = create_regression_model("RandomForestRegressor", training_dataset)
    model.learn()

.. GENERATED FROM PYTHON SOURCE LINES 76-80

Prediction
~~~~~~~~~~

Once it is built,
we can predict the output value of :math:`f` at a new input point:

.. GENERATED FROM PYTHON SOURCE LINES 80-84

.. code-block:: Python

    input_value = {"x": array([0.65])}
    output_value = model.predict(input_value)
    output_value

.. rst-class:: sphx-glr-script-out

.. code-block:: none

    {'y': array([-0.88837697])}

.. GENERATED FROM PYTHON SOURCE LINES 85-86

but cannot predict its Jacobian value:

.. GENERATED FROM PYTHON SOURCE LINES 86-91

.. code-block:: Python

    try:
        model.predict_jacobian(input_value)
    except NotImplementedError:
        print("The derivatives are not available for RandomForestRegressor.")

.. rst-class:: sphx-glr-script-out

.. code-block:: none

    The derivatives are not available for RandomForestRegressor.
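The missing Jacobian is not an arbitrary limitation: a random forest averages
piecewise-constant trees, so its prediction is itself piecewise constant and its
derivative is zero almost everywhere (and undefined at the jumps). The following
sketch illustrates this with scikit-learn's ``RandomForestRegressor`` (the model
GEMSEO wraps here) and a central finite difference; the training data, step size
``h`` and evaluation point are illustrative assumptions, not part of the example
above.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

# Recreate the training data: 6 equispaced samples of the Forrester function.
x_train = np.linspace(0.0, 1.0, 6).reshape(-1, 1)
y_train = ((6 * x_train - 2) ** 2 * np.sin(12 * x_train - 4)).ravel()

model = RandomForestRegressor(random_state=0).fit(x_train, y_train)

# Central finite difference on the surrogate: because a random forest is a
# piecewise-constant function, this is zero almost everywhere, which is why
# GEMSEO does not expose an analytic Jacobian for this model.
h = 1e-6
x = np.array([[0.65]])
derivative = (model.predict(x + h) - model.predict(x - h)) / (2 * h)
print(derivative)
```

If derivatives are required, a smooth surrogate such as a Gaussian process or
polynomial regression model is usually a better fit than a random forest.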
.. GENERATED FROM PYTHON SOURCE LINES 92-96

Plotting
~~~~~~~~

You can see that the random forest model approximates the function fairly well
on the left of the domain but poorly on the right:

.. GENERATED FROM PYTHON SOURCE LINES 96-108

.. code-block:: Python

    test_dataset = sample_disciplines(
        [discipline], input_space, "y", algo_name="PYDOE_FULLFACT", n_samples=100
    )
    input_data = test_dataset.get_view(variable_names=model.input_names).to_numpy()
    reference_output_data = test_dataset.get_view(variable_names="y").to_numpy().ravel()
    predicted_output_data = model.predict(input_data).ravel()
    plt.plot(input_data.ravel(), reference_output_data, label="Reference")
    plt.plot(input_data.ravel(), predicted_output_data, label="Regression - Basics")
    plt.grid()
    plt.legend()
    plt.show()

.. image-sg:: /examples/mlearning/regression_model/images/sphx_glr_plot_random_forest_regression_001.png
   :alt: plot random forest regression
   :srcset: /examples/mlearning/regression_model/images/sphx_glr_plot_random_forest_regression_001.png
   :class: sphx-glr-single-img

.. rst-class:: sphx-glr-script-out

.. code-block:: none

    WARNING - 15:34:25: No coupling in MDA, switching chain_linearize to True.
    INFO - 15:34:25: *** Start Sampling execution ***
    INFO - 15:34:25: Sampling
    INFO - 15:34:25: Disciplines: f
    INFO - 15:34:25: MDO formulation: MDF
    INFO - 15:34:25: Running the algorithm PYDOE_FULLFACT:
    INFO - 15:34:25: 1%|          | 1/100 [00:00<00:00, 2543.54 it/sec]
    ...
    INFO - 15:34:25: 100%|██████████| 100/100 [00:00<00:00, 3203.62 it/sec]
    INFO - 15:34:25: *** End Sampling execution (time: 0:00:00.032504) ***

.. GENERATED FROM PYTHON SOURCE LINES 109-116

Settings
--------

Number of estimators
~~~~~~~~~~~~~~~~~~~~

The main hyperparameter of random forest regression
is the number of trees in the forest (default: 100).
Here is a comparison when increasing and decreasing this number:

.. GENERATED FROM PYTHON SOURCE LINES 116-134

.. code-block:: Python

    model = create_regression_model(
        "RandomForestRegressor", training_dataset, n_estimators=10
    )
    model.learn()
    predicted_output_data_1 = model.predict(input_data).ravel()
    model = create_regression_model(
        "RandomForestRegressor", training_dataset, n_estimators=1000
    )
    model.learn()
    predicted_output_data_2 = model.predict(input_data).ravel()
    plt.plot(input_data.ravel(), reference_output_data, label="Reference")
    plt.plot(input_data.ravel(), predicted_output_data, label="Regression - Basics")
    plt.plot(input_data.ravel(), predicted_output_data_1, label="Regression - 10 trees")
    plt.plot(input_data.ravel(), predicted_output_data_2, label="Regression - 1000 trees")
    plt.grid()
    plt.legend()
    plt.show()
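One way to understand the role of ``n_estimators`` is as variance reduction:
averaging more trees makes the prediction more stable across random seeds.
The sketch below, which is not part of the original example, measures this with
scikit-learn's ``RandomForestRegressor`` directly (the model GEMSEO wraps here);
the ``prediction_spread`` helper and the seed range are illustrative assumptions.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

# Training data: 6 equispaced samples of the Forrester function, as above.
x_train = np.linspace(0.0, 1.0, 6).reshape(-1, 1)
y_train = ((6 * x_train - 2) ** 2 * np.sin(12 * x_train - 4)).ravel()
x_test = np.linspace(0.0, 1.0, 100).reshape(-1, 1)


def prediction_spread(n_estimators: int) -> float:
    """Mean pointwise standard deviation over 10 differently-seeded fits."""
    predictions = np.stack([
        RandomForestRegressor(n_estimators=n_estimators, random_state=seed)
        .fit(x_train, y_train)
        .predict(x_test)
        for seed in range(10)
    ])
    return float(predictions.std(axis=0).mean())


# The spread should shrink noticeably as the number of trees grows.
print(prediction_spread(10), prediction_spread(1000))
```

This stabilization is the main benefit of a large forest; past a few hundred
trees the gain typically flattens out while training cost keeps growing.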
.. image-sg:: /examples/mlearning/regression_model/images/sphx_glr_plot_random_forest_regression_002.png
   :alt: plot random forest regression
   :srcset: /examples/mlearning/regression_model/images/sphx_glr_plot_random_forest_regression_002.png
   :class: sphx-glr-single-img

.. GENERATED FROM PYTHON SOURCE LINES 135-145

Others
------

The ``RandomForestRegressor`` class of scikit-learn has a lot of settings
(`read more `__),
and we have chosen to exhibit only ``n_estimators``.
However,
any argument of ``RandomForestRegressor`` can be set
using the dictionary ``parameters``.
For example,
we can impose a minimum of two samples per leaf:

.. GENERATED FROM PYTHON SOURCE LINES 145-156

.. code-block:: Python

    model = create_regression_model(
        "RandomForestRegressor", training_dataset, parameters={"min_samples_leaf": 2}
    )
    model.learn()
    predicted_output_data_ = model.predict(input_data).ravel()
    plt.plot(input_data.ravel(), reference_output_data, label="Reference")
    plt.plot(input_data.ravel(), predicted_output_data, label="Regression - Basics")
    plt.plot(input_data.ravel(), predicted_output_data_, label="Regression - 2 samples")
    plt.grid()
    plt.legend()
    plt.show()

.. image-sg:: /examples/mlearning/regression_model/images/sphx_glr_plot_random_forest_regression_003.png
   :alt: plot random forest regression
   :srcset: /examples/mlearning/regression_model/images/sphx_glr_plot_random_forest_regression_003.png
   :class: sphx-glr-single-img

.. rst-class:: sphx-glr-timing

**Total running time of the script:** (0 minutes 1.448 seconds)

.. _sphx_glr_download_examples_mlearning_regression_model_plot_random_forest_regression.py:

.. only:: html

  .. container:: sphx-glr-footer sphx-glr-footer-example

    .. container:: sphx-glr-download sphx-glr-download-jupyter

      :download:`Download Jupyter notebook: plot_random_forest_regression.ipynb `

    .. container:: sphx-glr-download sphx-glr-download-python

      :download:`Download Python source code: plot_random_forest_regression.py `
    .. container:: sphx-glr-download sphx-glr-download-zip

      :download:`Download zipped: plot_random_forest_regression.zip `

.. only:: html

 .. rst-class:: sphx-glr-signature

    `Gallery generated by Sphinx-Gallery `_