.. DO NOT EDIT.
.. THIS FILE WAS AUTOMATICALLY GENERATED BY SPHINX-GALLERY.
.. TO MAKE CHANGES, EDIT THE SOURCE PYTHON FILE:
.. "examples/post_process/algorithms/plot_gradient_sensitivity.py"
.. LINE NUMBERS ARE GIVEN BELOW.

.. only:: html

    .. note::
        :class: sphx-glr-download-link-note

        :ref:`Go to the end <sphx_glr_download_examples_post_process_algorithms_plot_gradient_sensitivity.py>`
        to download the full example code.

.. rst-class:: sphx-glr-example-title

.. _sphx_glr_examples_post_process_algorithms_plot_gradient_sensitivity.py:


Gradient Sensitivity
====================

In this example, we illustrate the use of the :class:`.GradientSensitivity`
plot on the Sobieski's SSBJ problem.

.. GENERATED FROM PYTHON SOURCE LINES 28-36

.. code-block:: Python

    from __future__ import annotations

    from gemseo import configure_logger
    from gemseo import create_discipline
    from gemseo import create_scenario
    from gemseo.problems.sobieski.core.design_space import SobieskiDesignSpace

.. GENERATED FROM PYTHON SOURCE LINES 37-41

Import
------
The first step is to import some high-level functions and a method to get the design
space.

.. GENERATED FROM PYTHON SOURCE LINES 41-44

.. code-block:: Python

    configure_logger()

.. GENERATED FROM PYTHON SOURCE LINES 45-50

Description
-----------
The :class:`.GradientSensitivity` post-processor builds histograms of the derivatives
of the objective and the constraints.

.. GENERATED FROM PYTHON SOURCE LINES 52-56

Create disciplines
------------------
At this point, we instantiate the disciplines of Sobieski's SSBJ problem: Propulsion,
Aerodynamics, Structure and Mission.

.. GENERATED FROM PYTHON SOURCE LINES 56-63

.. code-block:: Python

    disciplines = create_discipline([
        "SobieskiPropulsion",
        "SobieskiAerodynamics",
        "SobieskiStructure",
        "SobieskiMission",
    ])

.. GENERATED FROM PYTHON SOURCE LINES 64-67

Create design space
-------------------
We also create the :class:`.SobieskiDesignSpace`.

.. GENERATED FROM PYTHON SOURCE LINES 67-69

.. code-block:: Python

    design_space = SobieskiDesignSpace()

.. GENERATED FROM PYTHON SOURCE LINES 70-77

Create and execute scenario
---------------------------
The next step is to build an MDO scenario in order to maximize the range, encoded as
``"y_4"``, with respect to the design parameters, while satisfying the inequality
constraints ``"g_1"``, ``"g_2"`` and ``"g_3"``. We use the MDF formulation, the SLSQP
optimization algorithm and a maximum of 10 iterations.

.. GENERATED FROM PYTHON SOURCE LINES 77-84

.. code-block:: Python

    scenario = create_scenario(
        disciplines,
        formulation="MDF",
        objective_name="y_4",
        maximize_objective=True,
        design_space=design_space,
    )

.. GENERATED FROM PYTHON SOURCE LINES 85-91

The differentiation method used by default is ``"user"``, which means that the
gradient is evaluated from the Jacobian defined in each discipline. However, some
disciplines may not provide one; in that case, the gradient may be approximated with
the techniques ``"finite_differences"`` or ``"complex_step"`` using the method
:meth:`~.Scenario.set_differentiation_method`.
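For instance, approximating all gradients by finite differences would read as
sketched below; this call is only an illustration and is not executed in this
example.

.. code-block:: Python

    # Sketch only: replace the disciplines' analytical Jacobians
    # by a finite-difference approximation of the gradients.
    scenario.set_differentiation_method("finite_differences")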
The following line is shown as an example; it has no effect because it does not
change the default method.

.. GENERATED FROM PYTHON SOURCE LINES 91-96

.. code-block:: Python

    scenario.set_differentiation_method()

    for constraint in ["g_1", "g_2", "g_3"]:
        scenario.add_constraint(constraint, "ineq")
    scenario.execute({"algo": "SLSQP", "max_iter": 10})

.. rst-class:: sphx-glr-script-out

.. code-block:: none

    INFO - 10:57:31:
    INFO - 10:57:31: *** Start MDOScenario execution ***
    INFO - 10:57:31: MDOScenario
    INFO - 10:57:31:    Disciplines: SobieskiAerodynamics SobieskiMission SobieskiPropulsion SobieskiStructure
    INFO - 10:57:31:    MDO formulation: MDF
    INFO - 10:57:31: Optimization problem:
    INFO - 10:57:31:    minimize -y_4(x_shared, x_1, x_2, x_3)
    INFO - 10:57:31:    with respect to x_1, x_2, x_3, x_shared
    INFO - 10:57:31:    subject to constraints:
    INFO - 10:57:31:       g_1(x_shared, x_1, x_2, x_3) <= 0.0
    INFO - 10:57:31:       g_2(x_shared, x_1, x_2, x_3) <= 0.0
    INFO - 10:57:31:       g_3(x_shared, x_1, x_2, x_3) <= 0.0
    INFO - 10:57:31:    over the design space:
    INFO - 10:57:31:       +-------------+-------------+-------+-------------+-------+
    INFO - 10:57:31:       | Name        | Lower bound | Value | Upper bound | Type  |
    INFO - 10:57:31:       +-------------+-------------+-------+-------------+-------+
    INFO - 10:57:31:       | x_shared[0] | 0.01        | 0.05  | 0.09        | float |
    INFO - 10:57:31:       | x_shared[1] | 30000       | 45000 | 60000       | float |
    INFO - 10:57:31:       | x_shared[2] | 1.4         | 1.6   | 1.8         | float |
    INFO - 10:57:31:       | x_shared[3] | 2.5         | 5.5   | 8.5         | float |
    INFO - 10:57:31:       | x_shared[4] | 40          | 55    | 70          | float |
    INFO - 10:57:31:       | x_shared[5] | 500         | 1000  | 1500        | float |
    INFO - 10:57:31:       | x_1[0]      | 0.1         | 0.25  | 0.4         | float |
    INFO - 10:57:31:       | x_1[1]      | 0.75        | 1     | 1.25        | float |
    INFO - 10:57:31:       | x_2         | 0.75        | 1     | 1.25        | float |
    INFO - 10:57:31:       | x_3         | 0.1         | 0.5   | 1           | float |
    INFO - 10:57:31:       +-------------+-------------+-------+-------------+-------+
    INFO - 10:57:31: Solving optimization problem with algorithm SLSQP:
    INFO - 10:57:31:     10%|█         | 1/10 [00:00<00:01,  8.83 it/sec, obj=-536]
    INFO - 10:57:32:     20%|██        | 2/10 [00:00<00:01,  6.22 it/sec, obj=-2.12e+3]
    WARNING - 10:57:32: MDAJacobi has reached its maximum number of iterations but the normed residual 1.7130677857005655e-05 is still above the tolerance 1e-06.
    INFO - 10:57:32:     30%|███       | 3/10 [00:00<00:01,  5.26 it/sec, obj=-3.75e+3]
    INFO - 10:57:32:     40%|████      | 4/10 [00:00<00:01,  5.04 it/sec, obj=-3.96e+3]
    INFO - 10:57:32:     50%|█████     | 5/10 [00:01<00:01,  4.93 it/sec, obj=-3.96e+3]
    INFO - 10:57:32: Optimization result:
    INFO - 10:57:32:    Optimizer info:
    INFO - 10:57:32:       Status: 8
    INFO - 10:57:32:       Message: Positive directional derivative for linesearch
    INFO - 10:57:32:       Number of calls to the objective function by the optimizer: 6
    INFO - 10:57:32:    Solution:
    INFO - 10:57:32:       The solution is feasible.
    INFO - 10:57:32:       Objective: -3963.408265187933
    INFO - 10:57:32:       Standardized constraints:
    INFO - 10:57:32:          g_1 = [-0.01806104 -0.03334642 -0.04424946 -0.0518346  -0.05732607 -0.13720865
    INFO - 10:57:32:  -0.10279135]
    INFO - 10:57:32:          g_2 = 3.333278582928756e-06
    INFO - 10:57:32:          g_3 = [-7.67181773e-01 -2.32818227e-01  8.30379541e-07 -1.83255000e-01]
    INFO - 10:57:32:    Design space:
    INFO - 10:57:32:       +-------------+-------------+---------------------+-------------+-------+
    INFO - 10:57:32:       | Name        | Lower bound | Value               | Upper bound | Type  |
    INFO - 10:57:32:       +-------------+-------------+---------------------+-------------+-------+
    INFO - 10:57:32:       | x_shared[0] | 0.01        | 0.06000083331964572 | 0.09        | float |
    INFO - 10:57:32:       | x_shared[1] | 30000       | 60000               | 60000       | float |
    INFO - 10:57:32:       | x_shared[2] | 1.4         | 1.4                 | 1.8         | float |
    INFO - 10:57:32:       | x_shared[3] | 2.5         | 2.5                 | 8.5         | float |
    INFO - 10:57:32:       | x_shared[4] | 40          | 70                  | 70          | float |
    INFO - 10:57:32:       | x_shared[5] | 500         | 1500                | 1500        | float |
    INFO - 10:57:32:       | x_1[0]      | 0.1         | 0.4                 | 0.4         | float |
    INFO - 10:57:32:       | x_1[1]      | 0.75        | 0.75                | 1.25        | float |
    INFO - 10:57:32:       | x_2         | 0.75        | 0.75                | 1.25        | float |
    INFO - 10:57:32:       | x_3         | 0.1         | 0.1562448753887276  | 1           | float |
    INFO - 10:57:32:       +-------------+-------------+---------------------+-------------+-------+
    INFO - 10:57:32: *** End MDOScenario execution (time: 0:00:01.159448) ***

    {'max_iter': 10, 'algo': 'SLSQP'}

.. GENERATED FROM PYTHON SOURCE LINES 97-118

Post-process scenario
---------------------
Lastly, we post-process the scenario by means of the :class:`.GradientSensitivity`
post-processor, which builds histograms of the derivatives of the objective and the
constraints. The sensitivities shown in the plot are computed with the gradient at the
optimum, or at the least non-feasible point when the result is not feasible. One may
choose any other iteration instead.

.. note::

   In some cases, the iteration used to compute the sensitivities corresponds to a
   point for which the algorithm did not request the evaluation of the gradients, and
   a ``ValueError`` is raised. A way to avoid this issue is to set the option
   ``compute_missing_gradients`` of :class:`.GradientSensitivity` to ``True``; this
   way, |g| will compute the gradients for the requested iteration if they are not
   available.

.. warning::

   Please note that this extra computation may be expensive depending on the
   :class:`.OptimizationProblem` defined by the user. Additionally, keep in mind that
   |g| cannot compute missing gradients for an :class:`.OptimizationProblem` that was
   imported from an HDF5 file.
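As an aside, the optimization history can also be saved to the disk and
post-processed later with the high-level function :func:`.execute_post`. The sketch
below is only an illustration (the file name is arbitrary); as stated in the warning
above, the gradients must already be stored in the file, since missing gradients
cannot be recomputed for a problem imported from HDF5.

.. code-block:: Python

    from gemseo import execute_post

    # Persist the optimization history of the scenario (arbitrary file name).
    scenario.save_optimization_history("sobieski_mdf_history.h5")

    # Post-process the saved problem later, e.g. in another script.
    # The gradients must already be present in the HDF5 file.
    execute_post("sobieski_mdf_history.h5", "GradientSensitivity", save=True, show=False)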
.. GENERATED FROM PYTHON SOURCE LINES 120-128

.. tip::

   Each post-processing method requires different inputs and offers a variety of
   customization options. Use the high-level function
   :func:`.get_post_processing_options_schema` to print a table with the options for
   any post-processing algorithm, or refer to our dedicated page:
   :ref:`gen_post_algos`.
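For instance, the options of :class:`.GradientSensitivity` could be listed as in the
minimal sketch below, assuming the ``pretty_print`` argument is available to render
the schema as a table.

.. code-block:: Python

    from gemseo import get_post_processing_options_schema

    # Print the available options of the GradientSensitivity post-processor.
    get_post_processing_options_schema("GradientSensitivity", pretty_print=True)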
.. GENERATED FROM PYTHON SOURCE LINES 128-134

.. code-block:: Python

    scenario.post_process(
        "GradientSensitivity",
        compute_missing_gradients=True,
        save=False,
        show=True,
    )

.. image-sg:: /examples/post_process/algorithms/images/sphx_glr_plot_gradient_sensitivity_001.png
   :alt: Derivatives of objective and constraints with respect to design variables, -y_4, g_1_0, g_1_1, g_1_2, g_1_3, g_1_4, g_1_5, g_1_6, g_2, g_3_0, g_3_1, g_3_2, g_3_3
   :srcset: /examples/post_process/algorithms/images/sphx_glr_plot_gradient_sensitivity_001.png
   :class: sphx-glr-single-img

.. rst-class:: sphx-glr-script-out

.. code-block:: none

    /home/docs/checkouts/readthedocs.org/user_builds/gemseo/envs/5.2.0/lib/python3.9/site-packages/gemseo/post/gradient_sensitivity.py:214: UserWarning: set_ticklabels() should only be used with a fixed number of ticks, i.e. after set_ticks() or using a FixedLocator.
      axe.set_xticklabels(design_names, fontsize=font_size, rotation=rotation)

.. rst-class:: sphx-glr-timing

   **Total running time of the script:** (0 minutes 2.225 seconds)


.. _sphx_glr_download_examples_post_process_algorithms_plot_gradient_sensitivity.py:

.. only:: html

  .. container:: sphx-glr-footer sphx-glr-footer-example

    .. container:: sphx-glr-download sphx-glr-download-jupyter

      :download:`Download Jupyter notebook: plot_gradient_sensitivity.ipynb <plot_gradient_sensitivity.ipynb>`

    .. container:: sphx-glr-download sphx-glr-download-python

      :download:`Download Python source code: plot_gradient_sensitivity.py <plot_gradient_sensitivity.py>`

.. only:: html

  .. rst-class:: sphx-glr-signature

    `Gallery generated by Sphinx-Gallery <https://sphinx-gallery.github.io>`_