{ "cells": [ { "cell_type": "markdown", "metadata": {}, "source": [ "\n# Gradient Sensitivity\n\nIn this example, we illustrate the use of the :class:`.GradientSensitivity`\nplot on Sobieski's SSBJ problem.\n" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "collapsed": false }, "outputs": [], "source": [ "from __future__ import annotations\n\nfrom gemseo import configure_logger\nfrom gemseo import create_discipline\nfrom gemseo import create_scenario\nfrom gemseo.problems.sobieski.core.design_space import SobieskiDesignSpace" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Import\nThe first step is to import some high-level functions\nand the class defining the design space.\nWe then configure the logger.\n\n" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "collapsed": false }, "outputs": [], "source": [ "configure_logger()" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Description\n\nThe :class:`.GradientSensitivity` post-processor\nbuilds histograms of the derivatives of the objective and the constraints.\n\n" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Create disciplines\nAt this point, we instantiate the disciplines of Sobieski's SSBJ problem:\nPropulsion, Aerodynamics, Structure and Mission.\n\n" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "collapsed": false }, "outputs": [], "source": [ "disciplines = create_discipline([\n \"SobieskiPropulsion\",\n \"SobieskiAerodynamics\",\n \"SobieskiStructure\",\n \"SobieskiMission\",\n])" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Create design space\nWe also create the :class:`.SobieskiDesignSpace`.\n\n" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "collapsed": false }, "outputs": [], "source": [ "design_space = SobieskiDesignSpace()" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Create and execute scenario\nThe next step is to build an MDO scenario in order to maximize the 
range,\nencoded as ``\"y_4\"``, with respect to the design parameters, while satisfying the\ninequality constraints ``\"g_1\"``, ``\"g_2\"`` and ``\"g_3\"``. We use the MDF\nformulation, the SLSQP optimization algorithm and a maximum number of iterations\nequal to 10.\n\n" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "collapsed": false }, "outputs": [], "source": [ "scenario = create_scenario(\n disciplines,\n formulation=\"MDF\",\n objective_name=\"y_4\",\n maximize_objective=True,\n design_space=design_space,\n)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "The differentiation method used by default is ``\"user\"``, which means that the\ngradient will be evaluated from the Jacobian defined in each discipline. However, some\ndisciplines may not provide one; in that case, the gradient may be approximated\nwith the ``\"finite_differences\"`` or ``\"complex_step\"`` technique via the method\n:meth:`~.Scenario.set_differentiation_method`. The following line is shown as an\nexample; it has no effect because it does not change the default method.\n\n" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "collapsed": false }, "outputs": [], "source": [ "scenario.set_differentiation_method()\nfor constraint in [\"g_1\", \"g_2\", \"g_3\"]:\n scenario.add_constraint(constraint, \"ineq\")\nscenario.execute({\"algo\": \"SLSQP\", \"max_iter\": 10})" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Post-process scenario\nLastly, we post-process the scenario by means of the :class:`.GradientSensitivity`\npost-processor, which builds histograms of the derivatives of the objective and the constraints.\nThe sensitivities shown in the plot are calculated from the gradient at the optimum,\nor at the least infeasible point when the result is not feasible. One may choose any\nother iteration instead.\n\n
In some cases, the iteration used to compute the sensitivities\n corresponds to a point for which the algorithm did not request the evaluation of\n the gradients, and a ``ValueError`` is raised. A way to avoid this issue is to set\n the option ``compute_missing_gradients`` of :class:`.GradientSensitivity` to\n ``True``; this way, |g| will compute the gradients for the requested iteration if\n they are not available.
Please note that this extra computation may be expensive depending on the\n :class:`.OptimizationProblem` defined by the user. Additionally, keep in mind that\n |g| cannot compute missing gradients for an :class:`.OptimizationProblem` that was\n imported from an HDF5 file.