gemseo.post.gradient_sensitivity_settings module#

Settings for the gradient sensitivity post-processing.

Settings GradientSensitivity_Settings(*, save=True, show=False, file_path='', directory_path='', file_name='', file_extension='', fig_size=(10.0, 10.0), iteration=None, scale_gradients=False, compute_missing_gradients=False)[source]#

Bases: BasePostSettings

Create a new model by parsing and validating input data from keyword arguments.

Raises pydantic_core.ValidationError if the input data cannot be validated to form a valid model.

self is explicitly positional-only to allow self as a field name.

Parameters:
  • save (bool) --

    By default it is set to True.

  • show (bool) --

    By default it is set to False.

  • file_path (Path | str) --

    By default it is set to "".

  • directory_path (Path | str) --

    By default it is set to "".

  • file_name (str) --

    By default it is set to "".

  • file_extension (str) --

    By default it is set to "".

  • fig_size (tuple[Annotated[float, Gt(gt=0)], Annotated[float, Gt(gt=0)]]) --

    By default it is set to (10.0, 10.0).

  • iteration (Annotated[int, Lt(lt=0)] | Annotated[int, Gt(gt=0)] | None) --

    By default it is set to None.

  • scale_gradients (bool) --

    By default it is set to False.

  • compute_missing_gradients (bool) --

    By default it is set to False.

Return type:

None
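
For illustration, here is a minimal sketch of building such a settings model; the import path follows this module's name and the field values are arbitrary examples.

    from gemseo.post.gradient_sensitivity_settings import GradientSensitivity_Settings

    # Plot the sensitivities at the penultimate iteration, scale each gradient
    # with respect to the design variables and save the figure under a custom name.
    settings = GradientSensitivity_Settings(
        iteration=-2,
        scale_gradients=True,
        file_name="gradient_sensitivity",
    )

    # The model validates its inputs: e.g. iteration=0 (neither negative nor
    # positive) or a non-positive fig_size entry raises pydantic_core.ValidationError.
    print(settings.iteration, settings.scale_gradients)

Such a model is then typically passed to the gradient sensitivity post-processor, e.g. through execute_post or the scenario's post_process method depending on the GEMSEO version; that step is not shown here.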

compute_missing_gradients: bool = False#

Whether to compute the gradients at the selected iteration if they were not computed by the algorithm.

Warning: Activating this option may add considerable computation time depending on the cost of the gradient evaluation. This option will not compute the gradients if the OptimizationProblem instance was imported from an HDF5 file. This option requires an OptimizationProblem with a gradient-based algorithm.

iteration: NegativeInt | PositiveInt | None = None#

The iteration at which to plot the sensitivities. Either positive or negative indexing can be used, e.g. 5 for the 5th iteration or -2 for the penultimate one. If None, the iteration of the optimum is used.
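
A short sketch of the two indexing conventions (illustrative values only):

    # Fifth iteration of the optimization history.
    GradientSensitivity_Settings(iteration=5)

    # Penultimate iteration, via negative indexing.
    GradientSensitivity_Settings(iteration=-2)

    # Default (None): the iteration of the optimum.
    GradientSensitivity_Settings()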

scale_gradients: bool = False#

Whether to normalize each gradient w.r.t. the design variables.

model_post_init(context, /)#

Initialize private attributes and call the user-defined model_post_init method.

Parameters:
  • self (BaseModel)

  • context (Any)

Return type:

None