Changelog#
All notable changes to this project will be documented here.
The format is based on Keep a Changelog and this project adheres to Semantic Versioning.
Version 6.0.0 (2024-11-08)#
Added#
Optimization & DOE#
- HessianHistory is an OptPostProcessor to visualize the history of the diagonal of the Hessian matrix of the objective. #463
- The optimizer MultiStart enables multi-start optimization by combining an optimization algorithm and a DOE algorithm. #645
- EvaluationProblem, of which OptimizationProblem is a subclass, evaluates MDOFunctions over a DesignSpace and stores the evaluations in a Database. #678
- The drivers can now take the option store_jacobian (default: True). When set to False, the Jacobian matrices are not saved in the database. This reduces the RAM requirements for large optimization problems (see the sketch after this list). #1094
- The maximum dimension of a design space to be logged can be passed as the optional argument max_design_space_dimension_to_log of BaseDriverLibrary.execute. Hence it can be passed to any DOE library or optimization library as the option max_design_space_dimension_to_log. #1163
- The MNBI multi-objective algorithm can be used to refine a specific part of the Pareto front. #1171
- The sample_disciplines function samples a system of disciplines based on an MDO formulation (default: MDF) and DOE settings (formerly in gemseo-mlearning). #1180
- ProblemFunction is an MDOFunction wrapping another MDOFunction. Its func and jac methods call the corresponding methods of the underlying MDOFunction after pre-processing the input value and before post-processing the output value. Each function attached to an EvaluationProblem is replaced by a ProblemFunction when calling EvaluationProblem.preprocess_functions. Calls to the function can be counted via the enable_statistics and n_calls attributes. #1222
- BaseToleranceTester is a new base class to check whether a tolerance criterion is met, with specific subclasses: DesignToleranceTester, ObjectiveToleranceTester and KKTConditionsTester. #1224
- All the figures generated by OptHistoryView have a vertical red line representing the optimum iteration. #1236
- Add support for SciPy's version of the COBYQA algorithm. #1261
- SciPy's algorithms L-BFGS-B, NELDER-MEAD, TNC and SLSQP support the stop_crit_n_x option. #1265
- AlgorithmDescription and its derived classes now include the class attribute AlgorithmDescription.settings to store algorithm-specific settings. This is especially useful for algorithm libraries that wrap different algorithms in a single class. The following linear solvers have been added to ScipyLinalgAlgos: Conjugate Gradient (CG), Conjugate Gradient Stabilized (CGS), Generalized Conjugate Residual with Optimal Truncation (GCROT) and Transpose-Free Quasi-Minimum Residual (TFQMR).
- The ScipyLinprog library now handles integer variables. #1450
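A minimal sketch of the two driver options mentioned above, assuming a scenario object already built with the high-level API and the keyword-based execute signature of GEMSEO 6; the algorithm name and values are arbitrary and the exact call may differ in your setup:

```python
# Hypothetical scenario created elsewhere with create_scenario(...).
# store_jacobian and max_design_space_dimension_to_log are the options named in this changelog.
scenario.execute(
    algo_name="SLSQP",
    max_iter=50,
    store_jacobian=False,  # do not keep Jacobian matrices in the database (saves RAM)
    max_design_space_dimension_to_log=10,  # only log design spaces up to dimension 10
)
```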
MDO processes#
- The MDA settings can now be passed at instantiation as a Pydantic model as well as key/value pairs. The following properties have been added to BaseMDASolver: acceleration_method and over_relaxation_factor.
- The JobSchedulerDisciplineWrapper and the deserialize_and_run entry point implement _compute_jacobian. #1191
- RemappingDiscipline accepts an empty name mapping, in which case no remapping is applied. #1197
- BackupSettings is a dataclass to pass the backup settings to functions relying on BaseScenario.set_optimization_history_backup, e.g. sample_disciplines and BaseSensitivityAnalysis.compute_samples. #1204
- The differentiated_input_names_substitute argument of BaseFormulation, BaseMDOFormulation, MDF, IDF, BiLevel and DisciplinaryOpt defines the names of the differentiated inputs; if empty, the formulation considers all the inputs. It enables derivatives to be computed with respect to discipline inputs that are not in the design space attached to the formulation. #1207
- ArrayBasedFunctionDiscipline wraps a function whose unique argument and return value are both NumPy arrays. #1219
- Add the GEMSEO Web study GUI to the documentation. #1232
- XDSMizer can be applied to any discipline, not just scenarios. This is useful for seeing the dataflow and workflow of a multidisciplinary process, e.g. MDAJacobi. generate_xdsm can generate the XDSM of any discipline (see the sketch after this list). #1257
- The IDF formulation now accepts options at instantiation for the MDAChain executed at the beginning of the process when start_at_equilibrium=True. #1259
- BaseMDARoot has a new argument execute_before_linearizing to execute the disciplines before linearizing them. #1278
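A minimal sketch of generating the XDSM of a plain discipline (here an MDA, not a scenario), assuming generate_xdsm and create_mda are available in the high-level gemseo namespace; names and signatures are indicative only:

```python
from gemseo import create_discipline, create_mda, generate_xdsm

# Hypothetical process: an MDA over the Sellar disciplines.
disciplines = create_discipline(["Sellar1", "Sellar2", "SellarSystem"])
mda = create_mda("MDAJacobi", disciplines)

# Generate the XDSM of the MDA itself, without wrapping it in a scenario.
generate_xdsm(mda)
```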
Surrogate models#
- The package gemseo.utils.metrics includes metrics to compare two quantities:
  - The base class is BaseMetric and its factory is MetricFactory.
  - BaseCompositeMetric is the base class for metrics relying on another metric.
  - The ElementWiseMetric is a composite metric to compare two collections using an underlying metric; it returns a collection.
  - DatasetMetric is a composite metric to compare two Datasets row-wise using an underlying metric; it returns a Dataset.
  - The MeanMetric is a composite metric to compare two collections using an underlying metric; it returns an array.
  - The SquaredErrorMetric is a composite metric returning the squared difference between two quantities.
- The quality of an MLRegressorAlgo can be assessed by plotting its cross-validation error. #1122
- MAEMeasure and MEMeasure are respectively the mean absolute error and the maximum error to assess the quality of a BaseRegressor (formerly in gemseo_mlearning); see the sketch after this list. GradientBoostingRegressor wraps scikit-learn's gradient boosting regressor (formerly in gemseo_mlearning). MLPRegressor wraps scikit-learn's multilayer perceptron (MLP) (formerly in gemseo_mlearning). OTGaussianProcessRegressor wraps OpenTURNS' Gaussian process (GP) regressor (formerly in gemseo_mlearning). SVMRegressor wraps scikit-learn's support vector machine (SVM) regressor (formerly in gemseo_mlearning). TPSRegressor is a specific RBFRegressor for thin plate spline (TPS) regression (formerly in gemseo_mlearning). RegressorChain is a composition of BaseRegressor objects trained sequentially, each one learning the relationship between the inputs and the error made by the previous one (formerly in gemseo_mlearning). #1128
- Quality assessment:
  - ClassifierQualityFactory is a factory of objects to assess the quality of classification algorithms.
  - ClustererQualityFactory is a factory of objects to assess the quality of clustering algorithms.
  - BaseClassifierQuality is the base class to assess the quality of classification algorithms.
- OptimizationProblem.evaluation_counter is an EvaluationCounter to count the number of times a new iteration is stored in OptimizationProblem.database. #1135
- OTGaussianProcessRegressor has a new method compute_samples to generate samples from the conditioned Gaussian process. #1140
- Resampler.plot can be used to visualize the train-test partitions. #1156
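A minimal sketch of assessing a trained regressor with one of the error measures listed above; the import path is an assumption, while MAEMeasure and the compute_learning_measure method are named in this changelog (the latter in the 5.1.0 notes below):

```python
def learning_mae(regressor):
    """Return the mean absolute learning error of a trained BaseRegressor.

    The import path below is an assumption and may differ in your GEMSEO version.
    """
    from gemseo.mlearning.regression.quality.mae_measure import MAEMeasure

    return MAEMeasure(regressor).compute_learning_measure()
```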
Sensitivity analysis#
- SobolAnalysis.compute_indices can estimate the Sobol' indices more precisely using control variates. #1185
- Use the optional argument n_replicates of SobolAnalysis.compute_indices to set the number of replicates used to estimate the confidence intervals by bootstrap (see the sketch after this list). #1189
- BaseSensitivityAnalysis and its subclasses (MorrisAnalysis, SobolAnalysis, CorrelationAnalysis and HSICAnalysis) handle backup and crash management with the backup_settings argument of the compute_samples method. MorrisDOE is a DOE algorithm used by MorrisAnalysis; it repeats a one-at-a-time (OAT) sampling N times, starting from N different points selected from a DOE algorithm. OATDOE is a DOE algorithm used by MorrisDOE; it applies the one-at-a-time (OAT) sampling strategy, given an initial point. #1213
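A minimal sketch of the n_replicates argument, assuming a SobolAnalysis object named analysis whose samples have already been computed; the other arguments of compute_indices are left to their defaults:

```python
# Hypothetical analysis object; n_replicates is the argument named in this changelog.
indices = analysis.compute_indices(n_replicates=200)  # 200 bootstrap replicates for the confidence intervals
```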
Miscellaneous#
- Support for Python 3.12. #1530
- compare_dict_of_arrays returns False in the case of variables with arrays of different sizes. #1049
- MDANewtonRaphson.NewtonLinearSolver is the enumeration of linear solvers for the Newton method. #1084
- MDOLinearFunction.normalize transforms a basic MDOLinearFunction into an MDOLinearFunction using a scaled input vector. #1104
- Sellar problem:
  - The local design variables and the coupling variables are vectors of dimension \(n\) (default: 1), instead of scalars.
  - Sellar2 has a new local design variable \(x_2\) which also appears in the objective expression (default: 0).
  - The disciplines Sellar1 and Sellar2 have a coefficient \(k\) to control the strength of the coupling (default: 1).
  - The coefficient 0.2 in Sellar1 is now an input variable named \(\gamma\) (default: 0.2).
  - The coefficient 3.16 in SellarSystem is now an input variable named \(\alpha\) (default: 3.16).
  - The coefficient 24.0 in SellarSystem is now an input variable named \(\beta\) (default: 24.0).
- The function to_pickle saves the pickled representation of an object on the disk. The function from_pickle loads the pickled representation of an object from the disk (see the sketch after this list).
- BaseFormulation.variable_sizes stores the sizes of both the design variables and the differentiated inputs. #1230
- String tools:
  - gemseo.utils.string_tools.get_name_and_component returns a (name, component) tuple from a name or a (name, component) object.
  - gemseo.utils.string_tools.convert_strings_to_iterable returns an iterable of strings from a string or an iterable of strings.
  - gemseo.utils.string_tools.filter_names filters original names from a selection of names to keep, preserving their order.
  - gemseo.utils.string_tools.get_variables_with_components returns a collection of (name, component) objects from a name, a (name, component) object or a collection of such objects.
- The function import_database can create either a Dataset or a Database from the HDF5 file storing the Database associated with an EvaluationProblem. Database has an optional argument called input_space to define the input space to which the input vectors belong; by default, this input space includes a unique variable called Database.DEFAULT_INPUT_NAME. #1303
- The execute method of post-processing tools (see BasePost) returns a dictionary of Matplotlib figures or DatasetPlot objects, depending on whether or not the tool is based on a DatasetPlot. This allows interactive visualization in a web page when the HTML format is supported by the DatasetPlot. This is the case of BasicHistory, whose HTML version is based on the plotly library. When available, set file_extension to "html". #1308
- DesignSpace.to_scalar_variables() creates a design space of scalar variables from a design space. #1333
- generate_coupling_graph does not save the graph when file_path is empty. This can be useful when one simply wants to use the returned GraphView, e.g. to display it in a web page or a notebook. #1341
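A minimal sketch of the pickling helpers mentioned above, assuming they are exposed in the high-level gemseo namespace and take (object, path) and (path) arguments; the discipline and file names are just examples:

```python
from gemseo import create_discipline, from_pickle, to_pickle

discipline = create_discipline("SellarSystem")
to_pickle(discipline, "sellar_system.pkl")      # save the pickled representation to disk
discipline = from_pickle("sellar_system.pkl")   # load it back
```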
Fixed#
Optimization & DOE#
- The Database is now correctly appended at each iteration when scalar and vectorial outputs are mixed. #1194
- DesignSpace.filter_dimensions (formerly DesignSpace.filter_dim) updates the mapping DesignSpace.__names_to_indices. #1218
- CenteredDifferences.f_gradient works when CenteredDifferences is not instantiated from a DesignSpace. #1253
- The algorithm MNBI correctly handles the option normalize_design_space. #1255
- When the computation of the Lagrange multipliers crashes with scipy.optimize.nnls, fall back to scipy.optimize.lsq_linear. #1340
- The algorithm "Augmented_Lagrangian_order_1" now updates all the Lagrange multipliers, even those corresponding to constraints that are inactive at the current iteration. #1342
- DesignSpace.normalize no longer erroneously flags design variables with equal lower and upper bounds as not to be normalized. #1346
MDO processes#
- When building the warm-started chain, the variables coming from both MDA 1 and MDA 2 are now considered. #1116
- The method MDA.plot_residual_history() no longer crashes when the MDA has been executed more than once and the n_iterations argument is smaller than the total number of iterations. #1157
- The DesignSpace attached to the MDF formulation can be reset. #1179
- MDOParallelChain handles disciplines whose inputs are not NumPy arrays. #1209
- Remove the update of local data performed at the end of the MDAGaussSeidel. #1251
- When changing the residual scaling method of an MDASequential or an MDAChain, the change is applied to all inner MDAs. #1280
- MDAChain correctly runs with initialize_defaults set to True when some outputs of the chain are not coupling variables. #1281
- The tolerance and linear_solver_tolerance arguments are correctly passed to the main and sub-MDAs in MDAGSNewton. #1294
- Local data in MDAs are no longer modified via self.local_data[key] = value but rather using the store_local_data method to ensure proper handling of namespaces. #1309
- The XDSM generation in PDF format can now handle an MDA within an MDOParallelChain with the correct workflow. #1441
Surrogate models#
Miscellaneous#
- The DependencyGraph represents homonymous disciplines with specific nodes. #1264
- BaseAlgorithmLibrary and its derived classes no longer allow users to pass settings that are not included in the validation model. In the past, unknown settings were passed down to the wrapped algorithm, which was error-prone. The ineq_tolerance was not used for the Scipy_MILP algorithm; it is now. #1450
Changed#
- Database.get_function_history raises a KeyError when the database contains no value of the output, instead of returning an empty array.
- RemappingDiscipline uses the same types for the grammar elements as the grammars of the underlying discipline. #1095
- IODataset: the "group name, default variable name" pairs for the IODataset are "i, inputs" and "o, outputs". OptimizationDataset: the "group name, default variable name" pairs for the OptimizationDataset are "d, designs", "o, observables", "f, objectives" and "c, constraints". #1275
- XDSM: the pdf_cleanup option removes all intermediate files, including .tikz and .tex files. #1263
- The convergence of MDAs is now checked after the execution of the disciplines instead of after the acceleration. This change affects the absolute value of the output coupling variables when running an MDA with acceleration methods. The difference has an order of magnitude at most equal to the MDA tolerance. The change should be harmless in most cases but could have effects for numerically sensitive problems. #1251
- The post-processing algorithms use the base-10 logarithm. The post-processing algorithms using Matplotlib use the tight_layout() command. #593
Removed#
API Changes#
See Upgrading GEMSEO for more information.
Version 5.3.2 (2024-08-08)#
Added#
- The IDF formulation now accepts options at instantiation for the MDAChain executed at the beginning of the process when start_at_equilibrium=True. #1259
Fixed#
- Remove the update of local data performed at the end of the MDAGaussSeidel. #1251
- BaseScenario.to_dataset can export the Database to a Dataset in the presence of homonymous inputs and outputs.
Changed#
- The convergence of MDAs is now checked after the execution of the disciplines instead of after the acceleration. This change affects the absolute value of the output coupling variables when running an MDA with acceleration methods. The difference has an order of magnitude at most equal to the MDA tolerance. The change should be harmless in most cases but could have effects for numerically sensitive problems. #1251
Version 5.3.1 (2024-06-06)#
Fixed#
- The method OptimizationProblem.to_dataset now considers the type of each variable. #1154
- GEMSEO no longer sets the maxiter option of TNC, which does not exist, but maxfun. #1181
- The Database export to an HDF5 file works in "append" mode and when storing data successively for the same input value. #1216
Version 5.3.0 (2024-03-28)#
Added#
- The SciPy implementation of the Nelder-Mead gradient-free algorithm is now available. #875
- HSICAnalysis.compute_indices also computes the p-values for screening purposes in two different ways: through permutations and from an asymptotic formula. #992
- Added the modified Normal Boundary Intersection (mNBI) multi-objective optimization algorithm, for use with "MNBI" as algorithm name. Added the class MultiObjectiveOptimizationResult to store and display results from optimization problems with more than one objective. Added the class ParetoFront to store the points of interest of a MultiObjectiveOptimizationResult. Added the Poloni analytical multi-objective optimization problem. Added the FonsecaFlemming analytical multi-objective optimization problem. Added the Viennet analytical multi-objective optimization problem. #1012
- The method DataConverter.is_continuous can tell if a variable is continuous. #1066
- The Dataset class can now be created from a DataFrame using the new class method from_dataframe. #1069
- The boolean attribute DatasetPlot.grid allows a grid to be added to the DatasetPlot. #1074
- HSICAnalysis.compute_indices proposes two new sensitivity analysis (SA) types, namely conditional SA and target SA. #1078
- Lines and BarPlot have HTML-based interactive versions based on plotly. #1082
- The method AbstractCache.get_all_entries returns all the entries, whatever the tolerance. The module gemseo.typing contains common type annotations. #1090
- Dataset.from_csv has a new argument first_column_as_index, which permits reading CSV files that contain the index as the first column (see the sketch after this list). #1093
- MDOFunction supports elementwise multiplication and division by NumPy arrays. Addition, subtraction, multiplication and division of a function expecting normalized inputs with a function that does not expect them raises RuntimeError. The function values \(f(x)\) passed to optimizers can optionally be scaled relative to \(|f(x_0)|\), where \(x_0\) is the current value of the DesignSpace. This functionality is enabled by passing a positive value \(\epsilon\) as the option scaling_threshold of any optimizer: the function values passed to the optimizer are then \(f(x) / \max\{ |f(x_0)|, \epsilon \}\). The purpose of \(\epsilon\) is to avoid division by a value close to zero. The (default) value None for the option scaling_threshold disables the scaling. #1100
- DOELibrary.compute_doe and compute_doe can use a variables space dimension instead of a variables space. #1102
- Each DOE algorithm available in GEMSEO has a new option named "callbacks" to pass a list of functions to be evaluated after each call to OptimizationProblem.evaluate_functions. #1111
- Implement "hdf_node_path" for "opt_problem", "design_space" and "database". Allow "opt_problem" to be exported to or imported from a node in a specified HDF file. #1119
- The plotting methods of the SensitivityAnalysis and Statistics classes return DatasetPlot objects or figures from data visualization libraries, e.g. Matplotlib and Plotly. A Plotly figure can be passed to a Plotly plot. #1121
- OptimizationProblem.get_all_functions has a new argument original to return the original functions given to the problem. #1126
- A property figures allows the figures generated by a DatasetPlot to be retrieved. #1130
- DriverLibrary.clear_listeners removes the listeners added by the DriverLibrary from the Database attached to the OptimizationProblem. #1134
- The Seeder class is a seed generator for features using random numbers, e.g. DOELibrary; its get_seed method returns either the user seed or the initial seed incremented by the number of calls. #1148
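A minimal sketch of reading a CSV file whose first column stores the index, using the first_column_as_index argument named above; the file name is hypothetical, the import path is an assumption and the other from_csv arguments are left to their defaults:

```python
from gemseo.datasets.dataset import Dataset

# "doe_history.csv" is a hypothetical file whose first column is the index.
dataset = Dataset.from_csv("doe_history.csv", first_column_as_index=True)
```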
Fixed#
- The stratified DOE algorithms OT_AXIAL, OT_COMPOSITE and OT_FACTORIAL correctly support the arguments n_samples, centers, dimension and levels. #88
- The MDODiscipline can be linearized after execution when its cache_type is set to MDODiscipline.CacheType.None and both the inputs and outputs arguments of linearize are empty. #804
- MDAJacobi and MDAGaussSeidel now have different XDSM representations which are in line with the convention proposed in [LM12]. MDAChain is no longer represented in the XDSM. Add tests for the PDF generation of XDSMs. Bug fixes for XDSM PDF generation. #1062
- The expression LinearComposition.expr is now correct. #1063
- Non-continuous variables can no longer be differentiated. #1066
- The upper-bound KS aggregation function is really called when aggregating constraints in an OptimizationProblem. #1075
- The user no longer has to provide an initial design point to solve an optimization problem with gradient approximation. #1076
- The method Scenario.set_optimization_history_backup() no longer causes the execution to crash at the first iteration when the Scenario includes equality or inequality constraints. #1089
- The DirectoryCreator can now consider a non-existing root directory while using DirectoryNamingMethod.NUMBERED. #1097
- The missing closing parenthesis in the expression of Rosenbrock is no longer missing. Addition, subtraction, multiplication and division of functions expecting normalized inputs yield functions expecting normalized inputs. #1100
- CustomDOE.compute_doe no longer raises an error and works correctly. #1103
- The axes generated by EmpiricalStatistics.plot_cdf are no longer switched. #1105
- The BiLevel formulation can now be warm-started even when the MDA1 does not exist (case of weakly coupled disciplines). #1107
- The attribute ParameterSpace.distributions is correctly updated when renaming a random variable. #1108
- OptimizationProblem.database is not used when use_database is False in the case of a DOELibrary. #1110
- Dataset.to_dict_of_arrays no longer raises an AttributeError when both by_entry and by_group are True, and works properly. #1112
- When function_calls is True, OptimizationProblem.reset resets the number of calls of the original functions. #1126
- DesignSpace.rename_variable can be applied to a variable without value. #1127
- Requesting an optimized LHS with size 1 raises an exception instead of freezing. #1133
- Database cannot store the same listener several times. #1134
- The Alternate2Delta method now handles degenerate (ill-conditioned) least-squares problems. In this case, the method now returns the iterate without transformations. #1137
- Dataset.add_group works correctly when variable_names defines a single variable. #1138
- The transformers passed to an MLAlgo are correctly applied when the fit_transformers argument of the learn method is False. #1146
- The MDOChain Jacobian is made reproducible by summing the composite derivative terms in an order that does not depend on the code execution. #1150
- The PydanticGrammar was not able to validate DisciplineData objects. #1153
Changed#
- The methods OptimizationProblem.from_hdf() and OptimizationProblem.to_hdf() no longer log messages when they are called. The method Database.to_ggobi() no longer logs messages when it is called. #579
- The option disp of the SciPy algorithms shall now be passed as a boolean instead of an integer. #875
- The method Scenario.set_optimization_history_backup() now starts generating plots only after the first two iterations have been computed. The OptHistoryView plots created by Scenario.set_optimization_history_backup() with the option generate_opt_plot are no longer updated at each Database.store(), only at each new iteration. #1089
- API change: AbstractCache._{INPUTS,OUTPUTS,JACOBIAN}_GROUP has been replaced by AbstractCache.Group. #1090
- The methods execute and linearize of gemseo.problems.sobieski.core.structure.SobieskiStructure catch the ValueError raised by the computation of the logarithm of a non-positive weight ratio. The method execute returns numpy.nan for the mass term. #1101
- It is now possible to solve MDA instances that include non-numeric couplings (weak or strong), typically strings or arrays of strings. The non-numeric couplings are automatically filtered during the numerical solution of the MDA. A warning message is shown in the log at DEBUG level with the variables that were filtered. #1124
- Database.clear_listeners returns the listeners after removing them from the Database. #1134
- OptimizationProblem.objective = mdo_function sets mdo_function.f_type to mdo_function.FunctionType.OBJ; no need to do it by hand anymore. #1141
- The argument uniform_distribution_name of IshigamiProblem and IshigamiSpace allows a uniform distribution from a library other than SciPy, e.g. OpenTURNS, to be used. #1143
- API change: SEED moved to gemseo.utils.seeder. #1148
Removed#
- Support for the reStructuredText docstring format.
- The function get_default_option_values; use inspect.get_callable_argument_defaults(cls.__init__) instead of get_default_option_values(cls). #1059
Version 5.2.0 (2023-12-20)#
Added#
- Setting file_format="html" in DatasetPlot.execute saves on the disk and/or displays in a web browser a plotly-based interactive plot. DatasetPlot.DEFAULT_PLOT_ENGINE is set to PlotEngine.MATPLOTLIB; this is the default plot engine used by DatasetPlot. DatasetPlot.FILE_FORMATS_TO_PLOT_ENGINES maps the file formats to the plot engines to override the default plot engine. #181
- Add the OptimizationProblem.get_last_point method to get the last point of an optimization problem. #285
- The disciplines Concatenater, LinearCombination and Splitter now have sparse Jacobians. #423
- The method EmpiricalStatistics.plot_barplot generates a boxplot for each variable. The method EmpiricalStatistics.plot_cdf draws the cumulative distribution function for each variable. The method EmpiricalStatistics.plot_pdf draws the probability density function for each variable. #438
- MLRegressorQualityViewer proposes various methods to plot the quality of an MLRegressionAlgo. DatasetPlot.execute can use a file name suffix. SurrogateDiscipline.get_quality_viewer returns an MLRegressorQualityViewer. #666
- ScatterMatrix can set any option of the pandas scatter_matrix function. ScatterMatrix can add trend curves on the scatter plots, with either the enumeration ScatterMatrix.Trend or a custom fitting technique. Scatter can add a trend curve, with either the enumeration Scatter.Trend or a custom fitting technique. #724
- ScenarioResult is a new concept attached to a Scenario. It enables the results of a given scenario to be post-processed more specifically. In particular, ScenarioResult can be derived in order to implement dedicated post-treatments depending on the formulation.
  - OptimizationResult.from_optimization_problem creates an OptimizationResult from an OptimizationProblem.
  - BaseFormulation.DEFAULT_SCENARIO_RESULT_CLASS_NAME is the name of the default OptimizationResult class to be used with the given formulation.
  - ScenarioResult stores the result of a Scenario from a Scenario or an HDF5 file.
  - BiLevelScenarioResult is a ScenarioResult to store the result of a Scenario using a BiLevel formulation.
  - ScenarioResultFactory is a factory of ScenarioResult.
  - Scenario.get_result returns the result of the execution of the Scenario as a ScenarioResult.
  - create_scenario_result stores the result of a Scenario from a Scenario or an HDF5 file.
- The LinearCombination discipline now has a sparse Jacobian. #809
- The normalize option of BasicHistory scales the data between 0 and 1 before plotting them. #841
- The type of the coupling variables is no longer restricted to NumPy arrays thanks to data converters attached to grammars. #849
- gemseo.mlearning.sampling is a new package with resampling techniques, such as CrossValidation and Bootstrap. MLAlgo.resampling_results stores the resampling results; a resampling result is defined by a Resampler, the machine learning algorithms generated during the resampling stage and the associated predictions. The methods offered by MLQualityMeasure to estimate a quality measure by resampling have a new argument called store_resampling_result to store the resampling results and reuse them to estimate another quality measure faster. #856
- SciPyDOE is a new DOELibrary based on SciPy, with five algorithms: crude Monte Carlo, Halton sequence, Sobol' sequence, Latin hypercube sampling and Poisson disk sampling. #857
- When third-party libraries do not handle sparse Jacobians, a preprocessing step is used to convert them to dense NumPy arrays. #899
- R2Measure.evaluate_bootstrap is now implemented. #914
- Add diagrams in the documentation to illustrate the architecture and usage of ODEProblem. #922
- MDAs can now handle disciplines with matrix-free Jacobians. To define a matrix-free Jacobian, the user must fill in the MDODiscipline.jac dictionary with a JacobianOperator overloading the _matvec and _rmatvec methods to respectively implement the matrix-vector and transposed matrix-vector products. #940
- The SimplerGrammar is a grammar based on element names only. SimplerGrammar is even simpler than SimpleGrammar, which considers both names and types. #949
- HSICAnalysis is a new SensitivityAnalysis based on the Hilbert-Schmidt independence criterion (HSIC). #951
- Add the Augmented Lagrangian Algorithm implementation. #959
- Support for Python 3.11. #962
- Optimization problems with inequality constraints can be reformulated with only bounds and equality constraints and additional slack variables thanks to the public method OptimizationProblem.get_reformulated_problem_with_slack_variables. #963
- The subtitle of the graph generated by SobolAnalysis.plot includes the standard deviation of the output of interest in addition to its variance. #965
- OTDistributionFactory is a DistributionFactory limited to OTDistribution objects. SPDistributionFactory is a DistributionFactory limited to SPDistribution objects. The base_class_name attribute of get_available_distributions can limit the probability distributions to a specific library, e.g. "OTDistribution" for OpenTURNS and "SPDistribution" for SciPy. #972
- The use_one_line_progress_bar driver option allows only one iteration of the progress bar to be displayed at a time. #977
- OTWeibullDistribution is the OpenTURNS-based Weibull distribution. SPWeibullDistribution is the SciPy-based Weibull distribution. #980
- MDAChain has an option to initialize the default inputs by creating an MDOInitializationChain at first execution. #981
- The upper-bound KS function is added to the aggregation functions. The upper-bound KS function is an offset of the lower-bound KS function already implemented. #985
- The CenteredDifferences approximation mode is now supported for Jacobian computation. It can be used to compute MDODiscipline and MDOFunction Jacobians by setting the Jacobian approximation mode, as for the finite-difference and complex-step schemes. This is a second-order approach that uses twice as many points but has second-order accuracy with respect to the finite-difference scheme. When computing a centered difference at one of the two bounds of the design space, the finite-difference scheme is used instead. #987
- The class SobieskiDesignSpace deriving from DesignSpace can be used in the Sobieski SSBJ problem. It offers new filtering methods, namely filter_coupling_variables and filter_design_variables. #1003
- The MDODiscipline can flag linear relationships between inputs and outputs. This enables the FunctionFromDiscipline generated from these MDODisciplines to be instances of LinearMDOFunction. An OptimizationProblem is now by default a linear problem unless a non-linear objective or constraint is added to the optimization problem. #1008
- The following methods now have an option as_dict to request the return values as dictionaries of NumPy arrays instead of straight NumPy arrays: DesignSpace.get_lower_bounds, DesignSpace.get_upper_bounds, OptimizationProblem.get_x0_normalized and DriverLibrary.get_x0_and_bounds_vects (see the sketch after this list). #1010
- gemseo.SEED is the default seed used by GEMSEO for random number generators. #1011
- HiGHS solvers for linear programming interfaced by SciPy are now available. #1016
- Augmented Lagrangian can now pass some of the constraints to the sub-problem and deal with the rest of them thanks to the sub_problem_constraints option. #1026
- An example on the usage of the MDODiscipline.check_jacobian method was added to the documentation. Three derivative approximation methods are discussed: finite differences, centered differences and complex step. #1039
- The TaylorDiscipline class can be used to create the first-order Taylor polynomial of an MDODiscipline at a specific expansion point. #1042
- The following machine learning algorithms have an argument random_state to control the generation of random numbers: RandomForestClassifier, SVMClassifier, GaussianMixture, KMeans, GaussianProcessRegressor, LinearRegressor and RandomForestRegressor. Use an integer for reproducible results (default behavior). #1044
- BaseAlgoFactory.create initializes the grammar of algorithm options when it is called with an algorithm name. #1048
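A minimal sketch of the as_dict option on the design space bound getters introduced by #1010; the design space is assumed to exist already:

```python
def get_bounds_as_dicts(design_space):
    """Return the lower and upper bounds as {variable_name: bounds_array} dictionaries.

    as_dict is the option named in this changelog; without it, plain NumPy
    arrays are returned as before.
    """
    lower_bounds = design_space.get_lower_bounds(as_dict=True)
    upper_bounds = design_space.get_upper_bounds(as_dict=True)
    return lower_bounds, upper_bounds
```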
Fixed#
- There is no longer overlap between learning and test samples when using a cross-validation technique to estimate the quality measure of a machine learning algorithm. #915
- Security vulnerability when calling subprocess.run with shell=True. #948
- Fixed a bug in the LagrangeMultipliers evaluation when bound constraints are activated on variables which have only one bound. #964
- The iteration rate is displayed with appropriate units in the progress bar. #973
- AnalyticDiscipline casts SymPy outputs to appropriate NumPy data types (as opposed to systematically casting to float64). #974
- AnalyticDiscipline no longer systematically casts inputs to float. #976
- MDODiscipline.set_cache_policy can use MDODiscipline.CacheType.NONE as cache_type value to remove the cache of the MDODiscipline. #978
- The normalization methods of DesignSpace no longer emit a RuntimeWarning about a division by zero when the lower and upper bounds are equal. #1002
- The types used with PydanticGrammar.update_from_types with merge=True are taken into account. #1006
- DesignSpace.dict_to_array returns an ndarray whose dtype attribute matches the "common dtype" of the values of its dict argument design_values corresponding to the keys passed in its argument variables_names. So far, the dtype was erroneously based on all the values of design_values. #1019
- DisciplineData with nested dictionaries can now be serialized with json. #1025
- Full-factorial design of experiments: the actual number of samples computed from the maximum number of samples and the dimension of the design space is now robust to numerical precision issues. #1028
- DOELibrary.execute raises a ValueError when a component of the DesignSpace is unbounded and the DesignSpace is not a ParameterSpace. DOELibrary.compute_doe raises a ValueError when unit_sampling is False, a component of the design space is unbounded and the DesignSpace is not a ParameterSpace. #1029
- OptimizationProblem.get_violation_criteria no longer considers the non-violated components of the equality constraints when calculating the violation measure. #1032
- A JSONGrammar using namespaces can be serialized correctly. #1041
- RadarChart displays the constraints at iteration i when iteration=i. #1054
Changed#
- API:
  - The class RunFolderManager is renamed DirectoryGenerator.
  - The class FoldersIter is renamed Identifiers.
  - The signature of the class DirectoryGenerator has changed: folders_iter is replaced by identifiers; output_folder_basepath is replaced by root_directory.
- The subpackage gemseo.mlearning.data_formatters includes the DataFormatters used by the learning and prediction methods of the machine learning algorithms. #933
- The argument use_shell of the discipline DiscFromExe is no longer taken into account; executables are now always executed without a shell. #948
- The existing KS function aggregation is renamed lower_bound_KS. #985
- The log of the ProgressBar no longer displays the initialization of the progress bar. #988
- The samples option of the algorithm CustomDOE can be a 2D array shaped as (n_samples, total_variable_size), a dictionary shaped as {variable_name: variable_samples, ...} where variable_samples is a 2D array shaped as (n_samples, variable_size), or an n_samples-length list shaped as [{variable_name: variable_sample, ...}, ...] where variable_sample is a 1D array shaped as (variable_size,); see the sketch after this list. #999
- PydanticGrammar has been updated to support Pydantic v2. For such grammars, NumPy ndarrays shall be typed with gemseo.core.grammars.pydantic_ndarray.NDArrayPydantic instead of the standard ndarray or NDArray based on annotations. #1017
- The example on how to compute a Pareto front on the Binh-Korn problem now uses a BiLevel formulation instead of an MDOScenarioAdapter manually embedded into a DOEScenario. #1040
- ParameterSpace.__str__ no longer displays the current values, the bounds and the variable types when all the variables are uncertain. #1046
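The three accepted shapes for the samples option of CustomDOE described above can be sketched as follows; the variable names and values are arbitrary, and only the array shapes matter:

```python
from numpy import array

# Two variables: "x" of size 1 and "y" of size 2, sampled at 3 points.

# 1) A 2D array shaped (n_samples, total_variable_size) = (3, 3).
samples_as_array = array([[0.0, 1.0, 2.0], [0.5, 1.5, 2.5], [1.0, 2.0, 3.0]])

# 2) A dictionary of 2D arrays shaped (n_samples, variable_size).
samples_as_dict = {
    "x": array([[0.0], [0.5], [1.0]]),
    "y": array([[1.0, 2.0], [1.5, 2.5], [2.0, 3.0]]),
}

# 3) An n_samples-length list of {variable_name: 1D array of size variable_size}.
samples_as_list = [
    {"x": array([0.0]), "y": array([1.0, 2.0])},
    {"x": array([0.5]), "y": array([1.5, 2.5])},
    {"x": array([1.0]), "y": array([2.0, 3.0])},
]
```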
Removed#
- Support for Python 3.8. #962
Version 5.1.1 (2023-10-04)#
Security#
- Upgrade the dependency pillow to mitigate a vulnerability.
Version 5.1.0 (2023-10-02)#
Added#
- The argument scenario_log_level of MDOScenarioAdapter allows the level of the root logger to be changed during the execution of its scenario. The argument sub_scenarios_log_level of BiLevel allows the level of the root logger to be changed during the execution of its sub-scenarios. #370
- DesignSpace has a pretty HTML representation. #504
- The method add_random_vector() adds a random vector with independent components to a ParameterSpace from a probability distribution name and parameters. These parameters can be set component-wise. #551
- The high-level function create_dataset returns an empty Dataset by default with a default name. #721
- OptimizationResult has new fields x_0_as_dict and x_opt_as_dict binding the names of the design variables to their initial and optimal values. #775
- Enable the possibility of caching sparse Jacobians with the cache type HDF5Cache. #783
- Acceleration methods for MDAs are defined in dedicated classes inheriting from SequenceTransformer. Available sequence transformers are:
  - The alternate 2-δ method: Alternate2Delta.
  - The alternate δ² method: AlternateDeltaSquared.
  - The secant method: Secant.
  - The Aitken method: Aitken.
  - The minimum polynomial method: MinimumPolynomial.
  - The over-relaxation: OverRelaxation.
- The values of the constraints can be passed to the method OptimizationProblem.get_number_of_unsatisfied_constraints. #802
- RegressorQualityFactory is a factory of MLErrorMeasure. SurrogateDiscipline.get_error_measure returns an MLErrorMeasure to assess the quality of a SurrogateDiscipline; use one of its evaluation methods to compute it, e.g. evaluate_learn to compute a learning error. #822
- The DatasetFactory is a factory of Dataset. The high-level function create_dataset can return any type of Dataset. #823
- Dataset has a string property summary returning some information, e.g. number of entries, number of variable identifiers, etc. #824
- MLAlgo.__repr__ returns the same as MLAlgo.__str__ before this change, and MLAlgo.__str__ does not overload MLAlgo.__repr__. #826
- The method Dataset.to_dict_of_arrays can break down the result by entry with the boolean argument by_entry, whose default value is False. #828
- Added a SciPy MILP solver wrapper. #833
- DesignSpace.get_variables_indexes features a new optional argument use_design_space_order to switch the order of the indexes between the design space order and the user order. #850
- ScalableProblem.create_quadratic_programming_problem handles the case where uncertain vectors are added in the coupling equations. #863
- MDODisciplineAdapterGenerator can use a dictionary of variable sizes at instantiation. #870
- The multi-processing start method (spawn or fork) can now be chosen. #885
- Acceleration methods and over-relaxation are now available for MDAJacobi, MDAGaussSeidel and MDANewtonRaphson. They are configured at initialization via acceleration_method and over_relaxation_factor and can be modified afterward via the attributes MDA.acceleration_method and MDA.over_relaxation_factor. Available acceleration methods are Alternate2Delta, AlternateDeltaSquared, Aitken, Secant and MinimumPolynomial.
- CouplingStudyAnalysis has a new method generate_coupling_graph. The CLI gemseo-study generates the condensed and full coupling graphs as PDF files. #910
- The check_disciplines_consistency function checks if two disciplines compute the same output and raises an error or logs a warning message if this is the case. MDOCouplingStructure logs a message at WARNING level if two disciplines compute the same output. #912
- The default value of an input variable of a LinearCombination is zero. #913
- BaseFactory.create supports positional arguments. #918
- The algorithms of a DriverLibrary have a new option "log_problem" (default: True). Set it to False so as not to log the sections related to an optimization problem, namely the problem definition, the optimization result and the evolution of the objective value. This can be useful when a DOEScenario is used as a pure sampling scenario (see the sketch after this list). #925
- SensitivityAnalysis.plot_bar and SensitivityAnalysis.plot_radar have new arguments sort and sorting_output to sort the uncertain variables by decreasing order of the sensitivity indices associated with a sorting output variable. DatasetPlot has a new argument xtick_rotation to set the rotation angle of the x-ticks for better readability when the number of ticks is large. #930
- SensitivityAnalysis.to_dataset stores the second-order Sobol' indices in the dictionary Dataset.misc with the key "second". #936
- The string representation of a ComposedDistribution uses both the string representations of the marginals and the string representation of the copula. The string representation of a Distribution uses both the string representation of its parameters and its dimension when the latter is greater than 1. #937
- The default value of the argument outputs of the methods plot_bar and plot_radar of SensitivityAnalysis is (). In this case, the SensitivityAnalysis uses all the outputs. #941
- N2HTML can use any sized default inputs (NumPy arrays, lists, tuples, ...) to deduce the size of the input variables. #945
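A minimal sketch of turning off the optimization-problem sections of the log when a DOEScenario is used purely for sampling, assuming the 5.x dictionary-based execute API with nested algo_options; the scenario, algorithm and sample count are placeholders:

```python
# Hypothetical DOEScenario named scenario; "log_problem" is the option named above.
scenario.execute({
    "algo": "lhs",
    "n_samples": 30,
    "algo_options": {"log_problem": False},
})
```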
Fixed#
- Fix the MDA residual scaling strategy based on sub-residual norms. #957
- An XDSM can now take into account several levels of nested scenarios as well as nested MDAs. An XDSM with a nested scenario can also take into account more complex formulations than DisciplinaryOpt, such as MDF. #687
- The properties of the JSONGrammar created by BaseFactory.get_options_grammar are no longer required. #772
- If time_vector is not user-specified, then it is generated by the solver. As such, the array generated by the solver belongs in the ODEResult. #778
- Fix the plot at the end of the Van der Pol tutorial illustrating an ODEProblem. #806
- The high-level function create_dataset returns a base Dataset by default. #823
- SurrogateDiscipline.__str__ is less verbose by inheriting from MDODiscipline; use SurrogateDiscipline.__repr__ instead of the older SurrogateDiscipline.__repr__. #837
- OptHistoryView can be executed with variable_names=None to explicitly display all the design variables. The variable names specified with the argument variable_names of OptHistoryView are correctly considered. OptimizationProblem.from_hdf sets pb_type and differentiation_method as strings. OptHistoryView, ObjConstrHist and ConstraintsHistory display a limited number of iterations on the x-axis to make it more readable by avoiding xtick overlay. DesignSpace has a new property names_to_indices defining the design vector indices associated with variable names. #838
- execute_post can post-process a Path. #846
- The MDA chain can change at once the max_mda_iter of all its MDAs. The behaviour of the max_mda_iter of this class has been changed to do so. #848
- The to_dataset methods build Dataset objects in one go instead of adding variables one by one. #852
- CorrelationAnalysis and SobolAnalysis use the input names in the order provided by the ParameterSpace. #853
- The RunFolderManager can now work with a non-empty output_folder_basepath when using folders_iter = FoldersIter.NUMBERED. Their name can be different from a number. #865
- The argument output_names of MorrisAnalysis works properly again. #866
- The argument n_samples passed to MorrisAnalysis is correctly taken into account. #869
- DOELibrary works when the design variables have no default value. #870
- The generation of XDSM diagrams for MDAs looping over MDOScenarios. #879
- BarPlot now handles correctly a Dataset whose number of rows is higher than the number of variables. #880
- The DOE algorithms consider the optional seed when it is equal to 0 and use the driver's one when it is missing. #886
- PCERegressor now handles multidimensional random input variables. #895
- get_all_inputs and get_all_outputs return sorted names and so are now deterministic. #901
- OptHistoryView no longer logs a warning when post-processing an optimization problem whose objective gradient history is empty. #902
- The string representation of an MDOFunction is now correct even after several sign changes. #917
- The sampling phase of a SensitivityAnalysis no longer reproduces the full log of the DOEScenario. Only the disciplines, the MDO formulation and the progress bar are considered. #925
- The Correlations plot now labels its subplots correctly when the constraints of the optimization problem include an offset. #931
- The string representation of a Distribution no longer sorts the parameters. #935
- SobolAnalysis can export the indices to a Dataset, even when the second-order Sobol' indices are computed. #936
- One can no longer add two random variables with the same name in a ParameterSpace. #938
- SensitivityAnalysis.plot_bar and SensitivityAnalysis.plot_radar use all the outputs when the argument outputs is empty (e.g. None, "" or ()). #941
- A DesignSpace containing a design variable without a current value can be used to extend another DesignSpace. #947
- Security vulnerability when calling subprocess.run with shell=True. #948
Changed#
- Distribution: the default value of variable is "x"; same for OTDistribution, SPDistribution and their sub-classes. SPDistribution: the default values of interfaced_distribution and parameters are uniform and {}. OTDistribution: the default values of interfaced_distribution and parameters are Uniform and (). #551
- The high-level function create_dataset raises a ValueError when the file has a wrong extension. #721
- The performance of MDANewtonRaphson was improved. #791
- The KMeans classes use "auto" as default value for the argument n_init of the scikit-learn KMeans class. #825
- output_names was added to MDOFunction.DICT_REPR_ATTR in order for it to be exported when saving to an HDF file. #860
- OptimizationProblem.minimize_objective is now a property that changes the sign of the objective function if needed. #909
- The name of the MDOFunction resulting from the sum (resp. subtraction, multiplication, division) of two MDOFunctions named "f" and "g" is "[f+g]" (resp. "[f-g]", "[f*g]", "[f/g]"). The name of the MDOFunction defined as the opposite of the MDOFunction named "f" is -f. In the expression of an MDOFunction resulting from the multiplication or division of MDOFunctions, the expression of an operand is now grouped with round parentheses if this operand is a sum or subtraction. For example, for "f(x) = 1+x" and "g(x) = x" the resulting expression for f*g is "[f*g](x) = (1+x)*x". The expression of the MDOFunction defined as the opposite of itself is -(expr). #917
- Renamed MLQualityMeasure.evaluate_learn to MLQualityMeasure.compute_learning_measure. Renamed MLQualityMeasure.evaluate_test to MLQualityMeasure.compute_test_measure. Renamed MLQualityMeasure.evaluate_kfolds to MLQualityMeasure.compute_cross_validation_measure. Renamed MLQualityMeasure.evaluate_loo to MLQualityMeasure.compute_leave_one_out_measure. Renamed MLQualityMeasure.evaluate_bootstrap to MLQualityMeasure.compute_bootstrap_measure. #920
- The argument use_shell of the discipline DiscFromExe is no longer taken into account; executables are now always executed without a shell. #948
Version 5.0.1 (2023-09-07)#
Added#
- The MDAJacobi performance and memory usage were improved. #882
Fixed#
- The MDAJacobi executions are now deterministic. The MDAJacobi m2d acceleration is deactivated when the least-squares problem is not well solved. #882
Version 5.0.0 (2023-06-02)#
Main GEMSEO API breaking changes#
- The high-level functions defined in gemseo.api have been moved to gemseo.
- Features have been extracted from GEMSEO and are now available in the form of plugins:
  - gemseo.algos.opt.lib_pdfo has been moved to gemseo-pdfo, a GEMSEO plugin for the PDFO library,
  - gemseo.algos.opt.lib_pseven has been moved to gemseo-pseven, a GEMSEO plugin for the pSeven library,
  - gemseo.wrappers.matlab has been moved to gemseo-matlab, a GEMSEO plugin for MATLAB,
  - gemseo.wrappers.template_grammar_editor has been moved to gemseo-template-editor-gui, a GUI to create input and output file templates for DiscFromExe.
Added#
- PCERegressor has new arguments:
  - use_quadrature to estimate the coefficients by quadrature rule or least-squares regression,
  - use_lars to get a sparse PCE with the LARS algorithm in the case of least-squares regression,
  - use_cleaning and cleaning_options to apply a cleaning strategy removing the non-significant terms,
  - hyperbolic_parameter to truncate the PCE before training.
- The argument scale of PCA allows the data to be scaled before reducing their dimension. #743
- GradientSensitivity plots the positive derivatives in red and the negative ones in blue for easy reading. #725
- TopologyView allows the solution of a 2D topology optimization problem to be visualized. #739
- ConstraintsHistory uses horizontal black dashed lines for tolerance. #664
- Animation is a new OptPostProcessor to generate an animated GIF from an OptPostProcessor. #740
- JSchedulerDisciplineWrapper can submit the execution of disciplines to an HPC job scheduler. #613
- MDODiscipline now has a virtual execution mode; when active, MDODiscipline.execute returns its MDODiscipline.default_outputs, whatever the inputs. #558
- Improve the computation of MDA residuals with the following new strategies:
  - each sub-residual is scaled by the corresponding initial norm,
  - each component is scaled by the corresponding initial component,
  - the Euclidean norm of the component-wise division by the initial residual, scaled by the problem size.
- OTComposedDistribution can consider any copula offered by OpenTURNS. #655
- Scenario.xdsmize returns an XDSM; its XDSM.visualize method displays the XDSM in a web browser; this object also has an HTML view. #564
- Add a new grammar type based on Pydantic: PydanticGrammar. This new grammar is still experimental and subject to changes; use with caution. #436
- XLSStudyParser has a new argument has_scenario whose default value is True; if False, the sheet Scenario is not required. CouplingStudyAnalysis allows an N2 diagram to be generated from an XLS file defining the disciplines in terms of input and output names. MDOStudyAnalysis allows an N2 diagram and an XDSM to be generated from an XLS file defining an MDO problem in terms of disciplines, formulation, objective, constraint and design variables. #696
- JSONGrammar can validate PathLike objects. #759
- Enable sparse matrices in the utils.comparisons module. #779
- The method MDODiscipline._init_jacobian now supports sparse matrices.
- Stopping options "max_time" and "stop_crit_n_x" can now be used with the global optimizers of SciPy ("DIFFERENTIAL_EVOLUTION", "DUAL_ANNEALING" and "SHGO"). #663
- Add the exterior penalty approach to reformulate an OptimizationProblem with constraints into one without constraints. #581
- Documentation: the required parameters of optimization, DOE and linear solver algorithms are documented in dedicated sections. #680
- The MDOLinearFunction expression can be passed as an argument at instantiation. This can be useful for large numbers of inputs or outputs to avoid long computation times for the expression string. #697
- Enable sparse coefficients for MDOLinearFunction. #756
- SobolAnalysis provides SobolAnalysis.output_variances and SobolAnalysis.output_standard_deviations. SobolAnalysis.unscale_indices allows the Sobol' indices to be unscaled using SobolAnalysis.output_variances or SobolAnalysis.output_standard_deviations. SobolAnalysis.plot now displays the variance of the output variable in the title of the graph. #671
- CorrelationAnalysis proposes two new sensitivity methods, namely Kendall rank correlation coefficients (CorrelationAnalysis.kendall) and squared standard regression coefficients (CorrelationAnalysis.ssrc). #654
- The factory for algorithms (BaseAlgoFactory) can cache the algorithm libraries to provide a speedup. #522
- When keep_opt_history=True, the databases of an MDOScenarioAdapter can be exported to HDF5 files. #607
- The argument use_deep_copy has been added to the constructor of the MDOParallelChain class. This controls the use of deepcopy when running MDOParallelChain. By default this is set to False, as a performance improvement has been observed in use cases with a large number of disciplines. The old behaviour of using a deep copy of MDOParallelChain.local_data can be enabled by setting this option to True. This may be necessary in some rare combinations of MDOParallelChain and other disciplines that directly modify the MDODiscipline.input_data. #527
- Added a new RunFolderManager to generate unique run directory names for DiscFromExe, either as successive integers or as UUIDs. #648
- ScenarioAdapter is a Factory of MDOScenarioAdapter. #684
- A new MDOWarmStartedChain allows users to warm start some inputs of the chain with the output values of the previous run. #665
- The method Dataset.to_dict_of_arrays converts a Dataset into a dictionary of NumPy arrays indexed by variable names or group names (see the sketch after this list). #793
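A minimal sketch of the Dataset.to_dict_of_arrays conversion described above; the dataset is assumed to exist already, and by_group is assumed to switch between group-name keys and variable-name keys, as described in this changelog:

```python
def dataset_to_arrays(dataset):
    """Convert a Dataset to dictionaries of NumPy arrays.

    by_group is assumed to control whether the keys are group names or
    variable names; its default value is an assumption.
    """
    arrays_by_group = dataset.to_dict_of_arrays()                    # keys: group names (assumed default)
    arrays_by_variable = dataset.to_dict_of_arrays(by_group=False)   # keys: variable names
    return arrays_by_group, arrays_by_variable
```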
Fixed#
- MinMaxScaler and StandardScaler handle constant data without RuntimeWarning. #719
- The different kinds of OptPostProcessor displaying iteration numbers start counting at 1. #601
- The option fig_size passed to OptPostProcessor.execute is now taken into account. #641
- The subplots of ConstraintsHistory use their own y-limits. #656
- The visualization ParallelCoordinates uses the names of the design variables defined in the DesignSpace instead of default ones. #675
- MDODiscipline.linearize with compute_all_jacobians=False (default value) computes the Jacobians only for the inputs and outputs defined with MDODiscipline.add_differentiated_inputs and MDODiscipline.add_differentiated_outputs, if any; otherwise, it returns an empty dictionary; if compute_all_jacobians=True, it considers all the inputs and outputs. #644
- The bug concerning the linearization of an MDOScenarioAdapter including disciplines that both depend only on MDOScenarioAdapter inputs and are linearized in the MDOScenarioAdapter._run method is solved. Tests concerning this behavior were added. #651
- AutoPyDiscipline can wrap a Python function with multiline return statements. #661
- Modify the computation of total derivatives in the presence of state variables to avoid unnecessary calculations. #686
- Modify the default linear solver calling sequence to prevent the use of the splu function on SciPy LinearOperator objects. #691
- Fix the Jacobian of MDOChain including Splitter disciplines. #764
- Corrected typing issues that caused an exception to be raised when a custom parser was passed to the DiscFromExe at instantiation. #767
- The method MDODiscipline._init_jacobian with fill_missing_key=True now creates the missing keys. #782
- It is now possible to pass a custom name to the XLSDiscipline at instantiation. #788
- get_available_mdas no longer returns the abstract class MDA. #795
- OptimizationProblem.to_dataset uses the order of the design variables given by the ParameterSpace to build the Dataset. #626
- Database.get_complete_history raises a ValueError when asking for a non-existent function. #670
- The DOE algorithm OT_FACTORIAL handles correctly the tuple of parameters (levels, centers); this DOE algorithm does not use n_samples. The DOE algorithm OT_FULLFACT handles correctly the use of n_samples as well as the use of the parameter levels; this DOE algorithm can use either n_samples or levels. #676
- The required properties are now available in the grammars of the DOE algorithms. #680
- The stopping criteria for the objective function variation are only activated if the objective value is stored in the database in the last iterations. #692
- The GradientApproximator and its subclasses no longer include closures preventing serialization. #700
- A constraint aggregation MDOFunction is now capable of dealing with complex ndarray inputs. #716
- Fix OptimizationProblem.is_mono_objective, which returned wrong values when the objective had a single outvars entry but was multidimensional. #734
- Fix the behavior of the DesignSpace.filter_dim method for lists of indices containing more than one index. #746
- SensitivityAnalysis.to_dataset works correctly with several methods and the returned Dataset can be exported to a DataFrame. #640
- OTDistribution can now truncate a probability distribution on both sides. #660
- The method OptProblem.constraint_names is now built on the fly from the constraints. This fixes the issue of updating the constraint names when the constraints are modified, as is the case with the aggregation of constraints. #669
- Factory considers the base class as an available class when it is not abstract. #685
- Serialization of paths in discipline attributes and local_data across operating systems. #711
Changed#
JSONGrammar no longer merges the definition of a property with the dictionary-like update methods. Now the usual behavior of a dictionary is used, such that the definition of a property is overwritten. The previous behavior can be obtained by passing the argument merge=True. #708
CorrelationAnalysis no longer proposes the signed standard regression coefficients (SSRC), as they have been removed from openturns. #654
The Splitter, Concatenater, DensityFilter and MaterialModelInterpolation disciplines use sparse Jacobians. #745
The minimum value of the seed used by a DOE algorithm is 0. #727
Parametric gemseo.problems.scalable.parametric.scalable_problem.ScalableProblem (a usage sketch follows this list):
The configuration of the scalable disciplines is done with ScalableDisciplineSettings.
The method gemseo.problems.scalable.parametric.scalable_problem.ScalableProblem.create_quadratic_programming_problem returns the corresponding quadratic programming (QP) problem as an OptimizationProblem.
The argument alpha (default: 0.5) defines the share of feasible design space.
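As a hedged illustration of the parametric scalable problem entry above: the class, the settings object and the create_quadratic_programming_problem method are named by the entry, while the default constructor call and the no-argument method call are assumptions.

```python
# Hedged sketch: default construction and the no-argument method call are
# assumptions; ScalableDisciplineSettings and alpha can be used to configure
# the disciplines and the share of feasible design space (per the entry above).
from gemseo.problems.scalable.parametric.scalable_problem import ScalableProblem

problem = ScalableProblem()
qp_problem = problem.create_quadratic_programming_problem()
print(qp_problem)  # an OptimizationProblem equivalent to the scalable problem
```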
API changes#
See Upgrading GEMSEO for more information.
Version 4.3.0 (2023-02-09)#
Added#
Statistics.compute_joint_probability computes the joint probability of the components of random variables while Statistics.compute_probability computes their marginal ones. #542
MLErrorMeasure can split the multi-output measures according to the output names. #544
SobolAnalysis.compute_indices has a new argument to change the level of the confidence intervals. #599
MDOInitializationChain can compute the input data for an MDA from incomplete default_inputs of the disciplines. #610
Add a new execution status for disciplines: "STATUS_LINEARIZE" when the discipline is performing the linearization. #612
ConstraintsHistory:
One can add one point per iteration on the blue line (default behavior).
The line style can be changed (dashed line by default).
The types of the constraints are displayed.
The equality constraints are plotted with the OptPostProcessor.eq_cstr_cmap.
Users can now choose whether OptimizationProblem.current_iter should be set to 0 before the execution of an OptimizationProblem by passing the algo option reset_iteration_counters. This is useful to complete the execution of a Scenario from a backup file without exceeding the requested max_iter or n_samples. #636
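A hedged sketch of resuming an execution with the reset_iteration_counters option mentioned just above; only the option name and its purpose come from the entry, while the toy problem, the gemseo.api import path (4.x series) and passing the option through algo_options are assumptions.

```python
# Hedged sketch: only "reset_iteration_counters" comes from the entry above;
# the toy problem and the use of "algo_options" are assumptions.
from gemseo.api import create_design_space, create_discipline, create_scenario

discipline = create_discipline("AnalyticDiscipline", expressions={"y": "(x-1)**2"})
design_space = create_design_space()
design_space.add_variable("x", l_b=-2.0, u_b=2.0, value=0.0)
scenario = create_scenario([discipline], "DisciplinaryOpt", "y", design_space)

# Keep the iteration counter from a previous, backed-up run instead of
# resetting it to 0, so the requested max_iter is not exceeded when resuming.
scenario.execute(
    {"algo": "SLSQP", "max_iter": 20,
     "algo_options": {"reset_iteration_counters": False}}
)
```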
Fixed#
HDF5Cache.hdf_node_name returns the name of the node of the HDF file in which the data are cached. #583
The histories of the objective and constraints generated by OptHistoryView no longer return an extra iteration. #591
The histories of the constraints and diagonal of the Hessian matrix generated by OptHistoryView use the scientific notation. #592
ObjConstrHist correctly manages the objectives to maximize. #594
Statistics.n_variables no longer corresponds to the number of variables in the Statistics.dataset but to the number of variables considered by Statistics. ParametricStatistics correctly handles variables with dimension greater than one. ParametricStatistics.compute_a_value uses 0.99 as coverage level and 0.95 as confidence level. #597
The input data provided to the discipline by a DOE did not match the type defined in the design space. #606
The cache of a self-coupled discipline cannot be exported to a dataset. #608
The ConstraintsHistory draws the vertical line at the right position when the constraint is satisfied at the final iteration. #616
Fixed remaining time unit inconsistency in progress bar. #617
The attribute fig_size of save_show_figure impacts the figure when show is True. #618
Transformer handles both 1D and 2D arrays. #624
SobolAnalysis no longer depends on the order of the variables in the ParameterSpace. #626
ParametricStatistics.plot_criteria plots the confidence level on the right subplot when the fitting criterion is a statistical test. #627
CorrelationAnalysis.sort_parameters uses the rule "The higher the absolute correlation coefficient the better". #628
Fix the parallel execution and the serialization of the LinearCombination discipline. #638
Fix the parallel execution and the serialization of the ConstraintAggregation discipline. #642
Changed#
Statistics.compute_probability computes one probability per component of the variables. #542
The history of the diagonal of the Hessian matrix generated by OptHistoryView displays the names of the design variables on the y-axis. #595
QuadApprox now displays the names of the design variables. #596
The methods SensitivityAnalysis.plot_bar and SensitivityAnalysis.plot_comparison of SensitivityAnalysis use two decimal places by default for better readability. #603
BarPlot, SobolAnalysis.plot and MorrisAnalysis.plot use a grid for better readability. #604
Dataset.export_to_dataframe can either sort the columns by group, name and component, or only by group and component. #622
OptimizationProblem.export_to_dataset uses the order of the design variables given by the ParameterSpace to build the Dataset. #626
Version 4.2.0 (2022-12-22)#
Added#
Add a new property to MatlabDiscipline in order to get access to the MatlabEngine instance attribute. #536
Independent MDA instances in an MDAChain can be run in parallel. #587
The MDAChain now has an option to run the independent branches of the process in parallel.
The Ishigami use case to illustrate and benchmark UQ techniques (IshigamiFunction, IshigamiSpace, IshigamiProblem and IshigamiDiscipline). #517
An MDODiscipline can now be composed of MDODiscipline.disciplines. #520
SobolAnalysis can compute the SobolAnalysis.second_order_indices. SobolAnalysis uses asymptotic distributions by default to compute the confidence intervals. #524
PCERegressor has a new attribute PCERegressor.second_sobol_indices. #525
The DistributionFactory has two new methods: DistributionFactory.create_marginal_distribution and DistributionFactory.create_composed_distribution. #526
SobieskiProblem has a new attribute USE_ORIGINAL_DESIGN_VARIABLES_ORDER to order the design variables of the SobieskiProblem.design_space according to their original order ("x_shared", "x_1", "x_2" and "x_3") rather than the GEMSEO one ("x_1", "x_2", "x_3" and "x_shared"), as SobieskiProblem and SobieskiBase are based on this original order. #550
Fixed#
Fix the XDSM workflow of a sequential sequence within a parallel sequence. #586
Factory no longer considers abstract classes. #280
When DOELibrary.execute is called twice with different DOEs, the functions attached to the OptimizationProblem are correctly sampled during the second execution and the results are correctly stored in the Database. #435
A ParameterSpace prevents the mixing of probability distributions coming from different libraries. #495
MinMaxScaler and StandardScaler can now deal with constant variables. #512
The options use_database, round_ints and normalized_design_space passed to DriverLib.execute are no longer ignored. #537
OptimizationProblem casts the complex numbers to real when exporting its OptimizationProblem.database to a Dataset. #546
PCERegressor computes the Sobol' indices for all the output dimensions. #557
Fixed a bug in HDF5FileSingleton that caused the HDF5Cache to crash when writing data that included arrays of strings. #559
OptProblem.get_violation_criteria is inf for constraints with NaN values. #561
Fixed a bug in the iterations progress bar that displayed inconsistent objective function and duration values. #562
NormFunction and NormDBFunction now use the MDOFunction.special_repr of the original MDOFunction. #568
DOEScenario and MDOScenario can be serialized after an execution. Added the missing _ATTR_TO_SERIALIZE to MDOChain and MDOScenarioAdapter. #578
Changed#
Since version 4.1.0, when using a DOE, an integer variable passed to a discipline is cast to a floating point number. The previous behavior will be restored in version 4.2.1.
The batches requested by pSeven are evaluated in parallel. #207
The LagrangeMultipliers of a non-solved OptimizationProblem can be approximated. The errors raised by LagrangeMultipliers are now raised by PostOptimalAnalysis. #372
The Jacobian computation in MDOChain now uses the minimal Jacobians of the disciplines instead of the force_all option of the disciplines linearization. #531
The Jacobian computation in MDA now uses the minimal Jacobians of the disciplines instead of all the couplings for the disciplines linearization. #483
Scenario.set_differentiation_method now automatically casts all float default inputs of the disciplines in its formulation to complex when using OptimizationProblem.COMPLEX_STEP and setting the option cast_default_inputs_to_complex to True. Scenario.set_differentiation_method also automatically casts the current value of the DesignSpace to complex when using OptimizationProblem.COMPLEX_STEP. MDODiscipline.disciplines is now a property that returns the protected attribute MDODiscipline._disciplines. #520
The methods MDODiscipline.add_differentiated_inputs and MDODiscipline.add_differentiated_outputs now ignore inputs or outputs that are not numeric. #548
MLQualityMeasure uses True as the default value for fit_transformers, which means that the Transformer instances attached to the assessed MLAlgo are re-trained on each training subset of the cross-validation partition. MLQualityMeasure.evaluate_kfolds uses True as the default value for randomize, which means that the learning samples attached to the assessed MLAlgo are shuffled before building the cross-validation partition. #553
Version 4.1.0 (2022-10-25)#
Added#
MakeFunction has a new optional argument names_to_sizes defining the sizes of the input variables. #252
DesignSpace.initialize_missing_current_values sets the missing current design values to default ones. OptimizationLibrary initializes the missing design values to default ones before execution. #299
Boxplot is a new DatasetPlot to create boxplots from a Dataset. #320
Scenario offers a keyword argument maximize_objective, previously passed implicitly with **formulation_options. #350
A stopping criterion based on the KKT condition residual can now be used for all gradient-based solvers. #372
The static N2 chart represents the self-coupled disciplines with blue diagonal blocks. The dynamic N2 chart represents the self-coupled disciplines with colored diagonal blocks. #396
SimpleCache can be exported to a Dataset. #404
A warning message is logged when an attempt is made to add an observable twice to an OptimizationProblem and the addition is cancelled. #409
A SensitivityAnalysis can be saved on the disk (use SensitivityAnalysis.save and SensitivityAnalysis.load). A SensitivityAnalysis can be loaded from the disk with the function load_sensitivity_analysis. #417
The PCERegressor has new properties related to the PCE output, namely its PCERegressor.mean, PCERegressor.covariance, PCERegressor.variance and PCERegressor.standard_deviation. #428
Timer can be used as a context manager to measure the time spent within a with statement. #431
Computation of the KKT criteria is made optional. #440
Bi-level processes now store the local optimization history of sub-scenarios in ScenarioAdapters. #441
pretty_str converts an object into a readable string by using str. #442
The functions create_linear_approximation and create_quadratic_approximation compute the first- and second-order Taylor polynomials of an MDOFunction. #451
The KKT norm is added to the database when computed. #457
MDAs now output the norm of residuals at the end of their execution. #460
pretty_str and pretty_repr sort the elements of collections by default. #469
The module gemseo.algos.doe.quality offers features to assess the quality of a DOE (a usage sketch follows this list):
DOEQuality assesses the quality of a DOE from DOEMeasures; the qualities can be compared with logical operators.
compute_phip_criterion computes the φ_p space-filling criterion.
compute_mindist_criterion computes the minimum-distance space-filling criterion.
compute_discrepancy computes different discrepancy criteria.
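A hedged sketch of the DOE quality helpers listed above; the module path and function name come from the entry, while the argument type (a NumPy array of samples in the unit hypercube) is an assumption about the signature.

```python
# Hedged sketch: module and function names come from the entry above; passing a
# plain array of unit-hypercube samples is an assumption about the signature.
from numpy import array

from gemseo.algos.doe.quality import compute_phip_criterion

samples = array([[0.1, 0.9], [0.5, 0.5], [0.9, 0.1], [0.3, 0.7]])
print(compute_phip_criterion(samples))  # φ_p space-filling criterion
```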
Fixed#
The NLOPT_COBYLA and NLOPT_BOBYQA algorithms could end prematurely in the simplex construction phase because of a non-exposed and too small default value of the stop_crit_n_x algorithm option. #307
The MDANewton MDA no longer has a Jacobi step interleaved between the Newton steps. #400
The AnalyticDiscipline.default_inputs no longer share the same NumPy array. #406
The Lagrange Multipliers computation is fixed for design points close to local optima. #408
gemseo-template-grammar-editor now works with both pyside6 and pyside2. #410
DesignSpace.read_from_txt can read a CSV file with a current value set at None. #411
The argument message passed to DriverLib.init_iter_observer and defining the iteration prefix of the ProgressBar works again; its default value is "...". #416
The signatures of MorrisAnalysis, CorrelationAnalysis and SobolAnalysis are now consistent with SensitivityAnalysis. #424
When using a unique process, the observables can now be evaluated as many times as the number of calls to DOELibrary.execute. #425
The DOELibrary.seed of the DOELibrary is used by default and increments at each execution; pass the integer option seed to DOELibrary.execute to use another one for that execution (a usage sketch follows this list). #426
DesignSpace.get_current_value correctly handles the order of the variable_names in the case of NumPy array outputs. #433
The SimpleCache no longer fails when caching an output that is not a NumPy array. #444
The first iteration of an MDA was not shown in red by MDA.plot_residual_history. #455
The self-organizing map post-processing (SOM), broken by a regression, has been fixed. #465
The coupling variables order, used in the MDA class for the adjoint matrix assembly, was not deterministic. #472
A multidisciplinary system with a self-coupled discipline can be represented correctly by a coupling graph. #506
Changed#
The LoggingContext uses the root logger as the default value of logger. #421
The GradientSensitivity post-processor now includes an option to compute the gradients at the selected iteration to avoid a crash if they are missing. #434
pretty_repr converts an object into an unambiguous string by using repr; use pretty_str for a readable string (a usage sketch follows this list). #442
A global multiprocessing manager is now used; this improves the performance of multiprocessing on Windows platforms. #445
The graphs produced by OptHistoryView use the same OptHistoryView.xlabel. #449
Database.notify_store_listener takes a design vector as input; when not provided, the design vector of the last iteration is used. The KKT criterion is computed at each new storage when the KKT tolerances are provided. #457
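A hedged sketch contrasting pretty_str and pretty_repr (#442); the import path gemseo.utils.string_tools is an assumption, only the function names and their str/repr behaviour come from the entries above.

```python
# Hedged sketch: the import path is an assumption; only the function names and
# their str/repr behaviour come from the entries above.
from gemseo.utils.string_tools import pretty_repr, pretty_str

data = {"b": 2.0, "a": 1.0}
print(pretty_str(data))   # readable form based on str, elements sorted by default
print(pretty_repr(data))  # unambiguous form based on repr, elements sorted by default
```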
Version 4.0.1 (2022-08-04)#
Fixed#
The MDANewton MDA no longer has a Jacobi step interleaved between the Newton steps. #400
The AnalyticDiscipline.default_inputs no longer share the same NumPy array. #406
The Lagrange Multipliers computation is fixed for design points close to local optima. #408
gemseo-template-grammar-editor now works with both pyside6 and pyside2. #410
Version 4.0.0 (2022-07-28)#
Added#
Concatenater can now scale the inputs before concatenating them. LinearCombination is a new discipline computing the weighted sum of its inputs. Splitter is a new discipline whose outputs are subsets of its unique input. #316
The transform module in machine learning now features two power transforms: BoxCox and YeoJohnson. #341
An MDODiscipline can now use a pandas DataFrame via its MDODiscipline.local_data. #58
Grammars can add namespaces to prefix the element names. #70
Disciplines and functions, with tests, for the resolution of 2D topology optimization problems by the SIMP approach were added in gemseo.problems.topo_opt. In the documentation, 3 examples covering the L-Shape, Short Cantilever and MBB structures are also added. #128
A TransformerFactory. #154
The gemseo.post.radar_chart.RadarChart post-processor plots the constraints at the optimum by default and provides access to the database elements from either the first or last index. #159
OptimizationResult can store the optimum index. #161
An OptimizationProblem can be reset either fully or partially (database, current iteration, current design point, number of function calls or functions preprocessing). Database.clear can reset the iteration counter. #188
The Database attached to a Scenario can be cleared before running the driver. #193
The variables of a DesignSpace can be renamed. #204
The optimization history can be exported to a Dataset from a Scenario. #209
A DatasetPlot can associate labels to the handled variables for a more meaningful display. #212
The bounds of the parameter length scales of a GaussianProcessRegressor can be defined at instantiation. #228
Observables are included in the exported HDF file. #230
ScatterMatrix can plot a limited number of variables. #236
Sobieski's SSBJ use case can now be used with physical variable names. #242
The coupled adjoint can now account for disciplines with state residuals. #245
Randomized cross-validation can now use a seed for the sake of reproducibility. #246
The DriverLib now checks if the optimization or DOE algorithm handles integer variables. #247
An MDODiscipline can automatically detect JSON grammar files from a user directory. #253
Statistics can now estimate a margin. #255
Observables can now be derived when the driver option eval_obs_jac is True (default: False). #256
ZvsXY can add series of points above the surface. #259
The number and positions of the levels of a ZvsXY or Surfaces can be changed. #262
ZvsXY or Surfaces can use either isolines or filled surfaces. #263
An MDOFunction can now be divided by another MDOFunction or a number (a usage sketch follows this list). #267
An MLAlgo cannot fit the transformers during the learning stage. #273
The KLSVD wrapped from OpenTURNS can now use the stochastic algorithms. #274
The lower or upper half of the ScatterMatrix can be hidden. #301
A Scenario can use a standardized objective in logs and OptimizationResult. #306
Statistics can compute the coefficient of variation. #325
Lines can use an abscissa variable and markers. #328
The user can now define an OTDiracDistribution with OpenTURNS. #329
It is now possible to select the number of processes on which to run an IDF formulation using the option n_processes. #369
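A hedged sketch of dividing an MDOFunction by a number or by another MDOFunction (#267, see the item above); the 4.x import path and the evaluate() call are assumptions.

```python
# Hedged sketch for #267: the import path (4.x layout) and the evaluate() call
# are assumptions; only the division capability comes from the entry.
from numpy import array

from gemseo.core.mdofunctions.mdo_function import MDOFunction

f = MDOFunction(lambda x: x**2, "f")
g = MDOFunction(lambda x: x + 1.0, "g")

half_f = f / 2.0  # a new MDOFunction computing f(x) / 2
ratio = f / g     # a new MDOFunction computing f(x) / g(x)
print(half_f.evaluate(array([3.0])), ratio.evaluate(array([3.0])))
```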
Fixed#
Ensure that a nested MDAChain is not detected as a self-coupled discipline. #138
The method MDOCouplingStructure.plot_n2_chart no longer crashes when the provided disciplines have no couplings. #174
The broken link to the GEMSEO logo used in the D3.js-based N2 chart is now repaired. #184
An XLSDiscipline no longer crashes when called using multi-threading. #186
The option mutation of the "DIFFERENTIAL_EVOLUTION" algorithm now checks the correct expected type. #191
SensitivityAnalysis can plot a field with an output name longer than one character. #194
Fixed a typo in the monitoring section of the documentation referring to the function create_gantt_chart as create_gannt. #196
DOELibrary untransforms unit samples properly in the case of random variables. #197
The string representations of the functions of an OptimizationProblem imported from an HDF file no longer have bytes problems. #201
Fix normalization/unnormalization of functions and disciplines that only contain integer variables. #219
Factory.get_options_grammar provides the same content in the returned grammar and the dumped one. #220
Dataset uses pandas to read CSV files more efficiently. #221
Missing function and gradient values are now replaced with numpy.NaN when exporting a Database to a Dataset. #223
The method OptimizationProblem.get_data_by_names no longer crashes when both as_dict and filter_feasible are set to True. #226
MorrisAnalysis can again handle multidimensional outputs. #237
The XLSDiscipline test run no longer leaves zombie processes in the background after the execution is finished. #238
An MDAJacobi inside a DOEScenario no longer causes a crash when a sample raises a ValueError. #239
An AnalyticDiscipline with an absolute value can now be derived. #240
The function hash_data_dict returns deterministic hash values, fixing a bug introduced in GEMSEO 3.2.1. #251
LagrangeMultipliers are ensured to be non-negative. #261
An MLQualityMeasure can now be applied to an MLAlgo built from a subset of the input names. #265
The given value in DesignSpace.add_variable is now cast to the proper var_type. #278
The DisciplineJacApprox.compute_approx_jac method now returns the correct Jacobian when filtering by indices. With this fix, the MDODiscipline.check_jacobian method no longer crashes when using indices. #308
An integer design variable can be added with a lower or upper bound explicitly defined as +/-inf. #311
A PCERegressor can now be deepcopied before or after the training stage. #340
A DOEScenario can now be serialized. #358
An AnalyticDiscipline can now be serialized. #359
N2JSON now works when a coupling variable has no default value, and displays "n/a" as the variable dimension. N2JSON now works when the default value of a coupling variable is an unsized object, e.g. array(1). #388
The observables are now computed in parallel when executing a DOEScenario using more than one process. #391
Changed#
Fixed Lagrange Multipliers computation for equality active constraints. #345
The normalize argument of OptimizationProblem.preprocess_functions is now named is_function_input_normalized. #22
The gemseo.post.radar_chart.RadarChart post-processor uses all the constraints by default. #159
Updating a dictionary of NumPy arrays from a complex array no longer converts the complex numbers to the original data type except if required. #177
The D3.js-based N2 chart can now display the GEMSEO logo offline. #184
The default number of components used by a DimensionReduction transformer is based on data and depends on the related technique. #244
Classes deriving from MDODiscipline inherit the input and output grammar files of their first parent. #258
The parameters of a DatasetPlot are now passed at instantiation. #260
An MLQualityMeasure no longer trains an MLAlgo that is already trained. #264
Accessing a unique entry of a Dataset no longer returns 2D arrays but 1D arrays. Accessing a unique feature of a Dataset no longer returns a dictionary of arrays but an array. #270
MLQualityMeasure no longer refits the transformers with cross-validation and bootstrap techniques. #273
Improved the way xlwings objects are handled when an XLSDiscipline runs in multiprocessing, multithreading, or both. #276
A CustomDOE can be used without specifying algo_name, whose default value is now "CustomDOE". #282
The XLSDiscipline no longer copies the original Excel file when both copy_xls_at_setstate and recreate_book_at_run are set to True. #287
The post-processing algorithms plotting the objective function can now use the standardized objective when OptimizationProblem.use_standardized_objective is True. When post-processing a Scenario, the name of a constraint passed to the OptPostProcessor should be the value of constraint_name passed to Scenario.add_constraint or the value of output_name if None. #302
An MDOFormulation now shows an INFO-level message when a variable is removed from the design space because it is not an input for any discipline in the formulation. #304
It is now possible to carry out a SensitivityAnalysis with multiple disciplines. #310
The classes of the regression algorithms are renamed as {Prefix}Regressor. #322
The constructor of AutoPyDiscipline now allows the user to select a custom name instead of the name of the Python function. #339
It is now possible to serialize an MDOFunction (a usage sketch follows this list). #342
All MDA algos now count their iterations starting from 0. The MDA.residual_history is now a list of normed residuals. The argument figsize in plot_residual_history was renamed to fig_size to be consistent with the other OptPostProcessor algos. #343
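A hedged sketch for the MDOFunction serialization entry (#342) above; the entry only says serialization is possible, so the use of the standard pickle module and the 4.x import path are assumptions.

```python
# Hedged sketch for #342: pickling an MDOFunction; using pickle and the 4.x
# import path are assumptions, only the serializability comes from the entry.
import pickle

from numpy import array

from gemseo.core.mdofunctions.mdo_function import MDOFunction


def my_func(x):
    return x.sum()


f = MDOFunction(my_func, "f")
restored = pickle.loads(pickle.dumps(f))
print(restored.evaluate(array([1.0, 2.0])))
```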
API Changes#
See Upgrading GEMSEO for more information.
Version 3.2.2 (March 2022)#
Fixed#
Cache may not be used because of the way data was hashed.
Version 3.2.1 (November 2021)#
Fixed#
Missing package dependency declaration.
Version 3.2.0 (November 2021)#
Added#
The matrix linear problem solvers libraries are now handled by a Factory and can then be extended by plugins.
MDA warns if it stops when reaching max_mda_iter but before reaching the tolerance criteria.
The convergence of an MDA can be logged.
Add max line search steps option in scipy L-BFGS-B
An analytical Jacobian can be checked for subsets of input and output names and components.
An analytical Jacobian can be checked from a reference file.
SciPy global algorithms SHGO and differential evolution now handle nonlinear constraints.
It is now possible to get the number of constraints not satisfied by a design in an OptimizationProblem.
The names of the scalar constraints in an OptimizationProblem can be retrieved as a list.
The dimensions of the outputs for functions in an OptimizationProblem are now available as a dictionary.
The cross-validation technique can now randomize the samples before dividing them in folds.
The Scatter Plot Matrix post processor now allows the user to filter non-feasible points.
OptPostProcessor can change the size of the figures with the method execute().
SensitivityAnalysis can plot indices with values standardized in [0,1].
MorrisAnalysis provides new indices: minimum, maximum and relative standard deviation.
MorrisAnalysis can compute indices normalized with the empirical output bounds.
A button to change the tagged version of GEMSEO is available on the documentation hosted by Read the Docs.
The documentation now includes a link to the gemseo-scilab plugin.
ParetoFront: an example of a BiLevel scenario to compute the Pareto front has been added to the examples.
A Pareto front computation example using a bi-level scenario has been added to the documentation.
The documentation now includes hints on how to use the add_observable method.
It is now possible to execute DOEScenarios in parallel on Windows. This feature does not support the use of MemoryFullCache or HDF5Cache on Windows. The progress bar may show duplicated instances during the initialization of each subprocess, in some cases it may also print the conclusion of an iteration ahead of another one that was concluded first. This is a consequence of the pickling process and does not affect the computations of the scenario.
A ParameterSpace can be cast into a DesignSpace.
Plugins can be discovered via setuptools entry points.
A dumped MDODiscipline can now be loaded with the API function import_discipline() (a usage sketch follows this list).
Database has a name used by OptimizationProblem to name the Dataset; this is the name of the corresponding Scenario if any.
The grammar type can be passed to the sub-processes through the formulations.
Scenario, MDOScenario and DOEScenario now include the argument grammar_type.
A GrammarFactory used by MDODiscipline allows to plug new grammars for data checking.
The coupling structure can be directly passed to an MDA.
The name of an MDOScenarioAdapter can be defined at creation.
The AbstractFullCache built from a Dataset has the same name as the dataset.
The HDF5 file generated by HDF5Cache now has a version number.
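A hedged sketch of dumping and re-loading a discipline with import_discipline (mentioned above); only the function name comes from the entry, while the gemseo.api import path and the serialize method used for dumping are assumptions for the 3.x series.

```python
# Hedged sketch: only import_discipline is named by the entry; the gemseo.api
# import path and MDODiscipline.serialize used for dumping are assumptions.
from gemseo.api import create_discipline, import_discipline

discipline = create_discipline("SellarSystem")
discipline.serialize("sellar_system.pkl")

same_discipline = import_discipline("sellar_system.pkl")
```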
Changed#
The IO grammar files of a scenario are located in the same directory as its class.
Distribution, ParameterSpace and OpenTURNS use now the logger mainly at debug level.
The grammar types "JSON" and "Simple" are replaced by the class names "JSONGrammar" and "SimpleGrammar".
RadarChart uses the scientific notation as default format for the grid levels and allows to change the discretization of the grid.
Fixed#
Make OpenTURNS- and pyDOE-based full factorial DOEs work whatever the dimension and number of samples.
The NLopt library wrapper now handles user functions that return ndarrays properly.
Fix bilevel formulation: the strong couplings were used instead of all the couplings when computing the inputs and outputs of the sub-scenarios adapters. Please note that this bug had an impact on execution performance, but had no adverse effect on the bilevel calculations in previous builds.
Bug with the 'sample_x' parameter of the pSeven wrapper.
An OptimizationProblem can now normalize and unnormalize gradient with uncertain variables.
A SurrogateDiscipline can now be instantiated from an MLAlgo saved without its learning set.
Bug with the 'measure_options' arguments of MLAlgoAssessor and MLAlgoSelection.
The constraints names are now correctly formed with the minus sign and offset value if any.
DesignSpace no longer logs an erroneous warning when unnormalizing an unbounded variable.
Resampling-based MLQualityMeasure no longer re-train the original ML model, but a copy.
The computation of a diagonal DOE out of a design space does not crash anymore.
OptimizationProblem no longer logs a warning when using the finite-difference method on the design boundary.
OpenTURNS options are processed correctly when computing a DOE out of a design space.
The Correlations post-processor now sorts labels properly when two or more functions share the same name followed by an underscore.
The ParetoFront post-processor now shows the correct labels in the plot axis.
The Gantt Chart, Basic History, Constraints History and Scatter Plot Matrix pages in the documentation now render the example plots correctly.
Post-processings based on SymLogNorm (matplotlib) now work with Python 3.6.
OptHistoryView no longer raises an exception when the Hessian diagonal contains NaN and skips the Hessian plot.
Bug with inherited docstrings.
The MDO Scenario example subsections are now correctly named.
The data hashing strategy used by HDF5Cache has been corrected, old cache files shall have to be converted, see the FAQ.
Fix levels option for Full-Factorial doe: now this option is taken into account and enables to build an anisotropic sample.
Bug with the MATLAB discipline on Windows.
The SurrogateDiscipline can now be serialized.
The name used to export an OptimizationProblem to a Dataset is no longer mandatory.
Bug in the print_configuration method, the configuration table is now shown properly.
Bug with integer elements cast into
The image comparison tests in post/dataset no longer leave the generated files when completed.
Typo in the function name get_scenario_differenciation.
ImportError (backport.unittest_mock) on Python 2.7.
Backward compatibility with the legacy logger named "GEMSEO".
DOE algorithms now have their own JSON grammar files which corrects the documentation of their options.
DOEScenario no longer passes a default number of samples to a DOELibrary for which it is not an option.
Issues when a python module prefixed with
gemseo_
is in the current working directory.DesignSpace can now be iterated correctly.
The Jacobian approximated by the finite-difference method is now correct when computed with respect to uncertain variables.
The standard deviation predicted by GaussianProcessRegression is now correctly shaped.
The input data to be stored in an HDF5Cache are now hashed with their input names.
The hashing strategy used by HDF5Cache no longer considers only the values of the dictionary but also the keys.
Version 3.1.0 (July 2021)#
Changed#
Faster JSON schema and dependency graph creation.
The Gradient Sensitivity post processor is now able to scale gradients.
MemoryFullCache can now use standard memory as well as shared memory.
Sellar1 and Sellar2 compute y_1 and y_2 respectively, for consistency of naming.
Improve checks of MDA structure.
IDF: add option to start at equilibrium with an MDA.
Improve doc of GEMSEO study.
Unified drivers stop criteria computed by GEMSEO (xtol_rel, xtol_abs, ftol_rel, ftol_abs).
SimpleGrammars supported for all processes (MDOChain, MDAChain etc.).
JSONGrammar can be converted to SimpleGrammar.
DiscFromExe can now run executables without using the shell.
It is now possible to add observable variables to the scenario class.
ParetoFront post-processing improvements: legends have been added, it is now possible to hide the non-feasible points in the plots.
The Gradient Sensitivity, Variable Influence and Correlations post processors now show variables names instead of hard-coded names.
The Correlations post processor now allows the user to select a subset of functions to plot.
The Correlations post processor now allows the user to select the figure size.
Documentation improvements.
Added#
Support for Python 3.9.
Support for fastjsonschema up to 2.15.1.
Support for h5py up to 3.2.1.
Support for numpy up to 1.20.3.
Support for pyxdsm up to 2.2.0.
Support for scipy to 1.6.3.
Support for tqdm up to 4.61.0.
Support for xdsmjs up to 1.0.1.
Support for openturns up to 1.16.
Support for pandas up to 1.2.4.
Support for scikit-learn up to 0.24.2.
Support for openpyxl up to 3.0.7.
Support for nlopt up to 2.7.0.
Constraint aggregation methods (KS, IKS, max, sum).
N2: an interactive web N2 chart allowing to expand or collapse the groups of strongly coupled disciplines.
Uncertainty: user interface for easy access.
Sensitivity analysis: an abstract class with sorting, plotting and comparison methods, with a dedicated factory and new features (correlation coefficients and Morris indices).
Sensitivity analysis: examples.
Concatenater: a new discipline to concatenate inputs variables into a single one.
Gantt chart generation to visualize the disciplines execution time.
Support pSeven algorithms for single-objective optimization.
DOELibrary.compute_doe computes a DOE based on a design space.
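A hedged sketch of DOELibrary.compute_doe mentioned just above; only the method name comes from the entry, while the DOEFactory creation call, the "lhs" algorithm name and the positional sample size are assumptions.

```python
# Hedged sketch: only DOELibrary.compute_doe is named by the entry above; the
# factory call, the "lhs" algorithm name and the positional size are assumptions.
from gemseo.algos.design_space import DesignSpace
from gemseo.algos.doe.doe_factory import DOEFactory

design_space = DesignSpace()
design_space.add_variable("x", 2, l_b=0.0, u_b=1.0)

library = DOEFactory().create("lhs")
samples = library.compute_doe(design_space, 8)  # an (8, 2) array in the design space
print(samples)
```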
Fixed#
The greatest value that OT_LHSC can generate is now 1 rather than 0.5.
Internally used HDF5 file left open.
The Scatter Plot Matrix post processor now plots the correct values for a subset of variables or functions.
MDA Jacobian fixes in specific cases (self-coupled, no strong couplings, etc).
Strong coupling definition.
Bi-level formulation implementation, following the modification of the strong coupling definition.
Graphviz package is no longer mandatory.
XDSM pdf generation bug.
DiscFromExe tests do not fail anymore under Windows, when using a network directory for the pytest base temporary directory.
No longer need quotation marks on gemseo-study string option values.
The XDSM file is now generated with the right name given with outfilename.
SellarSystem now works in the Sphinx-Gallery documentation (plot_sellar.py).
Version 3.0.3 (May 2021)#
Changed#
Documentation fixes and improvements.
Version 3.0.2 (April 2021)#
Changed#
First open source release!
Fixed#
Dependency version issue for python 3.8 (pyside2).
Version 3.0.1 (April 2021)#
Fixed#
Permission issue with a test.
Robustness of the excel discipline wrapper.
Version 3.0.0 (January 2021)#
Added#
Licenses materials.
Changed#
Renamed gems package to gemseo.
Removed#
OpenOPT backend which is no longer maintained and has features overlap with other backends.
Fixed#
Better error handling of the study CLI with missing latex tools.
Version 2.0.1 (December 2020)#
Fixed#
Improper configuration of the logger in the MDAChain test leading to GEMS crashes if the user does not have write permission on the GEMS installation directory.
Max versions of h5py and Openturns defined in environment and configuration files to prevent incorrect environments due to API incompatibilities.
Max version of numpy defined in order to avoid the occurrence of a fmod/OpenBlas bug with Windows 10 2004 (https://developercommunity.visualstudio.com/content/problem/1207405/fmod-after-an-update-to-windows-2004-is-causing-a.html).
Version 2.0.0 (July 2020)#
Added#
Support for Python3
String encoding: all the strings shall now be encoded in unicode. For Python 2 users, please read carefully the Python2 and Python3 compatibility note to migrate your existing GEMS scripts.
Documentation: gallery of examples and tutorials + cheat sheet
New conda file to automatically create a Python environment for GEMS under Linux, Windows and Mac OS.
~35% improved performance on Python3
pyXDSM to generate latex/PDF XDSM
Display XDSM directly in the browser
Machine learning capabilities based on scikit-learn, OpenTURNS and scipy: clustering, classification, regression, dimension reduction, data scaling, quality measures, algorithm calibration.
Uncertainty package based on OpenTURNS and scipy: distributions, uncertain space, empirical and parametric statistics, Sobol' indices.
AbstractFullCache to cache inputs and outputs in memory
New Dataset class to store data from numpy arrays, file, Database and AbstractFullCache; Unique interface to machine learning and uncertainty algorithms.
Cache post-processing via Dataset
Make a discipline from an executable with a GUI
Excel-based discipline
Prototype an MDO study without writing any code and generate N2 and XDSM diagrams
Automatic finite difference step
Post-optimal analysis to compute the jacobian of MDO scenarios
Pareto front: computation and plot
New scalable problem from Tedford and Martins
New plugin mechanism for extension of features
Changed#
Refactored and much improved documentation
Moved to matplotlib 2.x and 3.x
Support for scipy 1.x
Improved API
Improved linear solvers robustness
Improved surrogate models based on machine learning capabilities and Dataset class.
Improved scalable models
Improved BasicHistory: works for design variables also
Improved XDSM diagrams for MDAChain
Improved BiLevel when no strong coupling is present
Improved overall tests
Fixed#
Bug in GradientSensitivity
Bug in AutoPyDiscipline for multiple returns and non-PEP8 code
Version 1.3.2 (December 2019)#
Fixed#
Bugfix in Discipline while updating data from the cache
Version 1.3.1 (July 2019)#
Added#
COBYLA handles NaN values and backtracks when they occur. Requires a specific modification of COBYLA by IRT
OptHistoryView and BasicHistory handle NaN values
BasicHistory works for design variable values
Changed#
Improved error message when missing property in JSONGrammars
Improved imports to handle multiple versions of sklearn, pandas and sympy (thanks Damien Guenot)
Fixed#
Bug in Caching and Discipline for inouts (Thanks Romain Olivanti)
Bug in MDASequential convergence history
Version 1.3.0 (June 2019)#
Added#
Refactored and much improved documentation
All algorithms, MDAs, Surrogates, formulations options are now automatically documented in the HTML doc
Enhanced API: all MDO scenarios can be fully configured and run from the API
AutoPyDiscipline: faster way to wrap a Python function as a discipline
Surrogate models: Polynomial Chaos from OpenTurns
Surrogate model quality metrics: leave-one-out, Q2, etc.
MDAs can handle self-coupled disciplines (inputs that are also outputs)
Lagrange Multipliers
Multi-starting point optimization as a bi-level scenario using a DOE
New aerostructure toy MDO problem
Changed#
Bi-Level formulation can now handle black box optimization scenarios, and external MDAs
Improve Multiprocessing and multithreading parallelism handling (avoid deadlocks with caches)
Improve performance of input / output data checks, x13 faster JSONGrammars
Improve performance of disciplines execution: avoid memory copies
Enhanced Scalable discipline, DOE is now based on a driver and inputs are read from a HDF5 cache like surrogate models
More readable N2 graph
Improved logging: fix issue with output files
Improved progress bar and adapt units for runtime prediction
NLOPT Cobyla: add control for init step of the DOE (rho)
Surrogate GPR: add options handling
Version 1.2.1 (August 2018)#
Added#
Handle integer variables in DOEs
Changed#
Improve performance of normalization/unnormalization
Improve x_xstar post processing to display the optimum
Fixed#
Issue with using external optimizers in an MDOScenario
Version 1.2.0 (July 2018)#
Added#
New API to ease the scenario creation and use by external platforms
Mixed multithreading / multiprocessing parallelism
Much improved and unified plugin system with factories for Optimizers, DOE, MDAs, Formulations, Disciplines, Surrogates
Surrogate models interfaces
MDAJacobi is now much faster thanks to a new acceleration set of methods
Changed#
HTML documentation
Small improvements
Fixed#
Many bugs
Version 1.1.0 (April 2018)#
Added#
Mix finite differences in the discipline derivation with analytical Jacobians or complex step to compute the chain rule or adjoint method when not all disciplines' analytical derivatives are available
Ability to handle design spaces with integer variables
Analytic discipline based on symbolic calculation to easily create disciplines from analytic formulas
A scalable surrogate approximation of a discipline to benchmark MDO formulations
A HDF cache (= recorder) for disciplines to store all executions on the disk
The P-L-BFGS-B algorithm interface, a variant of LBFGSB with preconditioning coded in Python
Parallel (multiprocessing and/or multithreading) execution of disciplines and/or calls to functions
New constraints plot visualizations (radar chart) and constraints plot with values
Visualization to plot the distance to the best value in log scale ||x-x*||
Possibility to choose to normalize the design space or not for each variable
IDF improved for weakly coupled problems
On the fly backup of the optimization history (HDF5), in "append" mode
We can now monitor the convergence on the fly by creating optimization history plots at each iteration
Famous N2 plot in the CouplingStructure
Sphinx generated documentation in HTML (open doc/index.html), with:
GEMS in a nutshell tutorial
Discipline integration tutorial
Post processing description
GEMS architecture description
MDO formulations description
MDAs
Changed#
Improved the automatic detection of the best point in an optimization history
Improved callback functions during optimization / DOE
Improved stop criteria for optimization
Improved progress bar
Improved LGMRES solver for MDAs when using multiple RHS (recycle Krylov subspaces to accelerate convergence)
Fixed#
Many bugs
Version 1.0.0 (December 2016)#
Added#
Design of Experiment (DOE) capabilities from pyDOE, OpenTURNS or a custom samples set
Full differentiation of the process is available:
analytical gradient based optimization
analytical Newton type coupling solver for MDA (Multi Disciplinary Analyses)
analytical derivation of the chains of disciplines (MDOChain) via the chain rule
Post processing of optimization history: many plots to view the constraints, objective, design variables
More than 10 MDA (coupled problems) solvers available, some gradient-based (quasi-Newton) and hybrid multi-step methods (SequentialMDA)!
OptimizationProblem and its solution can be written to disk and post-processed afterwards
Handling of DOE and optimization algorithm options via JSON schemas
Introduced an OptimizationProblem class that is created by the MDOFormulation and passed to an algorithm for resolution
Serialization mechanism for MDODiscipline and subclasses (write objects to disk)
Intensive testing: 500 tests and 98 % line coverage (excluding third party source)
Improved code coverage by tests from 95% to 98% and all modules have a coverage of at least 95%
Reduced pylint warnings from 800 to 40!
Changed#
Code architecture refactoring for below items
Modularized post processing
Refactored algorithms part with factories
Removed dependency to json_schema_generator library, switched to GENSON (embedded with MIT licence)
Moved from JsonSchema Draft 3 to Draft 4 standard
Refactored the connection between the functions and the optimizers
Refactored MDOScenario
Refactored IDF formulation
Refactored Bilevel formulation
Refactored MDAs and introduced the CouplingStructure class
Refactored the DataProcessor for data interface with workflow engines
Refactored Sobieski use case to improve code quality
Included AGI remarks corrections on code style and best practices
Version 0.1.0 (April 2016)#
Added#
Basic MDO formulations: MDF, IDF, Bilevel formulations
Some optimization history views for convergence monitoring of the algorithm
Optimization algorithms: Scipy, OpenOPT, NLOPT
Possible export of the optimization history to the disk
Complex step and finite differences optimization
Benchmark cases:
Sobieski's Supersonic Business Jet MDO case
Sellar
Propane