Changelog#

All notable changes to this project will be documented here.

The format is based on Keep a Changelog and this project adheres to Semantic Versioning.

Version 6.0.0 (2024-11-08)#

Added#

Optimization & DOE#

  • HessianHistory is an OptPostProcessor to visualize the history of the diagonal of the Hessian matrix of the objective. #463

  • The optimizer MultiStart enables multi-start optimization by combining an optimization algorithm and a DOE algorithm. #645

  • EvaluationProblem, of which OptimizationProblem is a subclass, allows evaluating MDOFunctions over a DesignSpace and storing the evaluations in a Database. #678

  • The drivers can now take the option store_jacobian (default: True). When set to False, the Jacobian matrices are not saved in the database, which reduces the RAM requirements for large optimization problems (see the sketch at the end of this list). #1094

  • The maximum dimension of a design space to be logged can be passed as the optional argument max_design_space_dimension_to_log of BaseDriverLibrary.execute. Hence it can be passed to any DOE library or optimization library as the option max_design_space_dimension_to_log. #1163

  • The MNBI multi-objective algorithm can be used to refine a specific part of the Pareto front. #1171

  • The sample_disciplines function samples a system of disciplines based on an MDO formulation (default: MDF) and DOE settings (formerly in gemseo-mlearning). #1180

  • ProblemFunction is an MDOFunction wrapping another MDOFunction. Its func and jac methods call the corresponding methods of the underlying MDOFunction, pre-processing the input values beforehand and post-processing the output values afterwards. Each function attached to an EvaluationProblem is replaced by a ProblemFunction when calling EvaluationProblem.preprocess_functions. Calls to the function can be counted via the enable_statistics and n_calls attributes. #1222

  • BaseToleranceTester is a new base class to check whether a tolerance criterion is met, with specific subclasses: DesignToleranceTester, ObjectiveToleranceTester and KKTConditionsTester. #1224

  • All the figures generated by OptHistoryView have a vertical red line representing the optimum iteration. #1236

  • Added support for SciPy's version of the COBYQA algorithm. #1261

  • SciPy's algorithms L-BFGS-B, NELDER-MEAD, TNC and SLSQP support the stop_crit_n_x option. #1265

  • AlgorithmDescription and its derived classes now include the class attribute AlgorithmDescription.settings to store algorithm-specific settings. This is especially useful for algorithm libraries that wrap different algorithms in a single class. The following linear solvers have been added to ScipyLinalgAlgos:

    • Conjugate Gradient (CG),

    • Conjugate Gradient Stabilized (CGS),

    • Generalized Conjugate Residual with Optimal Truncation (GCROT),

    • Transpose-Free Quasi-Minimum Residual (TFQMR).

    The ScipyLinprog library now handles integer variables. #1450
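
As an illustration of the store_jacobian driver option mentioned above, here is a minimal sketch. It assumes GEMSEO 6-style high-level functions (create_discipline, create_design_space, create_scenario) and that driver settings are passed as keyword arguments of Scenario.execute; the argument names may differ in other versions.

```python
from gemseo import create_design_space, create_discipline, create_scenario

# Minimal sketch, assuming GEMSEO 6-style signatures; the keyword argument
# names (lower_bound, formulation_name, algo_name, ...) are assumptions.
discipline = create_discipline("AnalyticDiscipline", expressions={"y": "(x-1)**2"})

design_space = create_design_space()
design_space.add_variable("x", lower_bound=-2.0, upper_bound=2.0, value=0.0)

scenario = create_scenario(
    [discipline], "y", design_space, formulation_name="DisciplinaryOpt"
)
# store_jacobian=False: do not keep the Jacobian matrices in the Database,
# which reduces the RAM footprint for large problems (see #1094).
scenario.execute(algo_name="SLSQP", max_iter=20, store_jacobian=False)
```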

MDO processes#

  • The MDA settings can now be passed at instantiation as a Pydantic model or as key/value pairs (see the sketch at the end of this list). The following properties have been added to BaseMDASolver:

    • acceleration_method,

    • over_relaxation_factor.

    #1322

  • The JobSchedulerDisciplineWrapper and the deserialize_and_run entry point implement _compute_jacobian. #1191

  • RemappingDiscipline accepts an empty name mapping, in which case no remapping is applied. #1197

  • BackupSettings is a dataclass to pass the backup settings to functions relying on BaseScenario.set_optimization_history_backup, e.g. sample_disciplines and BaseSensitivityAnalysis.compute_samples. #1204

  • The differentiated_input_names_substitute argument of BaseFormulation, BaseMDOFormulation, MDF, IDF, BiLevel and DisciplinaryOpt defines the names of the differentiated inputs; if empty, the formulation will consider all the inputs. It enables derivatives to be calculated with respect to discipline inputs that are not in the design space attached to the formulation. #1207

  • ArrayBasedFunctionDiscipline wraps a function whose single argument and return value are both NumPy arrays. #1219

  • Added the GEMSEO Web study GUI to the documentation. #1232

  • XDSMizer can be applied to any discipline, and not just the scenarios. This is useful for seeing the dataflow and workflow of a multidisciplinary process, e.g. MDAJacobi.

  • generate_xdsm can generate the XDSM of any discipline. #1257

  • The IDF formulation now accepts options at instantiation for the MDAChain executed at the beginning of the process when start_at_equilibrium=True. #1259

  • BaseMDARoot has a new argument execute_before_linearizing to execute the disciplines before linearizing them. #1278
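
To illustrate the MDA settings entry above, here is a minimal sketch passing the settings as key/value pairs; the discipline expressions are illustrative and the acceleration method value is an assumed spelling.

```python
from gemseo import create_discipline, create_mda

# Two coupled analytic disciplines, purely for illustration.
d1 = create_discipline("AnalyticDiscipline", expressions={"y1": "x + 0.2*y2"})
d2 = create_discipline("AnalyticDiscipline", expressions={"y2": "x - 0.3*y1"})

# MDA settings passed as key/value pairs; a Pydantic settings model could be
# passed instead (see the entry above). "Aitken" is an assumed value of the
# acceleration method enumeration.
mda = create_mda(
    "MDAGaussSeidel",
    [d1, d2],
    acceleration_method="Aitken",
    over_relaxation_factor=0.9,
)
print(mda.acceleration_method, mda.over_relaxation_factor)
```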

Surrogate models#

  • The package gemseo.utils.metrics includes metrics to compare two quantities:

    • The base class is BaseMetric and its factory is MetricFactory.

    • BaseCompositeMetric is the base class for metrics relying on another metric.

    • The ElementWiseMetric is a composite metric to compare two collections using an underlying metric; it returns a collection.

    • DatasetMetric is a composite metric to compare two Datasets row-wise using an underlying metric; it returns a Dataset.

    • The MeanMetric is a composite metric to compare two collections using an underlying metric; it returns an array.

    • The SquaredErrorMetric is a composite metric returning the squared difference between two quantities.

    #1024

  • The quality of an MLRegressorAlgo can be assessed by plotting its cross-validation error. #1122

  • MAEMeasure and MEMeasure are respectively the mean absolute error and the maximum error to assess the quality of a BaseRegressor (formerly in gemseo_mlearning).

  • GradientBoostingRegressor wraps scikit-learn's gradient boosting regressor (formerly in gemseo_mlearning).

  • MLPRegressor wraps scikit-learn's multilayer perceptron (MLP) (formerly in gemseo_mlearning).

  • OTGaussianProcessRegressor wraps OpenTURNS' Gaussian process (GP) regressor (formerly in gemseo_mlearning).

  • SVMRegressor wraps scikit-learn's support vector machine (SVM) regressor (formerly in gemseo_mlearning).

  • TPSRegressor is a specific RBFRegressor for thin plate spline (TPS) regression (formerly in gemseo_mlearning).

  • RegressorChain is a composition of BaseRegressor objects trained sequentially, each one learning the relationship between the inputs and the error made by the previous one (formerly in gemseo_mlearning). #1128

  • Quality assessment:

    • ClassifierQualityFactory is a factory of objects to assess the quality of classification algorithms.

    • ClustererQualityFactory is a factory of objects to assess the quality of clustering algorithms.

    • BaseClassifierQuality is the base class to assess the quality of classification algorithms.

    #1129

  • OptimizationProblem.evaluation_counter is an EvaluationCounter to count the number of times a new iteration is stored in OptimizationProblem.database. #1135

  • OTGaussianProcessRegressor has a new method compute_samples to generate samples from the conditioned Gaussian process. #1140

  • Resampler.plot can be used to visualize the train-test partitions. #1156

Sensitivity analysis#

  • SobolAnalysis.compute_indices can estimate the Sobol' indices more precisely using control variates. #1185

  • Use the optional argument n_replicates of SobolAnalysis.compute_indices to set the number of replicates to estimate the confidence intervals by bootstrap. #1189

  • BaseSensitivityAnalysis and its subclasses (MorrisAnalysis, SobolAnalysis, CorrelationAnalysis and HSICAnalysis) handle backup and crash management via the backup_settings argument of the compute_samples method.

  • MorrisDOE is a DOE algorithm used by MorrisAnalysis; it repeats a one-at-a-time (OAT) sampling N times, starting from N different points selected by a DOE algorithm.

  • OATDOE is a DOE algorithm used by MorrisDOE; it applies the one-at-a-time (OAT) sampling strategy, given an initial point. #1213

Miscellaneous#

  • Support for Python 3.12. #1530

  • compare_dict_of_arrays returns False in the case of variables with arrays of different sizes. #1049

  • MDANewtonRaphson.NewtonLinearSolver is the enumeration of linear solvers for the Newton method. #1084

  • MDOLinearFunction.normalize transforms a basic MDOLinearFunction into an MDOLinearFunction using a scaled input vector. #1104

  • Sellar problem:

    • The local design variables and the coupling variables are vectors of dimension \(n\) (default: 1), instead of scalars.

    • Sellar2 has a new local design variable \(x_2\) which also appears in the objective expression (default: 0).

    • The disciplines Sellar1 and Sellar2 have a coefficient \(k\) to control the strength of the coupling (default: 1).

    • The coefficient 0.2 in Sellar1 is now an input variable named \(\gamma\) (default: 0.2).

    • The coefficient 3.16 in SellarSystem is now an input variable named \(\alpha\) (default: 3.16).

    • The coefficient 24.0 in SellarSystem is now an input variable named \(\beta\) (default: 24.0).

    #1164

  • The function to_pickle saves the pickled representation of an object on the disk.

  • The function from_pickle loads the pickled representation of an object from the disk (see the sketch at the end of this list).

  • BaseFormulation.variable_sizes stores the sizes of both the design variables and the differentiated inputs. #1230

  • String tools:

    • gemseo.utils.string_tools.get_name_and_component returns a (name, component) tuple from a name or (name, component) object.

    • gemseo.utils.string_tools.convert_strings_to_iterable returns an iterable of strings from a string or an iterable of strings.

    • gemseo.utils.string_tools.filter_names filters original names from a selection of names to keep by preserving their order.

    • gemseo.utils.string_tools.get_variables_with_components returns a collection of (name, component) objects from a name or (name, component) object or a collection of such objects.

    #1243

  • The function import_database can create either a Dataset or a Database from the HDF5 file storing the Database associated with an EvaluationProblem.

  • Database has an optional argument called input_space to define the input space to which the input vectors belong; by default, this input space includes a unique variable called Database.DEFAULT_INPUT_NAME. #1303

  • The execute method of post-processing tools (see BasePost) returns a dictionary of Matplotlib figures or DatasetPlot objects, depending on whether or not the tool is based on a DatasetPlot. This allows interactive visualization in a web page when the HTML format is supported by the DatasetPlot, as is the case of BasicHistory, whose HTML version is based on the plotly library. When available, set file_extension to "html". #1308

  • DesignSpace.to_scalar_variables() creates a design space of scalar variables from a design space. #1333

  • generate_coupling_graph does not save the graph when file_path is empty. This can be useful when one simply wants to use the returned GraphView, e.g. to display it in a web page or a notebook. #1341
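
A small sketch of the to_pickle / from_pickle helpers mentioned above; it assumes they are importable from the top-level gemseo package.

```python
from gemseo import from_pickle, to_pickle

# Any picklable GEMSEO object (a discipline, a design space, ...) can be
# written to disk and read back; a plain dictionary is used here for brevity.
data = {"x": [1.0, 2.0, 3.0]}
to_pickle(data, "data.pkl")
restored = from_pickle("data.pkl")
assert restored == data
```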

Fixed#

Optimization & DOE#

  • The Database is now correctly appended at each iteration when scalar and vectorial outputs are mixed. #1194

  • DesignSpace.filter_dimensions (formerly DesignSpace.filter_dim) updates the mapping DesignSpace.__names_to_indices. #1218

  • CenteredDifferences.f_gradient works when CenteredDifferences is not instantiated from a DesignSpace. #1253

  • The algorithm MNBI correctly handles the option normalize_design_space. #1255

  • When the computation of the Lagrange multipliers crashes with scipy.optimize.nnls, fall back to scipy.optimize.lsq_linear. #1340

  • Algorithm "Augmented_Lagrangian_order_1" now updates all the Lagrange multipliers, even those which correspond to constraints that are inactive at the current iteration. #1342

  • DesignSpace.normalize no longer erroneously flags design variables with equal lower and upper bounds as not to be normalized. #1346

MDO processes#

  • When building the warm-started chain, the variables coming from both MDA 1 and MDA 2 are now considered. #1116

  • The method MDA.plot_residual_history() no longer crashes when the MDA has been executed more than once and the n_iterations argument is smaller than the total amount of iterations. #1157

  • The DesignSpace attached to the MDF formulation can be reset. #1179

  • MDOParallelChain handles disciplines whose inputs are not NumPy arrays. #1209

  • The update of local data performed at the end of MDAGaussSeidel has been removed. #1251

  • When changing the residual scaling method of an MDASequential or an MDAChain, the change is applied to all inner MDAs. #1280

  • MDAChain correctly runs with initialize_defaults set to True when some outputs of the chain are not coupling variables. #1281

  • The tolerance and linear_solver_tolerance arguments are correctly passed to the main and sub-MDAs in MDAGSNewton. #1294

  • Local data in MDA are no longer modified via self.local_data[key] = value but rather using the store_local_data method to ensure proper handling of namespaces. #1309

  • The XDSM generation in PDF format can now handle an MDA within an MDOParallelChain with the correct workflow. #1441

Surrogate models#

  • MLQualityViewer displays scalar outputs correctly. #1200

  • GaussianProcessRegressor.predict_std and GaussianProcessRegressor.compute_samples consider the output scales deduced from the training data. #1273

  • BaseMLSupervisedAlgo can be trained from an IODataset with homonymous inputs and outputs. #1274

Miscellaneous#

  • The DependencyGraph represents homonymous disciplines with specific nodes. #1264

  • BaseAlgorithmLibrary and its derived classes no longer allow users to pass settings that are not included in the validation model. In the past, unknown settings were passed on to the wrapped algorithm, which was error-prone. The ineq_tolerance setting was not used by the Scipy_MILP algorithm; it is now taken into account. #1450

Changed#

  • Database.get_function_history raises a KeyError when the database contains no value of the output, instead of returning an empty array.

  • RemappingDiscipline uses the same types for the grammar elements as the grammars of the underlying discipline. #1095

  • IODataset: the default variable names are "i" for the group "inputs" and "o" for the group "outputs". OptimizationDataset: the default variable names are "d" for the group "designs", "o" for the group "observables", "f" for the group "objectives" and "c" for the group "constraints". #1275

  • XDSM: the pdf_cleanup option removes all intermediate files, including .tikz and .tex files. #1263

  • The convergence of MDAs is now checked after the execution of the disciplines instead of after the acceleration. This change affects the absolute value of the output coupling variables when running an MDA with acceleration methods. The difference has an order of magnitude at most equal to the MDA tolerance. The change should be harmless in most cases but could nevertheless have an effect on numerically sensitive problems. #1251

  • The post-processing algorithms use the base-10 logarithm. The post-processing algorithms based on matplotlib use the tight_layout() command. #593

Removed#

  • OptHistoryView no longer plots the history of the diagonal of the Hessian matrix of the objective; use HessianHistory instead. #463

  • The QMR linear solver has been removed from ScipyLinalgAlgos. #1450

API Changes#

See Upgrading GEMSEO for more information.

Version 5.3.2 (2024-08-08)#

Added#

  • The IDF formulation now accepts options at instantiation for the MDAChain executed at the beginning of the process when start_at_equilibrium=True. #1259

Fixed#

  • The update of local data performed at the end of MDAGaussSeidel has been removed. #1251

  • BaseScenario.to_dataset can export the Database to a Dataset in the presence of homonymous inputs and outputs.

Changed#

  • The convergence of MDAs is now checked after the execution of the disciplines instead of after the acceleration. This change affects the absolute value of the output coupling variables when running an MDA with acceleration methods. The difference has an order of magnitude at most equal to the MDA tolerance. The change should be harmless in most cases but could nevertheless have an effect on numerically sensitive problems. #1251

Version 5.3.1 (2024-06-06)#

Fixed#

  • The method OptimizationProblem.to_dataset now considers the type of each variable. #1154

  • GEMSEO no longer sets the maxiter option of TNC, which does not exist; it sets maxfun instead. #1181

  • The Database export to an HDF5 file works in "append" mode and when storing data successively for the same input value. #1216

Version 5.3.0 (2024-03-28)#

Added#

  • The SciPy implementation of the Nelder-Mead gradient-free algorithm is now available. #875

  • HSICAnalysis.compute_indices also computes the p-values for screening purposes in two different ways: through permutations and from an asymptotic formula. #992

  • Added the modified Normal Boundary Intersection (mNBI) multi-objective optimization algorithm, for use with "MNBI" as algorithm name. Added the class MultiObjectiveOptimizationResult to store and display results from optimization problems with more than one objective. Added the class ParetoFront to store the points of interest of a MultiObjectiveOptimizationResult. Added the Poloni, FonsecaFlemming and Viennet analytical multi-objective optimization problems. #1012

  • The method DataConverter.is_continuous can tell if a variable is continuous. #1066

  • The Dataset class can now be created from a DataFrame using the new class method: from_dataframe. #1069

  • The boolean attribute DatasetPlot.grid allows to add a grid to the DatasetPlot. #1074

  • HSICAnalysis.compute_indices proposes two new sensitivity analysis (SA) types, namely conditional SA and target SA. #1078

  • Lines and BarPlot have HTML-based interactive versions based on plotly. #1082

  • The method AbstractCache.get_all_entries returns all the entries, whatever the tolerance.

  • The module gemseo.typing contains common type annotations. #1090

  • Dataset.from_csv has a new argument first_column_as_index, which allows reading CSV files whose first column contains the index. #1093

  • MDOFunction supports elementwise multiplication and division by NumPy arrays.

  • The addition, subtraction, multiplication or division of a function expecting normalized inputs with a function that does not expect them raises a RuntimeError.

  • The function values \(f(x)\) passed to optimizers can optionally be scaled relative to \(|f(x_0)|\), where \(x_0\) is the current value of the DesignSpace. This functionality is enabled by passing a positive value \(\epsilon\) as the option scaling_threshold of any optimizer: the function values passed to the optimizer are then \(f(x) / \max\{ |f(x_0)|, \epsilon \}\). The purpose of \(\epsilon\) is to avoid division by a value close to zero. The (default) value None of the option scaling_threshold disables the scaling (see the sketch at the end of this list). #1100

  • DOELibrary.compute_doe and compute_doe can use a variables space dimension instead of a variables space. #1102

  • Each DOE algorithm available in GEMSEO has a new option named "callbacks" to pass a list of functions to be evaluated after each call to OptimizationProblem.evaluate_functions. #1111

  • Implement "hdf_node_path" for "opt_problem","design_space" and "database".

  • Allow "opt_problem" to be exported/imported to/from a node in a specified hdf file. #1119

  • The plotting methods of SensitivityAnalysis and Statistics classes return DatasetPlot objects or figures from data visualization libraries, e.g. Matplotlib and Plotly.

  • A Plotly figure can be passed to a Plotly plot. #1121

  • OptimizationProblem.get_all_functions has a new argument original to return the original functions given to the problem. #1126

  • A property figures allows to retrieve the figures generated by a DatasetPlot. #1130

  • DriverLibrary.clear_listeners removes the listeners added by the DriverLibrary from the Database attached to the OptimizationProblem. #1134

  • The Seeder class is a seed generator for features using random numbers, e.g. DOELibrary; its get_seed method returns either the user seed or the initial seed incremented by the number of calls. #1148
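
The scaling described in the scaling_threshold entry above boils down to the following computation (illustrative only; the optimizer applies it internally).

```python
def scale(f_x: float, f_x0: float, epsilon: float) -> float:
    """Scale a function value as done when scaling_threshold=epsilon."""
    return f_x / max(abs(f_x0), epsilon)


# With |f(x_0)| well above the threshold, the value is scaled by |f(x_0)|...
print(scale(3.0, f_x0=10.0, epsilon=1e-6))  # 0.3
# ...while a near-zero |f(x_0)| is replaced by epsilon to avoid dividing by ~0.
print(scale(3.0, f_x0=1e-12, epsilon=1e-6))  # 3000000.0
```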

Fixed#

  • The stratified DOE algorithms OT_AXIAL, OT_COMPOSITE and OT_FACTORIAL correctly support the arguments n_samples, centers, dimension and levels. #88

  • The MDODiscipline can be linearized after execution when its cache_type is set to MDODiscipline.CacheType.NONE and both the inputs and outputs arguments of linearize are empty. #804

  • MDAJacobi and MDAGaussSeidel now have different XDSM representations, in line with the convention proposed in [LM12]. MDAChain is no longer represented in the XDSM. Added tests for the PDF generation of XDSMs and fixed bugs in it. #1062

  • The expression LinearComposition.expr is now correct. #1063

  • Non-continuous variables can no longer be differentiated. #1066

  • The upper-bound KS aggregation function is now actually called when aggregating constraints in an OptimizationProblem. #1075

  • The user no longer has to provide an initial design point to solve an optimization problem with gradient approximation. #1076

  • The method Scenario.set_optimization_history_backup() no longer causes the execution to crash at the first iteration when the Scenario includes equality or inequality constraints. #1089

  • The DirectoryCreator can now consider non-existing root directory while using DirectoryNamingMethod.NUMBERED. #1097

  • The closing parenthesis is no longer missing in the expression of Rosenbrock.

  • The addition, subtraction, multiplication and division of functions expecting normalized inputs yield functions expecting normalized inputs. #1100

  • CustomDOE.compute_doe no longer raises an error and works correctly. #1103

  • The axes generated by EmpiricalStatistics.plot_cdf are no longer switched. #1105

  • The BiLevel formulation can now be warm started even when the MDA1 does not exist (case of weakly coupled disciplines). #1107

  • The attribute ParameterSpace.distributions is correctly updated when renaming a random variable. #1108

  • OptimizationProblem.database is not used when use_database is False in the case of a DOELibrary. #1110

  • Dataset.to_dict_of_arrays no longer raises an AttributeError when both by_entry and by_group are True and works properly. #1112

  • When function_calls is True, OptimizationProblem.reset resets the number of calls of the original functions. #1126

  • DesignSpace.rename_variable can be applied to a variable without value. #1127

  • Requesting an optimized LHS of size 1 raises an exception instead of freezing. #1133

  • Database cannot store the same listener several times. #1134

  • The Alternate2Delta method now handles degenerated (ill-conditioned) least squares problems. In this case, the method now returns the iterate without transformations. #1137

  • Dataset.add_group works correctly when variable_names defines a single variable. #1138

  • The transformers passed to an MLAlgo are correctly applied when the fit_transformers argument of the learn method is False. #1146

  • The MDOChain Jacobian is now reproducible: the composite derivative terms are summed in an order that does not depend on the code execution. #1150

  • The PydanticGrammar was not able to validate DisciplineData objects. #1153

Changed#

  • The methods OptimizationProblem.from_hdf() and OptimizationProblem.to_hdf() no longer log messages when they are called. The method Database.to_ggobi() no longer logs messages when it is called. #579

  • The option disp of the SciPy algorithms shall now be passed as a boolean instead of an integer. #875

  • The method Scenario.set_optimization_history_backup() now starts generating plots only after the first two iterations have been computed. The OptHistoryView plots created by Scenario.set_optimization_history_backup() with the option generate_opt_plot are no longer updated at each Database.store(), only at each new iteration. #1089

  • API change: AbstractCache._{INPUTS,OUTPUTS,JACOBIAN}_GROUP has been replaced by AbstractCache.Group. #1090

  • Methods execute and linearize of gemseo.problems.sobieski.core.structure.SobieskiStructure catch the ValueError raised by the computation of the logarithm of a non-positive weight ratio. Method execute returns numpy.nan for the mass term. #1101

  • It is now possible to solve MDA instances that include non-numeric couplings (weak or strong), typically strings or arrays of strings. The non-numeric couplings are automatically filtered out during the numerical solution of the MDA. A warning message is shown in the log at DEBUG level with the variables that were filtered. #1124

  • Database.clear_listeners returns the listeners after removing them from the Database. #1134

  • OptimizationProblem.objective = mdo_function sets mdo_function.f_type to mdo_function.FunctionType.OBJ; no need to do it by hand anymore. #1141

  • The argument uniform_distribution_name of IshigamiProblem and IshigamiSpace allows to use a uniform distribution from a library other than SciPy, e.g. OpenTURNS. #1143

  • API change: SEED moved to gemseo.utils.seeder. #1148

Removed#

  • Support for reStructuredText docstring format.

  • The function get_default_option_values; use inspect.get_callable_argument_defaults(cls.__init__) instead of get_default_option_values(cls). #1059

Version 5.2.0 (2023-12-20)#

Added#

  • Setting file_format="html" in DatasetPlot.execute saves on the disk and/or displays in a web browser a plotly-based interactive plot. DatasetPlot.DEFAULT_PLOT_ENGINE is set to PlotEngine.MATPLOTLIB; this is the default plot engine used by DatasetPlot. DatasetPlot.FILE_FORMATS_TO_PLOT_ENGINES maps the file formats to the plot engines to override the default plot engine. #181

  • Add OptimizationProblem.get_last_point method to get the last point of an optimization problem. #285

  • The disciplines Concatenater, LinearCombination and Splitter now have sparse Jacobians. #423

  • The method EmpiricalStatistics.plot_barplot generates a boxplot for each variable. The method EmpiricalStatistics.plot_cdf draws the cumulative distribution function for each variable. The method EmpiricalStatistics.plot_pdf draws the probability density function for each variable. #438

  • MLRegressorQualityViewer proposes various methods to plot the quality of an MLRegressionAlgo. DatasetPlot.execute can use a file name suffix. SurrogateDiscipline.get_quality_viewer returns a MLRegressorQualityViewer. #666

  • ScatterMatrix can set any option of the pandas scatter_matrix function. ScatterMatrix can add trend curves on the scatter plots, with either the enumeration ScatterMatrix.Trend or a custom fitting technique. Scatter can add a trend curve, with either the enumeration Scatter.Trend or a custom fitting technique. #724

  • ScenarioResult is a new concept attached to a Scenario, which enables post-processing the results of a given scenario more specifically. In particular, ScenarioResult can be derived in order to implement dedicated post-processing depending on the formulation.

    • OptimizationResult.from_optimization_problem creates an OptimizationResult from an OptimizationProblem.

    • BaseFormulation.DEFAULT_SCENARIO_RESULT_CLASS_NAME is the name of the default OptimizationResult class to be used with the given formulation.

    • ScenarioResult stores the result of a Scenario from a Scenario or an HDF5 file.

    • BiLevelScenarioResult is a ScenarioResult to store the result of a Scenario using a BiLevel formulation.

    • ScenarioResultFactory is a factory of ScenarioResult.

    • Scenario.get_result returns the result of the execution of the Scenario as a ScenarioResult.

    • create_scenario_result stores the result of a Scenario from a Scenario or an HDF5 file.

    #771

  • The LinearCombination discipline now has a sparse Jacobian. #809

  • The normalize option of BasicHistory scales the data between 0 and 1 before plotting them. #841

  • The type of the coupling variables is no longer restricted to NumPy arrays thanks to data converters attached to grammars. #849

  • gemseo.mlearning.sampling is a new package with resampling techniques, such as CrossValidation and Bootstrap. MLAlgo.resampling_results stores the resampling results; a resampling result is defined by a Resampler, the machine learning algorithms generated during the resampling stage and the associated predictions. The methods offered by MLQualityMeasure to estimate a quality measure by resampling have a new argument called store_resampling_result to store the resampling results and reuse them to estimate another quality measure faster. #856

  • SciPyDOE is a new DOELibrary based on SciPy, with five algorithms: crude Monte Carlo, Halton sequence, Sobol' sequence, Latin hypercube sampling and Poisson disk sampling. #857

  • When third-party libraries do not handle sparse Jacobians, a preprocessing step is used to convert them as dense NumPy arrays. #899

  • R2Measure.evaluate_bootstrap is now implemented. #914

  • Add diagrams in the documentation to illustrate the architecture and usage of ODEProblem. #922

  • MDA can now handle disciplines with matrix-free Jacobians. To define a matrix-free Jacobian, the user must fill in the MDODiscipline.jac dictionary with a JacobianOperator overloading the _matvec and _rmatvec methods to implement the matrix-vector and transposed matrix-vector products, respectively (see the sketch at the end of this list). #940

  • The SimplerGrammar is a grammar based on element names only; it is even simpler than SimpleGrammar, which considers both names and types. #949

  • HSICAnalysis is a new SensitivityAnalysis based on the Hilbert-Schmidt independence criterion (HSIC). #951

  • Add the Augmented Lagrangian Algorithm implementation. #959

  • Support for Python 3.11. #962

  • Optimization problems with inequality constraints can be reformulated with only bounds and equality constraints and additional slack variables thanks to the public method: OptimizationProblem.get_reformulated_problem_with_slack_variables. #963

  • The subtitle of the graph generated by SobolAnalysis.plot includes the standard deviation of the output of interest in addition to its variance. #965

  • OTDistributionFactory is a DistributionFactory limited to OTDistribution objects. SPDistributionFactory is a DistributionFactory limited to SPDistribution objects. The base_class_name attribute of get_available_distributions can limit the probability distributions to a specific library, e.g. "OTDistribution" for OpenTURNS and "SPDistribution" for SciPy. #972

  • The use_one_line_progress_bar driver option allows to display only one iteration of the progress bar at a time. #977

  • OTWeibullDistribution is the OpenTURNS-based Weibull distribution. SPWeibullDistribution is the SciPy-based Weibull distribution. #980

  • MDAChain has an option to initialize the default inputs by creating an MDOInitializationChain at the first execution. #981

  • The upper bound KS function is added to the aggregation functions. The upper bound KS function is an offset of the lower bound KS function already implemented. #985

  • The CenteredDifferences approximation mode is now supported for Jacobian computation. It can be used to compute the Jacobians of MDODiscipline and MDOFunction objects by setting the Jacobian approximation mode, as for the finite-difference and complex-step schemes. This is a second-order approach that uses twice as many points as the first-order finite-difference scheme but provides second-order accuracy. When computing a centered difference at one of the two bounds of the design space, the finite-difference scheme is used instead. #987

  • The class SobieskiDesignSpace deriving from DesignSpace can be used in the Sobieski's SSBJ problem. It offers new filtering methods, namely filter_coupling_variables and filter_design_variables. #1003

  • The MDODiscipline can flag linear relationships between inputs and outputs. This enables the FunctionFromDiscipline generated from these MDODiscipline to be instances of LinearMDOFunction. An OptimizationProblem is now by default a linear problem unless a non-linear objective or constraint is added to the optimization problem. #1008

  • The following methods now have an option as_dict to request the return values as dictionaries of NumPy arrays instead of straight NumPy arrays: DesignSpace.get_lower_bounds, DesignSpace.get_upper_bounds, OptimizationProblem.get_x0_normalized and DriverLibrary.get_x0_and_bounds_vects. #1010

  • gemseo.SEED is the default seed used by GEMSEO for random number generators. #1011

  • HiGHS solvers for linear programming interfaced by SciPy are now available. #1016

  • Augmented Lagrangian can now pass some of the constraints to the sub-problem and deal with the rest of them thanks to the sub_problem_constraints option. #1026

  • An example on the usage of the MDODiscipline.check_jacobian method was added to the documentation. Three derivative approximation methods are discussed: finite differences, centered differences and complex step. #1039

  • The TaylorDiscipline class can be used to create the first-order Taylor polynomial of an MDODiscipline at a specific expansion point. #1042

  • The following machine learning algorithms have an argument random_state to control the generation of random numbers: RandomForestClassifier, SVMClassifier, GaussianMixture, KMeans, GaussianProcessRegressor, LinearRegressor and RandomForestRegressor. Use an integer for reproducible results (default behavior). #1044

  • BaseAlgoFactory.create initializes the grammar of algorithm options when it is called with an algorithm name. #1048
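
A sketch of a matrix-free Jacobian as described in the entry above (#940): a JacobianOperator subclass overloading _matvec and _rmatvec. The import path and constructor arguments are assumptions and may differ across GEMSEO versions.

```python
import numpy as np

# Assumed import path; adapt it to your GEMSEO version.
from gemseo.core.derivatives.jacobian_operator import JacobianOperator


class DiagonalJacobianOperator(JacobianOperator):
    """Matrix-free Jacobian equal to a diagonal matrix diag(d)."""

    def __init__(self, diagonal: np.ndarray) -> None:
        # The constructor arguments (dtype, shape) are assumptions based on
        # the SciPy LinearOperator convention.
        super().__init__(dtype=diagonal.dtype, shape=(diagonal.size, diagonal.size))
        self._diagonal = diagonal

    def _matvec(self, x: np.ndarray) -> np.ndarray:
        # Matrix-vector product J @ x.
        return self._diagonal * x

    def _rmatvec(self, x: np.ndarray) -> np.ndarray:
        # Transposed matrix-vector product J.T @ x (J is symmetric here).
        return self._diagonal * x


# The operator would then be stored in MDODiscipline.jac, e.g.
# discipline.jac["y"]["x"] = DiagonalJacobianOperator(np.array([1.0, 2.0]))
```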

Fixed#

  • There is no longer overlap between learning and test samples when using a cross-validation technique to estimate the quality measure of a machine learning algorithm. #915

  • Security vulnerability when calling subprocess.run with shell=True. #948

  • Fixed a bug in the LagrangeMultipliers evaluation when bound constraints are active on variables that have only one bound. #964

  • The iteration rate is displayed with appropriate units in the progress bar. #973

  • AnalyticDiscipline casts SymPy outputs to appropriate NumPy data types (as opposed to systematically casting to float64). #974

  • AnalyticDiscipline no longer systematically casts inputs to float. #976

  • MDODiscipline.set_cache_policy can use MDODiscipline.CacheType.NONE as cache_type value to remove the cache of the MDODiscipline. #978

  • The normalization methods of DesignSpace no longer emit a RuntimeWarning about a division by zero when the lower and upper bounds are equal. #1002

  • The types used with PydanticGrammar.update_from_types with merge=True are taken into account. #1006

  • DesignSpace.dict_to_array returns an ndarray whose attribute dtype matches the "common dtype" of the values of its dict argument design_values corresponding to the keys passed in its argument variables_names. So far, the dtype was erroneously based on all the values of design_values. #1019

  • DisciplineData with nested dictionary can now be serialized with json. #1025

  • Full-factorial design of experiments: the actual number of samples computed from the maximum number of samples and the dimension of the design space is now robust to numerical precision issues. #1028

  • DOELibrary.execute raises a ValueError when a component of the DesignSpace is unbounded and the DesignSpace is not a ParameterSpace. DOELibrary.compute_doe raises a ValueError when unit_sampling is False, a component of the design space is unbounded and the DesignSpace is not a ParameterSpace. #1029

  • OptimizationProblem.get_violation_criteria no longer considers the non-violated components of the equality constraints when calculating the violation measure. #1032

  • A JSONGrammar using namespaces can be serialized correctly. #1041

  • RadarChart displays the constraints at iteration i when iteration=i. #1054

Changed#

  • API:

    • The class RunFolderManager is renamed DirectoryGenerator.

    • The class FoldersIter is renamed Identifiers.

    • The signature of the class DirectoryGenerator has changed:

      • folders_iter is replaced by identifiers

      • output_folder_basepath is replaced by root_directory

    #878

  • The subpackage gemseo.mlearning.data_formatters includes the DataFormatters used by the learning and prediction methods of the machine learning algorithms. #933

  • The argument use_shell of the discipline DiscFromExe is no longer taken into account; executables are now always executed without a shell. #948

  • The existing KS function aggregation is renamed as lower_bound_KS. #985

  • The log of the ProgressBar no longer displays the initialization of the progress bar. #988

  • The samples option of the algorithm CustomDOE can be a 2D array shaped as (n_samples, total_variable_size), a dictionary shaped as {variable_name: variable_samples, ...} where variable_samples is a 2D array shaped as (n_samples, variable_size), or a list of length n_samples shaped as [{variable_name: variable_sample, ...}, ...] where variable_sample is a 1D array shaped as (variable_size,) (see the sketch at the end of this list). #999

  • PydanticGrammar has been updated to support Pydantic v2. For such grammars, NumPy ndarrays shall be typed with gemseo.core.grammars.pydantic_ndarray.NDArrayPydantic instead of the standard ndarray or NDArray-based annotations. #1017

  • The example on how to do a Pareto Front on the Binh Korn problem now uses a BiLevel formulation instead of an MDOScenarioAdapter manually embedded into a DOEScenario. #1040

  • ParameterSpace.__str__ no longer displays the current values, the bounds and the variable types when all the variables are uncertain. #1046
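
The three accepted formats for the samples option of CustomDOE (see the entry above), illustrated for two variables "x" (size 2) and "z" (size 1) and three samples; the variable names are arbitrary.

```python
import numpy as np

# 2D array shaped (n_samples, total_variable_size) = (3, 3).
samples_as_array = np.array(
    [[0.0, 0.0, 1.0], [0.5, 0.5, 2.0], [1.0, 1.0, 3.0]]
)

# Dictionary of 2D arrays shaped (n_samples, variable_size).
samples_as_dict = {
    "x": np.array([[0.0, 0.0], [0.5, 0.5], [1.0, 1.0]]),
    "z": np.array([[1.0], [2.0], [3.0]]),
}

# List of length n_samples of dictionaries of 1D arrays shaped (variable_size,).
samples_as_list = [
    {"x": np.array([0.0, 0.0]), "z": np.array([1.0])},
    {"x": np.array([0.5, 0.5]), "z": np.array([2.0])},
    {"x": np.array([1.0, 1.0]), "z": np.array([3.0])},
]
```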

Removed#

  • Support for Python 3.8. #962

Version 5.1.1 (2023-10-04)#

Security#

Upgrade the dependency pillow to mitigate a vulnerability.

Version 5.1.0 (2023-10-02)#

Added#

  • The argument scenario_log_level of MDOScenarioAdapter allows to change the level of the root logger during the execution of its scenario.

  • The argument sub_scenarios_log_level of BiLevel allows to change the level of the root logger during the execution of its sub-scenarios. #370

  • DesignSpace has a pretty HTML representation. #504

  • The method add_random_vector() adds a random vector with independent components to a ParameterSpace from a probability distribution name and parameters. These parameters can be set component-wise. #551

  • The high-level function create_dataset returns an empty Dataset by default with a default name. #721

  • OptimizationResult has new fields x_0_as_dict and x_opt_as_dict mapping the names of the design variables to their initial and optimal values. #775

  • Enable the possibility of caching sparse Jacobian with cache type HDF5Cache. #783

  • Acceleration methods for MDAs are defined in dedicated classes inheriting from SequenceTransformer.

    Available sequence transformers are:

    • The alternate 2-δ method: Alternate2Delta.

    • The alternate δ² method: AlternateDeltaSquared.

    • The secant method: Secant.

    • The Aitken method: Aitken.

    • The minimum polynomial method: MinimumPolynomial.

    • The over-relaxation: OverRelaxation.

    #799

  • The values of the constraints can be passed to method OptimizationProblem.get_number_of_unsatisfied_constraints. #802

  • RegressorQualityFactory is a factory of MLErrorMeasure.

  • SurrogateDiscipline.get_error_measure returns an MLErrorMeasure to assess the quality of a SurrogateDiscipline; use one of its evaluation methods to compute it, e.g. evaluate_learn to compute a learning error. #822

  • The DatasetFactory is a factory of Dataset.

  • The high-level function create_dataset can return any type of Dataset. #823

  • Dataset has a string property summary returning some information, e.g. number of entries, number of variable identifiers, ... #824

  • MLAlgo.__repr__ now returns what MLAlgo.__str__ returned before this change, and MLAlgo.__str__ no longer overloads MLAlgo.__repr__. #826

  • The method Dataset.to_dict_of_arrays can break down the result by entry with the boolean argument by_entry whose default value is False. #828

  • Added Scipy MILP solver wrapper. #833

  • DesignSpace.get_variables_indexes features a new optional argument use_design_space_order to switch the order of the indexes between the design space order and the user order. #850

  • ScalableProblem.create_quadratic_programming_problem handles the case where uncertain vectors are added in the coupling equations. #863

  • MDODisciplineAdapterGenerator can use a dictionary of variable sizes at instantiation. #870

  • The multi-processing start method (spawn or fork) can now be chosen. #885

  • Acceleration methods and over-relaxation are now available for MDAJacobi, MDAGaussSeidel and MDANewtonRaphson. They are configured at initialization via the acceleration_method and over_relaxation_factor and can be modified afterward via the attributes MDA.acceleration_method and MDA.over_relaxation_factor.

    Available acceleration methods are:

    • Alternate2Delta,

    • AlternateDeltaSquared,

    • Aitken,

    • Secant,

    • MinimumPolynomial.

    #900

  • CouplingStudyAnalysis has a new method generate_coupling_graph.

  • The CLI gemseo-study generates the condensed and full coupling graphs as PDF files. #910

  • The check_disciplines_consistency function checks if two disciplines compute the same output and raises an error or logs a warning message if this is the case.

  • MDOCouplingStructure logs a message with WARNING level if two disciplines compute the same output. #912

  • The default value of an input variable of a LinearCombination is zero. #913

  • BaseFactory.create supports positional arguments. #918

  • The algorithms of a DriverLibrary have a new option "log_problem" (default: True). Set it to False so as not to log the sections related to an optimization problem, namely the problem definition, the optimization result and the evolution of the objective value. This can be useful when a DOEScenario is used as a pure sampling scenario. #925

  • SensitivityAnalysis.plot_bar and SensitivityAnalysis.plot_radar have new arguments sort and sorting_output to sort the uncertain variables by decreasing order of the sensitivity indices associated with a sorting output variable.

  • DatasetPlot has a new argument xtick_rotation to set the rotation angle of the x-ticks for a better readability when the number of ticks is important. #930

  • SensitivityAnalysis.to_dataset stores the second-order Sobol' indices in the dictionary Dataset.misc with the key "second". #936

  • The string representation of a ComposedDistribution uses both the string representations of the marginals and the string representation of the copula.

  • The string representation of a Distribution uses both the string representation of its parameters and its dimension when the latter is greater than 1. #937

  • The default value of the argument outputs of the methods plot_bar and plot_radar of SensitivityAnalysis is (). In this case, the SensitivityAnalysis uses all the outputs. #941

  • N2HTML can use any sized default inputs (NumPy arrays, lists, tuples, ...) to deduce the size of the input variables. #945

Fixed#

  • Fix the MDA residual scaling strategy based on sub-residual norms. #957

  • An XDSM can now take into account several levels of nested scenarios as well as nested MDA. An XDSM with a nested Scenario can also take into account more complex formulations than DisciplinaryOpt, such as MDF. #687

  • The properties of the JSONGrammar created by BaseFactory.get_options_grammar are no longer required. #772

  • If time_vector is not user-specified, it is generated by the solver; the array generated by the solver is then stored in the ODEResult. #778

  • Fix plot at the end of the Van der Pol tutorial illustrating an ODEProblem. #806

  • The high-level function create_dataset returns a base Dataset by default. #823

  • SurrogateDiscipline.__str__ is less verbose by inheriting from MDODiscipline; use SurrogateDiscipline.__repr__ to get the former, more detailed representation. #837

  • OptHistoryView can be executed with variable_names=None to explicitly display all the design variables.

  • The variable names specified with the argument variable_names of OptHistoryView are correctly considered.

  • OptimizationProblem.from_hdf sets pb_type and differentiation_method as string.

  • OptHistoryView, ObjConstrHist and ConstraintsHistory display a limited number of iterations on the x-axis to make it more readable by avoiding xtick overlay.

  • DesignSpace has a new property names_to_indices defining the design vector indices associated with variable names. #838

  • execute_post can post-process a Path. #846

  • The MDA chain can change at once the max_mda_iter of all its MDAs. The behaviour of the max_mda_iter of this class has been changed to do so. #848

  • The methods to_dataset build Dataset objects in one go instead of adding variables one by one. #852

  • CorrelationAnalysis and SobolAnalysis use the input names in the order provided by the ParameterSpace. #853

  • The RunFolderManager can now work with a non-empty output_folder_basepath when using folders_iter = FoldersIter.NUMBERED. Their name can be different from a number. #865

  • The argument output_names of MorrisAnalysis works properly again. #866

  • The argument n_samples passed to MorrisAnalysis is correctly taken into account. #869

  • DOELibrary works when the design variables have no default value. #870

  • The generation of XDSM diagrams for MDA looping over MDOScenarios. #879

  • BarPlot now correctly handles a Dataset whose number of rows is higher than the number of variables. #880

  • The DOE algorithms consider the optional seed when it is equal to 0 and use the driver's one when it is missing. #886

  • PCERegressor now handles multidimensional random input variables. #895

  • get_all_inputs and get_all_outputs return sorted names and so are now deterministic. #901

  • OptHistoryView no longer logs a warning when post-processing an optimization problem whose objective gradient history is empty. #902

  • The string representation of an MDOFunction is now correct even after several sign changes. #917

  • The sampling phase of a SensitivityAnalysis no longer reproduces the full log of the DOEScenario. Only the disciplines, the MDO formulation and the progress bar are considered. #925

  • The Correlations plot now labels its subplots correctly when the constraints of the optimization problem include an offset. #931

  • The string representation of a Distribution no longer sorts the parameters. #935

  • SobolAnalysis can export the indices to a Dataset, even when the second-order Sobol' indices are computed. #936

  • One can no longer add two random variables with the same name in a ParameterSpace. #938

  • SensitivityAnalysis.plot_bar and SensitivityAnalysis.plot_radar use all the outputs when the argument outputs is empty (e.g. None, "" or ()). #941

  • A DesignSpace containing a design variable without current value can be used to extend another DesignSpace. #947

  • Security vulnerability when calling subprocess.run with shell=True. #948

Changed#

  • Distribution: the default value of variable is "x"; same for OTDistribution, SPDistribution and their sub-classes.

  • SPDistribution: the default values of interfaced_distribution and parameters are uniform and {}.

  • OTDistribution: the default values of interfaced_distribution and parameters are Uniform and (). #551

  • The high-level function create_dataset raises a ValueError when the file has a wrong extension. #721

  • The performance of MDANewtonRaphson was improved. #791

  • The class KMeans uses "auto" as the default value for the argument n_init of scikit-learn's KMeans class. #825

  • output_names was added to MDOFunction.DICT_REPR_ATTR so that it is exported when saving to an HDF file. #860

  • OptimizationProblem.minimize_objective is now a property that changes the sign of the objective function if needed. #909

  • The name of the MDOFunction resulting from the sum (resp. subtraction, multiplication, division) of two MDOFunctions named "f" and "g" is "[f+g]" (resp. "[f-g]", "[f*g]", "[f/g]").

  • The name of the MDOFunction defined as the opposite of the MDOFunction named "f" is "-f".

  • In the expression of an MDOFunction resulting from the multiplication or division of MDOFunctions, the expression of an operand is now grouped with round parentheses if this operand is a sum or subtraction. For example, for "f(x) = 1+x" and "g(x) = x", the resulting expression for f*g is "[f*g](x) = (1+x)*x".

  • The expression of the MDOFunction defined as the opposite of itself is -(expr) (see the sketch at the end of this list). #917

  • Renamed MLQualityMeasure.evaluate_learn to MLQualityMeasure.compute_learning_measure.

  • Renamed MLQualityMeasure.evaluate_test to MLQualityMeasure.compute_test_measure.

  • Renamed MLQualityMeasure.evaluate_kfolds to MLQualityMeasure.compute_cross_validation_measure.

  • Renamed MLQualityMeasure.evaluate_loo to MLQualityMeasure.compute_leave_one_out_measure.

  • Renamed MLQualityMeasure.evaluate_bootstrap to MLQualityMeasure.compute_bootstrap_measure. #920

  • The argument use_shell of the discipline DiscFromExe is no longer taken into account; executables are now always executed without a shell. #948
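
A sketch of the naming and expression rules described above for MDOFunction arithmetic; the import path of MDOFunction differs across GEMSEO versions and is an assumption here.

```python
from gemseo.core.mdofunctions.mdo_function import MDOFunction  # path may differ

f = MDOFunction(lambda x: 1 + x, "f", expr="1+x")
g = MDOFunction(lambda x: x, "g", expr="x")

product = f * g
print(product.name)  # "[f*g]"
print(product.expr)  # the sum operand is grouped: "(1+x)*x"
print((-f).name)     # "-f"
print((-f).expr)     # "-(1+x)"
```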

Version 5.0.1 (2023-09-07)#

Added#

  • The MDAJacobi performance and memory usage were improved. #882

Fixed#

  • The MDAJacobi executions are now deterministic. The MDAJacobi m2d acceleration is deactivated when the least square problem is not well solved. #882

Version 5.0.0 (2023-06-02)#

Main GEMSEO API breaking changes#

  • The high-level functions defined in gemseo.api have been moved to gemseo.

  • Features have been extracted from GEMSEO and are now available in the form of plugins:

    • gemseo.algos.opt.lib_pdfo has been moved to gemseo-pdfo, a GEMSEO plugin for the PDFO library,

    • gemseo.algos.opt.lib_pseven has been moved to gemseo-pseven, a GEMSEO plugin for the pSeven library,

    • gemseo.wrappers.matlab has been moved to gemseo-matlab, a GEMSEO plugin for MATLAB,

    • gemseo.wrappers.template_grammar_editor has been moved to gemseo-template-editor-gui, a GUI to create input and output file templates for DiscFromExe.

Added#

  • PCERegressor has new arguments:

    • use_quadrature to estimate the coefficients by quadrature rule or least-squares regression.

    • use_lars to get a sparse PCE with the LARS algorithm in the case of the least-squares regression.

    • use_cleaning and cleaning_options to apply a cleaning strategy removing the non-significant terms.

    • hyperbolic_parameter to truncate the PCE before training.

    #496

  • The argument scale of PCA allows to scale the data before reducing their dimension. #743

  • GradientSensitivity plots the positive derivatives in red and the negative ones in blue for easy reading. #725

  • TopologyView allows to visualize the solution of a 2D topology optimization problem. #739

  • ConstraintsHistory uses horizontal black dashed lines for tolerance. #664

  • Animation is a new OptPostProcessor to generate an animated GIF from an OptPostProcessor. #740

  • JSchedulerDisciplineWrapper can submit the execution of disciplines to a HPC job scheduler. #613

  • MDODiscipline has now a virtual execution mode; when active, MDODiscipline.execute returns its MDODiscipline.default_outputs, whatever the inputs. #558

  • Improve the computation of MDA residuals with the following new strategies:

    • each sub-residual is scaled by the corresponding initial norm,

    • each component is scaled by the corresponding initial component,

    • the Euclidean norm of the component-wise division by initial residual scaled by the problem size.

    #780

  • OTComposedDistribution can consider any copula offered by OpenTURNS. #655

  • Scenario.xdsmize returns an XDSM; its XDSM.visualize method displays the XDSM in a web browser; this object also has an HTML view. #564

  • Add a new grammar type based on Pydantic: PydanticGrammar. This new grammar is still experimental and subject to change; use with caution. #436

  • XLSStudyParser has a new argument has_scenario whose default value is True; if False, the sheet Scenario is not required.

  • CouplingStudyAnalysis allows to generate an N2 diagram from an XLS file defining the disciplines in terms of input and output names.

  • MDOStudyAnalysis allows to generate an N2 diagram and an XDSM from an XLS file defining an MDO problem in terms of disciplines, formulation, objective, constraint and design variables. #696

  • JSONGrammar can validate PathLike objects. #759

  • Enable sparse matrices in the utils.comparisons module. #779

  • The method MDODiscipline._init_jacobian now supports sparse matrices.

  • Stopping options "max_time" and "stop_crit_n_x" can now be used with the global optimizers of SciPy ("DIFFERENTIAL_EVOLUTION", "DUAL_ANNEALING" and "SHGO"). #663

  • Add exterior penalty approach to reformulate OptimizationProblem with constraints into one without constraints. #581

  • Documentation: the required parameters of optimization, DOE and linear solver algorithms are documented in dedicated sections. #680

  • The MDOLinearFunction expression can be passed as an argument to the instantiation. This can be useful for large numbers of inputs or outputs to avoid long computation times for the expression string. #697

  • Enable sparse coefficients for MDOLinearFunction. #756

  • SobolAnalysis provides the SobolAnalysis.output_variances and SobolAnalysis.output_standard_deviations. SobolAnalysis.unscale_indices allows to unscale the Sobol' indices using SobolAnalysis.output_variances or SobolAnalysis.output_standard_deviations. SobolAnalysis.plot now displays the variance of the output variable in the title of the graph. #671

  • CorrelationAnalysis proposes two new sensitivity methods, namely Kendall rank correlation coefficients (CorrelationAnalysis.kendall) and squared standard regression coefficients (CorrelationAnalysis.ssrc). #654

  • Factory for algorithms (BaseAlgoFactory) can cache the algorithm libraries to provide speedup. #522

  • When keep_opt_history=True, the databases of a MDOScenarioAdapter can be exported in HDF5 files. #607

  • The argument use_deep_copy has been added to the constructor of MDOParallelChain class. This controls the use of deepcopy when running MDOParallelChain. By default this is set to False, as a performance improvement has been observed in use cases with a large number of disciplines. The old behaviour of using a deep copy of MDOParallelChain.local_data can be enabled by setting this option to True. This may be necessary in some rare combination of MDOParallelChain and other disciplines that directly modify the MDODiscipline.input_data. #527

  • Added a new RunFolderManager to generate unique run directory names for DiscFromExe, either as successive integers or as UUIDs. #648

  • ScenarioAdapter is a Factory of MDOScenarioAdapter. #684

  • A new MDOWarmStartedChain allows users to warm start some inputs of the chain with the output values of the previous run. #665

  • The method Dataset.to_dict_of_arrays converts a Dataset into a dictionary of NumPy arrays indexed by variable names or group names. #793

Fixed#

  • MinMaxScaler and StandardScaler handle constant data without RuntimeWarning. #719

  • The different kinds of OptPostProcessor displaying iteration numbers start counting at 1. #601

  • The option fig_size passed to OptPostProcessor.execute is now taken into account. #641

  • The subplots of ConstraintsHistory use their own y-limits. #656

  • The visualization ParallelCoordinates uses the names of the design variables defined in the DesignSpace instead of default ones. #675

  • MDODiscipline.linearize with compute_all_jacobians=False (default value) computes the Jacobians only for the inputs and outputs defined with MDODiscipline.add_differentiated_inputs and MDODiscipline.add_differentiated_outputs if any; otherwise, it returns an empty dictionary; if compute_all_jacobians=True, it considers all the inputs and outputs. #644

  • Fixed a bug concerning the linearization of an MDOScenarioAdapter including disciplines that both depend only on MDOScenarioAdapter inputs and are linearized in the MDOScenarioAdapter._run method. Tests covering this behavior were added. #651

  • AutoPyDiscipline can wrap a Python function with multiline return statements. #661

  • Modify the computation of total derivatives in the presence of state variables to avoid unnecessary calculations. #686

  • Modify the default linear solver calling sequence to prevent the use of the splu function on SciPy LinearOperator objects. #691

  • Fix Jacobian of MDOChain including Splitter disciplines. #764

  • Corrected typing issues that caused an exception to be raised when a custom parser was passed to the DiscFromExe at instantiation. #767

  • The method MDODiscipline._init_jacobian when fill_missing_key=True now creates the missing keys. #782

  • It is now possible to pass a custom name to the XLSDiscipline at instantiation. #788

  • get_available_mdas no longer returns the abstract class MDA. #795

  • OptimizationProblem.to_dataset uses the order of the design variables given by the ParameterSpace to build the Dataset. #626

  • Database.get_complete_history raises a ValueError when asking for a non-existent function. #670

  • The DOE algorithm OT_FACTORIAL handles correctly the tuple of parameters (levels, centers); this DOE algorithm does not use n_samples. The DOE algorithm OT_FULLFACT handles correctly the use of n_samples as well as the use of the parameters levels; this DOE algorithm can use either n_samples or levels. #676

  • The required properties are now available in the grammars of the DOE algorithms. #680

  • The stopping criteria for the objective function variation are only activated if the objective value is stored in the database in the last iterations. #692

  • The GradientApproximator and its subclasses no longer include closures preventing serialization. #700

  • A constraint aggregation MDOFunction is now capable of dealing with complex ndarray inputs. #716

  • Fix OptimizationProblem.is_mono_objective, which returned wrong values when the objective had a single output name but a multidimensional value. #734

  • Fix the behavior of the DesignSpace.filter_dim method for lists of indices containing more than one index. #746

  • SensitivityAnalysis.to_dataset works correctly with several methods and the returned Dataset can be exported to a DataFrame. #640

  • OTDistribution can now truncate a probability distribution on both sides. #660

  • The method OptProblem.constraint_names is now built on the fly from the constraints. This fixes the updating of the constraint names when the constraints are modified, as is the case with the aggregation of constraints. #669

  • Factory considers the base class as an available class when it is not abstract. #685

  • Serialization of paths in discipline attributes and local_data across operating systems. #711

Changed#

  • JSONGrammar no longer merges the definition of a property when using the dictionary-like update methods. The usual dictionary behavior is now used, such that the definition of a property is overwritten. The previous behavior can be restored by passing the argument merge=True. #708

  • CorrelationAnalysis no longer proposes the signed standard regression coefficients (SSRC), as it has been removed from openturns. #654

  • Splitter, Concatenater, DensityFilter, and MaterialModelInterpolation disciplines use sparse Jacobians. #745

  • The minimum value of the seed used by a DOE algorithm is 0. #727

  • Parametric scalable problem gemseo.problems.scalable.parametric.scalable_problem.ScalableProblem:

    • The configuration of the scalable disciplines is done with ScalableDisciplineSettings.

    • The method gemseo.problems.scalable.parametric.scalable_problem.ScalableProblem.create_quadratic_programming_problem returns the corresponding quadratic programming (QP) problem as an OptimizationProblem.

    • The argument alpha (default: 0.5) defines the share of feasible design space.

    #717

API changes#

See Upgrading GEMSEO for more information.

Version 4.3.0 (2023-02-09)#

Added#

  • Statistics.compute_joint_probability computes the joint probability of the components of random variables while Statistics.compute_probability computes their marginal ones. #542

  • MLErrorMeasure can split the multi-output measures according to the output names. #544

  • SobolAnalysis.compute_indices has a new argument to change the level of the confidence intervals. #599

  • MDOInitializationChain can compute the input data for a MDA from incomplete default_inputs of the disciplines. #610

  • Add a new execution status for disciplines: "STATUS_LINEARIZE" when the discipline is performing the linearization. #612

  • ConstraintsHistory:

    • One can add one point per iteration on the blue line (default behavior).

    • The line style can be changed (dashed line by default).

    • The types of the constraint are displayed.

    • The equality constraints are plotted with the OptPostProcessor.eq_cstr_cmap.

    #619

  • Users can now choose whether OptimizationProblem.current_iter should be set to 0 before the execution of an OptimizationProblem by passing the algo option reset_iteration_counters. This is useful to complete the execution of a Scenario from a backup file without exceeding the requested max_iter or n_samples. #636

Fixed#

  • HDF5Cache.hdf_node_name returns the name of the node of the HDF file in which the data are cached. #583

  • The histories of the objective and constraints generated by OptHistoryView no longer return an extra iteration. #591

  • The histories of the constraints and diagonal of the Hessian matrix generated by OptHistoryView use the scientific notation. #592

  • ObjConstrHist correctly manages the objectives to maximize. #594

  • Statistics.n_variables no longer corresponds to the number of variables in the Statistics.dataset but to the number of variables considered by Statistics. ParametricStatistics correctly handles variables with dimension greater than one. ParametricStatistics.compute_a_value uses 0.99 as coverage level and 0.95 as confidence level. #597

  • The input data provided to the discipline by a DOE did not match the type defined in the design space. #606

  • The cache of a self-coupled discipline cannot be exported to a dataset. #608

  • The ConstraintsHistory draws the vertical line at the right position when the constraint is satisfied at the final iteration. #616

  • Fixed remaining time unit inconsistency in progress bar. #617

  • The attribute fig_size of save_show_figure impacts the figure when show is True. #618

  • Transformer handles both 1D and 2D arrays. #624

  • SobolAnalysis no longer depends on the order of the variables in the ParameterSpace. #626

  • ParametricStatistics.plot_criteria plots the confidence level on the right subplot when the fitting criterion is a statistical test. #627

  • CorrelationAnalysis.sort_parameters uses the rule "The higher the absolute correlation coefficient the better". #628

  • Fix the parallel execution and the serialization of LinearCombination discipline. #638

  • Fix the parallel execution and the serialization of ConstraintAggregation discipline. #642

Changed#

  • Statistics.compute_probability computes one probability per component of the variables. #542

  • The history of the diagonal of the Hessian matrix generated by OptHistoryView displays the names of the design variables on the y-axis. #595

  • QuadApprox now displays the names of the design variables. #596

  • The methods SensitivityAnalysis.plot_bar and SensitivityAnalysis.plot_comparison of SensitivityAnalysis use two decimal places by default for better readability. #603

  • BarPlot, SobolAnalysis.plot and MorrisAnalysis.plot use a grid for better readability. #604

  • Dataset.export_to_dataframe can either sort the columns by group, name and component, or only by group and component. #622

  • OptimizationProblem.export_to_dataset uses the order of the design variables given by the ParameterSpace to build the Dataset. #626

Version 4.2.0 (2022-12-22)#

Added#

  • Add a new property to MatlabDiscipline in order to get access to the MatlabEngine instance attribute. #536

  • Independent MDAs in an MDAChain can be run in parallel; the MDAChain now has an option to run the independent branches of the process in parallel. #587

  • The Ishigami use case to illustrate and benchmark UQ techniques (IshigamiFunction, IshigamiSpace, IshigamiProblem and IshigamiDiscipline). #517

  • An MDODiscipline can now be composed of MDODiscipline.disciplines. #520

  • SobolAnalysis can compute the SobolAnalysis.second_order_indices. SobolAnalysis uses asymptotic distributions by default to compute the confidence intervals. #524

  • PCERegressor has a new attribute PCERegressor.second_sobol_indices. #525

  • The DistributionFactory has two new methods: DistributionFactory.create_marginal_distribution and DistributionFactory.create_composed_distribution. #526

  • SobieskiProblem has a new attribute USE_ORIGINAL_DESIGN_VARIABLES_ORDER to order the design variables of the SobieskiProblem.design_space according to their original order ("x_shared", "x_1", "x_2" and "x_3") rather than the gemseo one, as SobieskiProblem and SobieskiBase are based on this original order. #550

Fixed#

  • Fix the XDSM workflow of a sequential sequence within a parallel sequence. #586

  • Factory no longer considers abstract classes. #280

  • When the DOELibrary.execute is called twice with different DOEs, the functions attached to the OptimizationProblem are correctly sampled during the second execution and the results correctly stored in the Database. #435

  • A ParameterSpace prevents the mixing of probability distributions coming from different libraries. #495

  • MinMaxScaler and StandardScaler can now deal with constant variables. #512

  • The options use_database, round_ints and normalized_design_space passed to DriverLib.execute are no longer ignored. #537

  • OptimizationProblem casts the complex numbers to real when exporting its OptimizationProblem.database to a Dataset. #546

  • PCERegressor computes the Sobol' indices for all the output dimensions. #557

  • Fixed a bug in HDF5FileSingleton that caused the HDF5Cache to crash when writing data that included arrays of string. #559

  • OptProblem.get_violation_criteria is inf for constraints with NaN values. #561

  • Fixed a bug in the iterations progress bar, that displayed inconsistent objective function and duration values. #562

  • NormFunction and NormDBFunction now use the MDOFunction.special_repr of the original MDOFunction. #568

  • DOEScenario and MDOScenario can be serialized after an execution. Added missing _ATTR_TO_SERIALIZE to MDOChain and MDOScenarioAdapter. #578

Changed#

  • Since version 4.1.0, when using a DOE, an integer variable passed to a discipline is cast to a floating point number. The previous behavior will be restored in version 4.2.1.

  • The batches requested by pSeven are evaluated in parallel. #207

  • The LagrangeMultipliers of a non-solved OptimizationProblem can be approximated. The errors raised by LagrangeMultipliers are now raised by PostOptimalAnalysis. #372

  • The jacobian computation in MDOChain now uses the minimal jacobians of the disciplines instead of the force_all option of the disciplines linearization. #531

  • The jacobian computation in MDA now uses the minimal jacobians of the disciplines instead of all couplings for the disciplines linearization. #483

  • The Scenario.set_differentiation_method now casts automatically all float default inputs of the disciplines in its formulation to complex when using OptimizationProblem.COMPLEX_STEP and setting the option cast_default_inputs_to_complex to True. The Scenario.set_differentiation_method now casts automatically the current value of the DesignSpace to complex when using OptimizationProblem.COMPLEX_STEP. The MDODiscipline.disciplines is now a property that returns the protected attribute MDODiscipline._disciplines. #520
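
A minimal sketch of this behavior, assuming a standard one-discipline scenario built with the high-level API (the cast_default_inputs_to_complex keyword comes from the entry above; the rest is illustrative):

```python
from gemseo.api import create_design_space, create_discipline, create_scenario

# Hypothetical one-discipline scenario used only to illustrate the call.
disc = create_discipline("AnalyticDiscipline", expressions={"y": "(x - 1)**2"})
design_space = create_design_space()
design_space.add_variable("x", l_b=-2.0, u_b=2.0, value=0.5)
scenario = create_scenario([disc], "DisciplinaryOpt", "y", design_space)

# Switch to complex-step differentiation and let GEMSEO cast the float default
# inputs of the disciplines (and the current design value) to complex.
scenario.set_differentiation_method(
    "complex_step",  # i.e. OptimizationProblem.COMPLEX_STEP
    cast_default_inputs_to_complex=True,
)
```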

  • The methods MDODiscipline.add_differentiated_inputs and MDODiscipline.add_differentiated_outputs now ignore inputs or outputs that are not numeric. #548

  • MLQualityMeasure uses True as the default value for fit_transformers, which means that the Transformer instances attached to the assessed MLAlgo are re-trained on each training subset of the cross-validation partition. MLQualityMeasure.evaluate_kfolds uses True as default value for randomize, which means that the learning samples attached to the assessed MLAlgo are shuffled before building the cross-validation partition. #553

Version 4.1.0 (2022-10-25)#

Added#

  • MakeFunction has a new optional argument names_to_sizes defining the sizes of the input variables. #252

  • DesignSpace.initialize_missing_current_values sets the missing current design values to default ones. OptimizationLibrary initializes the missing design values to default ones before execution. #299

  • Boxplot is a new DatasetPlot to create boxplots from a Dataset. #320

  • Scenario offers a keyword argument maximize_objective, previously passed implicitly with **formulation_options. #350

  • A stopping criterion based on KKT condition residual can now be used for all gradient-based solvers. #372

  • The static N2 chart represents the self-coupled disciplines with blue diagonal blocks. The dynamic N2 chart represents the self-coupled disciplines with colored diagonal blocks. #396

  • SimpleCache can be exported to a Dataset. #404

  • A warning message is logged when an attempt is made to add an observable twice to an OptimizationProblem and the addition is cancelled. #409

  • A SensitivityAnalysis can be saved on the disk (use SensitivityAnalysis.save and SensitivityAnalysis.load). A SensitivityAnalysis can be loaded from the disk with the function load_sensitivity_analysis. #417

  • The PCERegressor has new properties related to the PCE output, namely its PCERegressor.mean, PCERegressor.covariance, PCERegressor.variance and PCERegressor.standard_deviation. #428

  • Timer can be used as a context manager to measure the time spent within a with statement. #431
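
A minimal sketch (the import path and the elapsed_time attribute are assumptions; only the context-manager usage comes from the entry above):

```python
from gemseo.utils.timer import Timer  # assumed import path

# Measure the wall-clock time spent within the with statement.
with Timer() as timer:
    total = sum(i * i for i in range(1_000_000))

print(timer.elapsed_time)  # assumed attribute holding the measured duration
```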

  • Computation of KKT criteria is made optional. #440

  • Bilevel processes now store the local optimization history of sub-scenarios in ScenarioAdapters. #441

  • pretty_str converts an object into a readable string by using str. #442

  • The functions create_linear_approximation and create_quadratic_approximation compute the first- and second-order Taylor polynomials of an MDOFunction. #451
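
A minimal sketch of the linear case (the import paths and the (function, expansion point) signature are assumptions; only the function names come from the entry above):

```python
import numpy as np

from gemseo.core.mdofunctions.mdo_function import MDOFunction  # assumed import path
from gemseo.core.mdofunctions.taylor_polynomials import (  # assumed import path
    create_linear_approximation,
)

# A quadratic MDOFunction with its analytical Jacobian.
f = MDOFunction(lambda x: float(np.sum(x**2)), "f", jac=lambda x: 2 * x)

# First-order Taylor polynomial of f around x0 = (1, 1).
f_lin = create_linear_approximation(f, np.array([1.0, 1.0]))
```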

  • The KKT norm is added to the database when computed. #457

  • MDAs now output the norm of residuals at the end of their execution. #460

  • pretty_str and pretty_repr sort the elements of collections by default. #469
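
A minimal sketch (the import path is an assumption):

```python
from gemseo.utils.string_tools import pretty_repr, pretty_str  # assumed import path

names = {"beta", "alpha", "gamma"}
print(pretty_str(names))   # readable, str-based; elements sorted by default
print(pretty_repr(names))  # unambiguous, repr-based; elements sorted by default
```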

  • The module gemseo.algos.doe.quality offers features to assess the quality of a DOE:

    • DOEQuality assesses the quality of a DOE from DOEMeasures; the qualities can be compared with logical operators.

    • compute_phip_criterion computes the φ_p space-filling criterion.

    • compute_mindist_criterion computes the minimum-distance space-filling criterion.

    • compute_discrepancy computes different discrepancy criteria.

    #477

Fixed#

  • NLOPT_COBYLA and NLOPT_BOBYQA algorithms may end prematurely in the simplex construction phase, caused by a non-exposed and too small default value of the stop_crit_n_x algorithm option. #307

  • The MDANewton MDA no longer has a Jacobi step interleaved between each Newton step. #400

  • The AnalyticDiscipline.default_inputs no longer share the same NumPy array. #406

  • The Lagrange Multipliers computation is fixed for design points close to local optima. #408

  • gemseo-template-grammar-editor now works with both pyside6 and pyside2. #410

  • DesignSpace.read_from_txt can read a CSV file with a current value set at None. #411

  • The argument message passed to DriverLib.init_iter_observer and defining the iteration prefix of the ProgressBar works again; its default value is "...". #416

  • The signatures of MorrisAnalysis, CorrelationAnalysis and SobolAnalysis are now consistent with SensitivityAnalysis. #424

  • When using a unique process, the observables can now be evaluated as many times as the number of calls to DOELibrary.execute. #425

  • The DOELibrary.seed of the DOELibrary is used by default and increments at each execution; pass the integer option seed to DOELibrary.execute to use another one, the time of this execution. #426

  • DesignSpace.get_current_value correctly handles the order of the variable_names in the case of NumPy array outputs. #433

  • The SimpleCache no longer fails when caching an output that is not a Numpy array. #444

  • The first iteration of an MDA was not shown in red by MDA.plot_residual_history. #455

  • The self-organizing map (SOM) post-processing, which was broken by a regression, has been fixed. #465

  • The couplings variable order, used in the MDA class for the adjoint matrix assembly, was not deterministic. #472

  • A multidisciplinary system with a self-coupled discipline can be represented correctly by a coupling graph. #506

Changed#

  • The LoggingContext uses the root logger as default value of logger. #421

  • The GradientSensitivity post-processor now includes an option to compute the gradients at the selected iteration to avoid a crash if they are missing. #434

  • pretty_repr converts an object into an unambiguous string by using repr; use pretty_str for a readable string. #442

  • A global multi-processing manager is now used; this improves the performance of multiprocessing on Windows platforms. #445

  • The graphs produced by OptHistoryView use the same OptHistoryView.xlabel. #449

  • Database.notify_store_listener takes a design vector as input; when it is not provided, the design vector of the last iteration is used. The KKT criterion is computed at each new storage when KKT tolerances are provided. #457

Version 4.0.1 (2022-08-04)#

Added#

  • SimpleCache can be exported to a Dataset. #404

  • A warning message is logged when an attempt is made to add an observable twice to an OptimizationProblem and the addition is cancelled. #409

Fixed#

  • The MDANewton MDA no longer has a Jacobi step interleaved between each Newton step. #400

  • The AnalyticDiscipline.default_inputs no longer share the same NumPy array. #406

  • The Lagrange Multipliers computation is fixed for design points close to local optima. #408

  • gemseo-template-grammar-editor now works with both pyside6 and pyside2. #410

Version 4.0.0 (2022-07-28)#

Added#

  • Concatenater can now scale the inputs before concatenating them. LinearCombination is a new discipline computing the weighted sum of its inputs. Splitter is a new discipline whose outputs are subsets of its unique input. #316

  • The transform module in machine learning now features two power transforms: BoxCox and YeoJohnson. #341

  • A MDODiscipline can now use a pandas DataFrame via its MDODiscipline.local_data. #58

  • Grammars can add namespaces to prefix the element names. #70

  • Disciplines and functions, with tests, for the resolution of 2D topology optimization problems by the SIMP approach were added in gemseo.problems.topo_opt. In the documentation, 3 examples covering the L-Shape, Short Cantilever and MBB structures were also added. #128

  • A TransformerFactory. #154

  • The gemseo.post.radar_chart.RadarChart post-processor plots the constraints at optimum by default and provides access to the database elements from either the first or last index. #159

  • OptimizationResult can store the optimum index. #161

  • Changelog entries are managed by towncrier. #184

  • An OptimizationProblem can be reset either fully or partially (database, current iteration, current design point, number of function calls or functions preprocessing). Database.clear can reset the iteration counter. #188

  • The Database attached to a Scenario can be cleared before running the driver. #193

  • The variables of a DesignSpace can be renamed. #204

  • The optimization history can be exported to a Dataset from a Scenario. #209

  • A DatasetPlot can associate labels to the handled variables for a more meaningful display. #212

  • The bounds of the parameter length scales of a GaussianProcessRegressor can be defined at instantiation. #228

  • Observables included in the exported HDF file. #230

  • ScatterMatrix can plot a limited number of variables. #236

  • The Sobieski's SSBJ use case can now be used with physical variable names. #242

  • The coupled adjoint can now account for disciplines with state residuals. #245

  • Randomized cross-validation can now use a seed for the sake of reproducibility. #246

  • The DriverLib now checks if the optimization or DOE algorithm handles integer variables. #247

  • An MDODiscipline can automatically detect JSON grammar files from a user directory. #253

  • Statistics can now estimate a margin. #255

  • Observables can now be derived when the driver option eval_obs_jac is True (default: False). #256

  • ZvsXY can add series of points above the surface. #259

  • The number and positions of levels of a ZvsXY or Surfaces can be changed. #262

  • ZvsXY or Surfaces can use either isolines or filled surfaces. #263

  • A MDOFunction can now be divided by another MDOFunction or a number. #267
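
A minimal sketch (the import path is an assumption; only the division feature comes from the entry above):

```python
from gemseo.core.mdofunctions.mdo_function import MDOFunction  # assumed import path

f = MDOFunction(lambda x: x**2, "f")
g = MDOFunction(lambda x: x + 1.0, "g")

ratio = f / g     # a new MDOFunction evaluating f(x) / g(x)
half_f = f / 2.0  # division by a number
```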

  • An MLAlgo cannot fit the transformers during the learning stage. #273

  • The KLSVD wrapped from OpenTURNS can now use the stochastic algorithms. #274

  • The lower or upper half of the ScatterMatrix can be hidden. #301

  • A Scenario can use a standardized objective in logs and OptimizationResult. #306

  • Statistics can compute the coefficient of variation. #325

  • Lines can use an abscissa variable and markers. #328

  • The user can now define an OTDiracDistribution with OpenTURNS. #329

  • It is now possible to select the number of processes on which to run an IDF formulation using the option n_processes. #369

Fixed#

  • Ensure that a nested MDAChain is not detected as a self-coupled discipline. #138

  • The method MDOCouplingStructure.plot_n2_chart no longer crashes when the provided disciplines have no couplings. #174

  • The broken link to the GEMSEO logo used in the D3.js-based N2 chart is now repaired. #184

  • An XLSDiscipline no longer crashes when called using multi-threading. #186

  • The option mutation of the "DIFFERENTIAL_EVOLUTION" algorithm now checks the correct expected type. #191

  • SensitivityAnalysis can plot a field with an output name longer than one character. #194

  • Fixed a typo in the monitoring section of the documentation referring to the function create_gantt_chart as create_gannt. #196

  • DOELibrary untransforms unit samples properly in the case of random variables. #197

  • The string representations of the functions of an OptimizationProblem imported from an HDF file do not have bytes problems anymore. #201

  • Fix normalization/unnormalization of functions and disciplines that only contain integer variables. #219

  • Factory.get_options_grammar provides the same content in the returned grammar and the dumped one. #220

  • Dataset uses pandas to read CSV files more efficiently. #221

  • Missing function and gradient values are now replaced with numpy.NaN when exporting a Database to a Dataset. #223

  • The method OptimizationProblem.get_data_by_names no longer crashes when both as_dict and filter_feasible are set to True. #226

  • MorrisAnalysis can again handle multidimensional outputs. #237

  • The XLSDiscipline test run no longer leaves zombie processes in the background after the execution is finished. #238

  • An MDAJacobi inside a DOEScenario no longer causes a crash when a sample raises a ValueError. #239

  • AnalyticDiscipline with absolute value can now be derived. #240

  • The function hash_data_dict returns deterministic hash values, fixing a bug introduced in GEMSEO 3.2.1. #251

  • LagrangeMultipliers are ensured to be non negative. #261

  • A MLQualityMeasure can now be applied to a MLAlgo built from a subset of the input names. #265

  • The given value in DesignSpace.add_variable is now cast to the proper var_type. #278

  • The DisciplineJacApprox.compute_approx_jac method now returns the correct Jacobian when filtering by indices. With this fix, the MDODiscipline.check_jacobian method no longer crashes when using indices. #308

  • An integer design variable can be added with a lower or upper bound explicitly defined as +/-inf. #311

  • A PCERegressor can now be deepcopied before or after the training stage. #340

  • A DOEScenario can now be serialized. #358

  • An AnalyticDiscipline can now be serialized. #359

  • N2JSON now works when a coupling variable has no default value, and displays "n/a" as variable dimension. N2JSON now works when the default value of a coupling variable is an unsized object, e.g. array(1). #388

  • The observables are now computed in parallel when executing a DOEScenario using more than one process. #391

Changed#

  • Fixed Lagrange Multipliers computation for equality active constraints. #345

  • The normalize argument of OptimizationProblem.preprocess_functions is now named is_function_input_normalized. #22

  • The gemseo.post.radar_chart.RadarChart post-processor uses all the constraints by default. #159

  • Updating a dictionary of NumPy arrays from a complex array no longer converts the complex numbers to the original data type except if required. #177

  • The D3.js-based N2 chart can now display the GEMSEO logo offline. #184

  • The default number of components used by a DimensionReduction transformer is based on data and depends on the related technique. #244

  • Classes deriving from MDODiscipline inherit the input and output grammar files of their first parent. #258

  • The parameters of a DatasetPlot are now passed at instantiation. #260

  • An MLQualityMeasure no longer trains an MLAlgo already trained. #264

  • Accessing a unique entry of a Dataset no longer returns 2D arrays but 1D arrays. Accessing a unique feature of a Dataset no longer returns a dictionary of arrays but an array. #270

  • MLQualityMeasure no longer refits the transformers with cross-validation and bootstrap techniques. #273

  • Improved the way xlwings objects are handled when an XLSDiscipline runs in multiprocessing, multithreading, or both. #276

  • A CustomDOE can be used without specifying algo_name whose default value is "CustomDOE" now. #282

  • The XLSDiscipline no longer copies the original Excel file when both copy_xls_at_setstate and recreate_book_at_run are set to True. #287

  • The post-processing algorithms plotting the objective function can now use the standardized objective when OptimizationProblem.use_standardized_objective is True. When post-processing a Scenario, the name of a constraint passed to the OptPostProcessor should be the value of constraint_name passed to Scenario.add_constraint or the value of output_name if None. #302

  • An MDOFormulation now shows an INFO level message when a variable is removed from the design space because it is not an input for any discipline in the formulation. #304

  • It is now possible to carry out a SensitivityAnalysis with multiple disciplines. #310

  • The classes of the regression algorithms are renamed as {Prefix}Regressor. #322

  • The constructor of AutoPyDiscipline now allows the user to select a custom name instead of the name of the Python function. #339

  • It is now possible to serialize an MDOFunction. #342

  • All MDA algos now count their iterations starting from 0. The MDA.residual_history is now a list of normed residuals. The argument figsize in plot_residual_history was renamed to fig_size to be consistent with other OptPostProcessor algos. #343

API Changes#

See Upgrading GEMSEO for more information.

Version 3.2.2 (March 2022)#

Fixed#

  • Cache may not be used because of the way data was hashed.

Version 3.2.1 (November 2021)#

Fixed#

  • Missing package dependency declaration.

Version 3.2.0 (November 2021)#

Added#

  • The matrix linear problem solvers libraries are now handled by a Factory and can then be extended by plugins.

  • MDA warns if it stops when reaching max_mda_iter but before reaching the tolerance criteria.

  • The convergence of an MDA can be logged.

  • Add max line search steps option in scipy L-BFGS-B.

  • An analytical Jacobian can be checked for subsets of input and output names and components.

  • An analytical Jacobian can be checked from a reference file.

  • Scipy global algorithms SHGO and differential evolution now handle non-linear constraints.

  • It is now possible to get the number of constraints not satisfied by a design in an OptimizationProblem.

  • The names of the scalar constraints in an OptimizationProblem can be retrieved as a list.

  • The dimensions of the outputs for functions in an OptimizationProblem are now available as a dictionary.

  • The cross-validation technique can now randomize the samples before dividing them in folds.

  • The Scatter Plot Matrix post processor now allows the user to filter non-feasible points.

  • OptPostProcessor can change the size of the figures with the method execute().

  • SensitivityAnalysis can plot indices with values standardized in [0,1].

  • MorrisAnalysis provides new indices: minimum, maximum and relative standard deviation.

  • MorrisAnalysis can compute indices normalized with the empirical output bounds.

  • A button to change the tagged version of GEMSEO is available on the documentation hosted by Read the Docs.

  • The documentation now includes a link to the gemseo-scilab plugin.

  • ParetoFront: an example of a BiLevel scenario to compute the Pareto front has been added to the examples.

  • A Pareto front computation example using a bi-level scenario has been added to the documentation.

  • The documentation now includes hints on how to use the add_observable method.

  • It is now possible to execute DOEScenarios in parallel on Windows. This feature does not support the use of MemoryFullCache or HDF5Cache on Windows. The progress bar may show duplicated instances during the initialization of each subprocess, in some cases it may also print the conclusion of an iteration ahead of another one that was concluded first. This is a consequence of the pickling process and does not affect the computations of the scenario.

  • A ParameterSpace can be cast into a DesignSpace.

  • Plugins can be discovered via setuptools entry points.

  • A dumped MDODiscipline can now be loaded with the API function import_discipline().
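
A minimal sketch (the file name is hypothetical and the discipline is assumed to have been dumped beforehand; only import_discipline comes from the entry above):

```python
from gemseo.api import import_discipline

# Reload a discipline previously dumped to disk (hypothetical file name).
discipline = import_discipline("my_discipline.pkl")
```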

  • Database has a name used by OptimizationProblem to name the Dataset; this is the name of the corresponding Scenario if any.

  • The grammar type can be passed to the sub-processes through the formulations.

  • Scenario, MDOScenario and DOEScenario now include the argument grammar_type.

  • A GrammarFactory used by MDODiscipline allows to plug new grammars for data checking.

  • The coupling structure can be directly passed to an MDA.

  • The name of an MDOScenarioAdapter can be defined at creation.

  • The AbstractFullCache built from a Dataset has the same name as the dataset.

  • The HDF5 file generated by HDF5Cache has now a version number.

Changed#

  • The IO grammar files of a scenario are located in the same directory as its class.

  • Distribution, ParameterSpace and OpenTURNS now use the logger mainly at debug level.

  • The grammar types "JSON" and "Simple" are replaced by the classes names "JSONGrammar" and "SimpleGrammar".

  • RadarChart uses the scientific notation as default format for the grid levels and allows to change the discretization of the grid.

Fixed#

  • Make OpenTURNS- and pyDOE-based full factorial DOEs work whatever the dimension and number of samples.

  • The NLopt library wrapper now handles user functions that return ndarrays properly.

  • Fix bilevel formulation: the strong couplings were used instead of all the couplings when computing the inputs and outputs of the sub-scenarios adapters. Please note that this bug had an impact on execution performance, but had no adverse effect on the bilevel calculations in previous builds.

  • Bug with the 'sample_x' parameter of the pSeven wrapper.

  • An OptimizationProblem can now normalize and unnormalize gradient with uncertain variables.

  • A SurrogateDiscipline can now be instantiated from an MLAlgo saved without its learning set.

  • Bug with the 'measure_options' arguments of MLAlgoAssessor and MLAlgoSelection.

  • The constraints names are now correctly formed with the minus sign and offset value if any.

  • DesignSpace no longer logs an erroneous warning when unnormalizing an unbounded variable.

  • Resampling-based MLQualityMeasure no longer re-trains the original ML model, but a copy.

  • The computation of a diagonal DOE out of a design space does not crash anymore.

  • OptimizationProblem no longer logs a warning when using the finite-difference method on the design boundary.

  • OpenTURNS options are processed correctly when computing a DOE out of a design space.

  • The Correlations post-processor now sorts labels properly when two or more functions share the same name followed by an underscore.

  • The ParetoFront post-processor now shows the correct labels in the plot axis.

  • The Gantt Chart, Basic History, Constraints History and Scatter Plot Matrix pages in the documentation now render the example plots correctly.

  • Post-processings based on SymLogNorm (matplotlib) now work with Python 3.6.

  • OptHistoryView no longer raises an exception when the Hessian diagonal contains NaN and skips the Hessian plot.

  • Bug with inherited docstrings.

  • The MDO Scenario example subsections are now correctly named.

  • The data hashing strategy used by HDF5Cache has been corrected, old cache files shall have to be converted, see the FAQ.

  • Fix levels option for Full-Factorial doe: now this option is taken into account and enables to build an anisotropic sample.

  • Bug with the MATLAB discipline on Windows.

  • The SurrogateDiscipline can now be serialized.

  • The name used to export an OptimizationProblem to a Dataset is no longer mandatory.

  • Bug in the print_configuration method, the configuration table is now shown properly.

  • Bug with integer elements casted into

  • The image comparison tests in post/dataset no longer leave the generated files when completed.

  • Typo in the function name get_scenario_differenciation.

  • ImportError (backport.unittest_mock) on Python 2.7.

  • Backward compatibility with the legacy logger named "GEMSEO".

  • DOE algorithms now have their own JSON grammar files which corrects the documentation of their options.

  • DOEScenario no longer passes a default number of samples to a DOELibrary for which it is not an option.

  • Issues when a python module prefixed with gemseo_ is in the current working directory.

  • DesignSpace can now be iterated correctly.

  • The Jacobian approximated by the finite-difference method is now correct when computed with respect to uncertain variables.

  • The standard deviation predicted by GaussianProcessRegression is now correctly shaped.

  • The input data to be stored in an HDF5Cache are now hashed with their input names.

  • The hashing strategy used by HDF5Cache no longer considers only the values of the dictionary but also the keys.

Version 3.1.0 (July 2021)#

Changed#

  • Faster JSON schema and dependency graph creation.

  • The Gradient Sensitivity post processor is now able to scale gradients.

  • MemoryFullCache can now use standard memory as well as shared memory.

  • Sellar1 and Sellar2 compute y_1 and y_2 respectively, for consistency of naming.

  • Improve checks of MDA structure.

  • IDF: add option to start at equilibrium with an MDA.

  • Improve doc of GEMSEO study.

  • Unified the driver stopping criteria computed by GEMSEO (xtol_rel, xtol_abs, ftol_rel, ftol_abs).

  • SimpleGrammars supported for all processes (MDOChain, MDAChain etc.).

  • JSONGrammar can be converted to SimpleGrammar.

  • DiscFromExe can now run executables without using the shell.

  • It is now possible to add observable variables to the scenario class.

  • ParetoFront post-processing improvements: legends have been added, it is now possible to hide the non-feasible points in the plots.

  • The Gradient Sensitivity, Variable Influence and Correlations post processors now show variables names instead of hard-coded names.

  • The Correlations post processor now allows the user to select a subset of functions to plot.

  • The Correlations post processor now allows the user to select the figure size.

  • Documentation improvements.

Added#

  • Support for Python 3.9.

  • Support for fastjsonschema up to 2.15.1.

  • Support for h5py up to 3.2.1.

  • Support for numpy up to 1.20.3.

  • Support for pyxdsm up to 2.2.0.

  • Support for scipy up to 1.6.3.

  • Support for tqdm up to 4.61.0.

  • Support for xdsmjs up to 1.0.1.

  • Support for openturns up to 1.16.

  • Support for pandas up to 1.2.4.

  • Support for scikit-learn up to 0.24.2.

  • Support for openpyxl up to 3.0.7.

  • Support for nlopt up to 2.7.0.

  • Constraint aggregation methods (KS, IKS, max, sum).

  • N2: an interactive web N2 chart allowing to expand or collapse the groups of strongly coupled disciplines.

  • Uncertainty: user interface for easy access.

  • Sensitivity analysis: an abstract class with sorting, plotting and comparison methods, with a dedicated factory and new features (correlation coefficients and Morris indices).

  • Sensitivity analysis: examples.

  • Concatenater: a new discipline to concatenate input variables into a single one.

  • Gantt chart generation to visualize the disciplines execution time.

  • Support pSeven algorithms for single-objective optimization.

  • DOELibrary.compute_doe computes a DOE based on a design space.

Fixed#

  • The greatest value that OT_LHSC can generate is now 1 and no longer 0.5.

  • Internally used HDF5 file left open.

  • The Scatter Plot Matrix post processor now plots the correct values for a subset of variables or functions.

  • MDA Jacobian fixes in specific cases (self-coupled, no strong couplings, etc).

  • Strong coupling definition.

  • Bi-level formulation implementation, following the modification of the strong coupling definition.

  • Graphviz package is no longer mandatory.

  • XDSM pdf generation bug.

  • DiscFromExe tests do not fail anymore under Windows, when using a network directory for the pytest base temporary directory.

  • No longer need quotation marks on gemseo-study string option values.

  • The XDSM file is now generated with the right name, as given by outfilename.

  • SellarSystem works now in the Sphinx-Gallery documentation (plot_sellar.py).

Version 3.0.3 (May 2021)#

Changed#

  • Documentation fixes and improvements.

Version 3.0.2 (April 2021)#

Changed#

  • First open source release!

Fixed#

  • Dependency version issue for python 3.8 (pyside2).

Version 3.0.1 (April 2021)#

Fixed#

  • Permission issue with a test.

  • Robustness of the excel discipline wrapper.

Version 3.0.0 (January 2021)#

Added#

  • Licenses materials.

Changed#

  • Renamed gems package to gemseo.

Removed#

  • OpenOPT backend which is no longer maintained and has features overlap with other backends.

Fixed#

  • Better error handling of the study CLI with missing latex tools.

Version 2.0.1 (December 2020)#

Fixed#

Version 2.0.0 (July 2020)#

Added#

  • Support for Python3

  • String encoding: all the strings shall now be encoded in unicode. For Python 2 users, please read carefully the Python2 and Python3 compatibility note to migrate your existing GEMS scripts.

  • Documentation: gallery of examples and tutorials + cheat sheet

  • New conda file to automatically create a Python environment for GEMS under Linux, Windows and Mac OS.

  • ~35% improved performance on Python3

  • pyXDSM to generate latex/PDF XDSM

  • Display XDSM directly in the browser

  • Machine learning capabilities based on scikit-learn, OpenTURNS and scipy: clustering, classification, regression, dimension reduction, data scaling, quality measures, algorithm calibration.

  • Uncertainty package based on OpenTURNS and scipy: distributions, uncertain space, empirical and parametric statistics, Sobol' indices.

  • AbstractFullCache to cache inputs and outputs in memory

  • New Dataset class to store data from numpy arrays, file, Database and AbstractFullCache; Unique interface to machine learning and uncertainty algorithms.

  • Cache post-processing via Dataset

  • Make a discipline from an executable with a GUI

  • Excel-based discipline

  • Prototype an MDO study without writing any code and generate N2 and XDSM diagrams

  • Automatic finite difference step

  • Post-optimal analysis to compute the jacobian of MDO scenarios

  • Pareto front: computation and plot

  • New scalable problem from Tedford and Martins

  • New plugin mechanism for extension of features

Changed#

  • Refactored and much improved documentation

  • Moved to matplotlib 2.x and 3.x

  • Support for scipy 1.x

  • Improved API

  • Improved linear solvers robustness

  • Improved surrogate models based on machine learning capabilities and Dataset class.

  • Improved scalable models

  • Improved BasicHistory: works for design variables also

  • Improved XDSM diagrams for MDAChain

  • Improved BiLevel when no strong coupling is present

  • Improved overall tests

Fixed#

  • Bug in GradientSensitivity

  • Bug in AutoPyDiscipline for multiple returns and non pep8 code

Version 1.3.2 (December 2019)#

Fixed#

  • Bugfix in Discipline while updating data from the cache

Version 1.3.1 (July 2019)#

Added#

  • COBYLA handles NaN values and backtracks when they occur. Requires a specific modification of COBYLA by IRT

  • OptHistoryView and BasicHistory handle NaN values

  • BasicHistory works for design variable values

Changed#

  • Improved error message when missing property in JSONGrammars

  • Improved imports to handle multiple versions of sklearn, pandas and sympy (thanks Damien Guenot)

Fixed#

  • Bug in Caching and Discipline for inouts (Thanks Romain Olivanti)

  • Bug in MDASequential convergence history

Version 1.3.0 (June 2019)#

Added#

  • Refactored and much improved documentation

  • All algorithm, MDA, surrogate and formulation options are now automatically documented in the HTML doc

  • Enhanced API: all MDO scenarios can be fully configured and run from the API

  • AutoPyDiscipline: faster way to wrap a Python function as a discipline

  • Surrogate models: Polynomial Chaos from OpenTurns

  • Surrogate model quality metrics: leave-one-out, Q2, etc.

  • MDAs can handle self-coupled disciplines (inputs that are also outputs)

  • Lagrange Multipliers

  • Multi-starting point optimization as a bi-level scenario using a DOE

  • New aerostructure toy MDO problem

Changed#

  • Bi-Level formulation can now handle black box optimization scenarios, and external MDAs

  • Improve Multiprocessing and multithreading parallelism handling (avoid deadlocks with caches)

  • Improve performance of input/output data checks: JSONGrammars are now 13x faster

  • Improve performance of disciplines execution: avoid memory copies

  • Enhanced scalable discipline: the DOE is now based on a driver and inputs are read from an HDF5 cache, like surrogate models

  • More readable N2 graph

  • Improved logging: fix issue with output files

  • Improved progress bar and adapt units for runtime prediction

  • NLOPT Cobyla: add control for init step of the DOE (rho)

  • Surrogate GPR: add options handling

Version 1.2.1 (August 2018)#

Added#

  • Handle integer variables in DOEs

Changed#

  • Improve performance of normalization/unnormalization

  • Improve x_xstar post processing to display the optimum

Fixed#

  • Issue when using external optimizers in an MDOScenario

Version 1.2.0 (July 2018)#

Added#

  • New API to ease the scenario creation and use by external platforms

  • Mixed parallelism: multithreading / multiprocessing

  • Much improved and unified plugin system with factories for Optimizers, DOE, MDAs, Formulations, Disciplines, Surrogates

  • Surrogate models interfaces

  • MDAJacobi is now much faster thanks to a new acceleration set of methods

Changed#

  • HTML documentation

  • Small improvements

Fixed#

  • Many bugs

Version 1.1.0 (April 2018)#

Added#

  • Mix finite differences in the discipline derivation with analytical Jacobians or complex step to compute the chain rule or adjoint method when not all disciplines' analytical derivatives are available

  • Ability to handle design spaces with integer variables

  • Analytic discipline based on symbolic calculation to easily create disciplines from analytic formulas

  • A scalable surrogate approximation of a discipline to benchmark MDO formulations

  • A HDF cache (= recorder) for disciplines to store all executions on the disk

  • The P-L-BFGS-B algorithm interface, a variant of LBFGSB with preconditioning coded in Python

  • Parallel (multiprocessing and / or multithreading) execution of disciplines and or call to functions

  • New constraints plot visualizations (radar chart) and constraints plot with values

  • Visualization to plot the distance to the best value in log scale ||x-x*||

  • Possibility to choose to normalize the design space or not for each variable

  • IDF improved for weakly coupled problems

  • On the fly backup of the optimization history (HDF5), in "append" mode

  • We can now monitor the convergence on the fly by creating optimization history plots at each iteration

  • Famous N2 plot in the CouplingStructure

  • Sphinx generated documentation in HTML (open doc/index.html), with:

    • GEMS in a nutshell tutorial

    • Discipline integration tutorial

    • Post processing description

    • GEMS architecture description

    • MDO formulations description

    • MDAs

Changed#

  • Improved automatically finding the best point in an optimization history

  • Improved callback functions during optimization / DOE

  • Improved stop criteria for optimization

  • Improved progress bar

  • Improved LGMRES solver for MDAs when using multiple RHS (recycle Krylov subspaces to accelerate convergence)

Fixed#

  • Many bugs

Version 1.0.0 (December 2016)#

Added#

  • Design of Experiment (DOE) capabilities from pyDOE, OpenTURNS or a custom samples set

  • Full differentiation of the process is available:

    • analytical gradient based optimization

    • analytical Newton type coupling solver for MDA (Multi Disciplinary Analyses)

    • analytical derivation of the chains of disciplines (MDOChain) via the chain rule

  • Post processing of optimization history: many plots to view the constraints, objective, design variables

  • More than 10 MDA (coupled problems) solvers available, some gradient-based (quasi-Newton) and hybrid multi-step methods (SequentialMDA)!

  • OptimizationProblem and its solution can be written to disk and post processed afterwards

  • Handling of DOE and optimization algorithm options via JSON schemas

  • Introduced an OptimizationProblem class that is created by the MDOFormulation and passed to an algorithm for resolution

  • Serialization mechanism for MDODiscipline and subclasses (write objects to disk)

  • Intensive testing: 500 tests and 98 % line coverage (excluding third party source)

  • Improved code coverage by tests from 95% to 98% and all modules have a coverage of at least 95%

  • Reduced pylint warnings from 800 to 40!

Changed#

  • Code architecture refactoring for below items

  • Modularized post processing

  • Refactored algorithms part with factories

  • Removed dependency to json_schema_generator library, switched to GENSON (embedded with MIT licence)

  • Moved from JsonSchema Draft 3 to Draft 4 standard

  • Refactored the connection between the functions and the optimizers

  • Refactored MDOScenario

  • Refactored IDF formulation

  • Refactored Bilevel formulation

  • Refactored MDAs and introduced the CouplingStructure class

  • Refactored the DataProcessor for data interface with workflow engines

  • Refactored Sobieski use case to improve code quality

  • Included AGI remarks corrections on code style and best practices

Version 0.1.0 (April 2016)#

Added#

  • Basic MDO formulations: MDF, IDF, Bilevel formulations

  • Some optimization history views for convergence monitoring of the algorithm

  • Optimization algorithms: Scipy, OpenOPT, NLOPT

  • Possible export of the optimization history to the disk

  • Complex step and finite differences optimization

  • Benchmark cases:

    • Sobieski's Supersonic Business Jet MDO case

    • Sellar

    • Propane