# Changelog

All notable changes to this project will be documented here.

The format is based on Keep a Changelog and this project adheres to Semantic Versioning.
## Version 3.0.2 (April 2021)

### Changed

- First open source release!

### Fixed

- Dependency version issue for Python 3.8 (PySide2).
## Version 3.0.1 (April 2021)

### Fixed

- Permission issue with a test.
- Robustness of the Excel discipline wrapper.
## Version 3.0.0 (January 2021)

### Added

- License materials.

### Changed

- Renamed the gems package to gemseo.

### Removed

- The OpenOPT backend, which is no longer maintained and whose features overlap with other backends.

### Fixed

- Better error handling in the study CLI when LaTeX tools are missing.
## Version 2.0.1 (December 2020)

### Fixed

- Improper configuration of the logger in the MDAChain test, leading to GEMS crashes if the user does not have write permission on the GEMS installation directory.
- Maximum versions of h5py and OpenTURNS defined in the environment and configuration files to prevent incorrect environments due to API incompatibilities.
- Maximum version of numpy defined to avoid an fmod/OpenBLAS bug with Windows 10 2004 (https://developercommunity.visualstudio.com/content/problem/1207405/fmod-after-an-update-to-windows-2004-is-causing-a.html).
## Version 2.0.0 (July 2020)

### Added

- Support for Python 3.
- String encoding: all strings must now be Unicode. Python 2 users should carefully read the Python 2 and Python 3 compatibility note to migrate their existing GEMS scripts.
- Documentation: gallery of examples and tutorials, plus a cheat sheet.
- New conda file to automatically create a Python environment for GEMS under Linux, Windows and macOS.
- ~35% improved performance on Python 3.
- pyXDSM to generate LaTeX/PDF XDSM diagrams.
- Display XDSM diagrams directly in the browser.
- Machine learning capabilities based on scikit-learn, OpenTURNS and scipy: clustering, classification, regression, dimension reduction, data scaling, quality measures, algorithm calibration.
- Uncertainty package based on OpenTURNS and scipy: distributions, uncertain space, empirical and parametric statistics, Sobol' indices.
- AbstractFullCache to cache inputs and outputs in memory.
- New Dataset class to store data from numpy arrays, files, Database and AbstractFullCache; unique interface to the machine learning and uncertainty algorithms.
- Cache post-processing via Dataset.
- Make a discipline from an executable with a GUI.
- Excel-based discipline.
- Prototype an MDO study without writing any code, and generate N2 and XDSM diagrams.
- Automatic finite difference step.
- Post-optimal analysis to compute the Jacobian of MDO scenarios.
- Pareto front: computation and plot.
- New scalable problem from Tedford and Martins.
- New plugin mechanism for extension of features.
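The "automatic finite difference step" entry above refers to choosing the perturbation size without user input. A minimal sketch of the classic rule of thumb (step proportional to the square root of machine epsilon, scaled by |x|); the helper names are illustrative, not the GEMS API:

```python
import math
import sys

def auto_fd_step(x: float) -> float:
    """Rule-of-thumb forward-difference step: sqrt(machine eps), scaled by |x|."""
    return math.sqrt(sys.float_info.epsilon) * max(abs(x), 1.0)

def forward_diff(f, x: float) -> float:
    """Approximate f'(x) with an automatically sized forward-difference step."""
    h = auto_fd_step(x)
    return (f(x + h) - f(x)) / h
```

This step size roughly balances truncation error (which shrinks with h) against floating-point round-off (which grows as h shrinks); for example, `forward_diff(math.sin, 1.0)` matches `math.cos(1.0)` to about 8 digits.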
### Changed

- Refactored and much improved documentation.
- Moved to matplotlib 2.x and 3.x.
- Support for scipy 1.x.
- Improved API.
- Improved linear solver robustness.
- Improved surrogate models based on the machine learning capabilities and the Dataset class.
- Improved scalable models.
- Improved BasicHistory: it now also works for design variables.
- Improved XDSM diagrams for MDAChain.
- Improved BiLevel when no strong coupling is present.
- Improved overall tests.

### Fixed

- Bug in GradientSensitivity.
- Bug in AutoPyDiscipline for multiple returns and non-PEP8 code.
## Version 1.3.1 (July 2019)

### Added

- COBYLA handles NaN values and backtracks when they occur. Requires a specific modification of COBYLA by IRT.
- OptHistoryView and BasicHistory handle NaN values.
- BasicHistory works for design variable values.

### Changed

- Improved error message when a property is missing in JSONGrammars.
- Improved imports to handle multiple versions of sklearn, pandas and sympy (thanks Damien Guenot).

### Fixed

- Bug in caching and Discipline for inouts (thanks Romain Olivanti).
- Bug in MDASequential convergence history.
## Version 1.3.0 (June 2019)

### Added

- Refactored and much improved documentation.
- All algorithm, MDA, surrogate and formulation options are now automatically documented in the HTML doc.
- Enhanced API: all MDO scenarios can be fully configured and run from the API.
- AutoPyDiscipline: a faster way to wrap a Python function as a discipline.
- Surrogate models: polynomial chaos from OpenTURNS.
- Surrogate model quality metrics: leave-one-out, Q2, etc.
- MDAs can handle self-coupled disciplines (inputs that are also outputs).
- Lagrange multipliers.
- Multi-start optimization as a bi-level scenario using a DOE.
- New aerostructure toy MDO problem.
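A self-coupled discipline (an output that is also one of its own inputs) turns the MDA into a fixed-point problem. A toy stand-in for the idea, using plain fixed-point iteration rather than the GEMS MDA classes; the coupling equation is invented for illustration:

```python
def solve_self_coupling(x: float, tol: float = 1e-10, max_iter: int = 100) -> float:
    """Fixed-point iteration for a self-coupled output y = 0.5 * y + x.

    The output y feeds back into its own input; the exact fixed
    point is y = 2 * x, which the iteration converges to because
    the update is a contraction (factor 0.5).
    """
    y = 0.0
    for _ in range(max_iter):
        y_new = 0.5 * y + x
        if abs(y_new - y) < tol:
            return y_new
        y = y_new
    return y
```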
### Changed

- The bi-level formulation can now handle black-box optimization scenarios and external MDAs.
- Improved multiprocessing and multithreading parallelism handling (avoid deadlocks with caches).
- Improved performance of input/output data checks; JSONGrammars are 13x faster.
- Improved performance of discipline execution: avoid memory copies.
- Enhanced scalable discipline: the DOE is now based on a driver, and inputs are read from an HDF5 cache like surrogate models.
- More readable N2 graph.
- Improved logging: fixed an issue with output files.
- Improved progress bar and adapted units for runtime prediction.
- NLOPT COBYLA: added control for the initial step of the DOE (rho).
- Surrogate GPR: added options handling.
## Version 1.2.1 (August 2018)

### Added

- Handle integer variables in DOEs.

### Changed

- Improved performance of normalization/unnormalization.
- Improved the x_xstar post-processing to display the optimum.

### Fixed

- Issue when using external optimizers in an MDOScenario.
## Version 1.2.0 (July 2018)

### Added

- New API to ease scenario creation and use by external platforms.
- Mixed multithreading/multiprocessing parallelism.
- Much improved and unified plugin system with factories for optimizers, DOEs, MDAs, formulations, disciplines and surrogates.
- Surrogate model interfaces.
- MDAJacobi is now much faster thanks to a new set of acceleration methods.
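A Jacobi MDA updates every discipline from the previous iterate only, so all updates can run in parallel. A minimal two-discipline sketch of the basic scheme (the linear toy couplings are invented for illustration, and the acceleration methods mentioned above are not shown):

```python
def jacobi_mda(max_iter: int = 200, tol: float = 1e-12):
    """Jacobi iteration on two coupled equations:

        y1 = 0.5 * y2 + 1
        y2 = 0.5 * y1 + 1

    whose solution is y1 = y2 = 2. Both updates read only the
    previous iterate, so they are independent of each other --
    the property a Jacobi MDA exploits for parallelism.
    """
    y1, y2 = 0.0, 0.0
    for _ in range(max_iter):
        y1_new = 0.5 * y2 + 1.0
        y2_new = 0.5 * y1 + 1.0
        if abs(y1_new - y1) + abs(y2_new - y2) < tol:
            return y1_new, y2_new
        y1, y2 = y1_new, y2_new
    return y1, y2
```

A Gauss-Seidel MDA would instead feed `y1_new` into the `y2` update immediately, which usually converges in fewer iterations but serializes the disciplines.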
### Changed

- HTML documentation.
- Small improvements.

### Fixed

- Many bugs.
## Version 1.1.0 (April 2018)

### Added

- Mix finite differences in the discipline derivation with analytical Jacobians or complex step, to compute the chain rule or adjoint method when not all disciplines' analytical derivatives are available.
- Ability to handle design spaces with integer variables.
- Analytic discipline based on symbolic calculation to easily create disciplines from analytic formulas.
- A scalable surrogate approximation of a discipline to benchmark MDO formulations.
- An HDF cache (= recorder) for disciplines to store all executions on disk.
- The P-L-BFGS-B algorithm interface, a variant of L-BFGS-B with preconditioning, coded in Python.
- Parallel (multiprocessing and/or multithreading) execution of disciplines and/or calls to functions.
- New constraint plot visualizations (radar chart) and constraint plots with values.
- Visualization to plot the distance to the best value, ||x-x*||, in log scale.
- Possibility to choose, for each variable, whether to normalize the design space or not.
- IDF improved for weakly coupled problems.
- On-the-fly backup of the optimization history (HDF5), in "append" mode.
- The convergence can now be monitored on the fly by creating optimization history plots at each iteration.
- The famous N2 plot in the CouplingStructure.
- Sphinx-generated documentation in HTML (open doc/index.html), with:
  - GEMS in a nutshell tutorial
  - Discipline integration tutorial
  - Post-processing description
  - GEMS architecture description
  - MDO formulations description
  - MDAs
### Changed

- Improved the automatic detection of the best point in an optimization history.
- Improved callback functions during optimization/DOE.
- Improved stop criteria for optimization.
- Improved progress bar.
- Improved the LGMRES solver for MDAs when using multiple right-hand sides (recycle Krylov subspaces to accelerate convergence).

### Fixed

- Many bugs.
## Version 1.0.0 (December 2016)

### Added

- Design of experiments (DOE) capabilities from pyDOE, OpenTURNS or a custom sample set.
- Full differentiation of the process is available:
  - analytical gradient-based optimization
  - analytical Newton-type coupling solver for MDA (multidisciplinary analyses)
  - analytical derivation of chains of disciplines (MDOChain) via the chain rule
- Post-processing of the optimization history: many plots to view the constraints, objective and design variables.
- More than 10 MDA (coupled problem) solvers available, some gradient-based (quasi-Newton) and hybrid multi-step methods (MDASequential)!
- OptimizationProblem and its solution can be written to disk and post-processed afterwards.
- Handling of DOE and optimization algorithm options via JSON schemas.
- Introduced an OptimizationProblem class that is created by the MDOFormulation and passed to an algorithm for resolution.
- Serialization mechanism for MDODiscipline and its subclasses (write objects to disk).
- Intensive testing: 500 tests and 98% line coverage (excluding third-party sources).
- Improved code coverage by tests from 95% to 98%; all modules have a coverage of at least 95%.
- Reduced pylint warnings from 800 to 40!
### Changed

- Code architecture refactoring for the items below.
- Modularized post-processing.
- Refactored the algorithms part with factories.
- Removed the dependency on the json_schema_generator library; switched to GENSON (embedded, MIT license).
- Moved from the JSON Schema Draft 3 to the Draft 4 standard.
- Refactored the connection between the functions and the optimizers.
- Refactored MDOScenario.
- Refactored the IDF formulation.
- Refactored the BiLevel formulation.
- Refactored the MDAs and introduced the CouplingStructure class.
- Refactored the DataProcessor for data interfacing with workflow engines.
- Included AGI remarks and corrections on code style and best practices.
- Refactored the Sobieski use case to improve code quality.
## Version 0.1.0 (April 2016)

### Added

- Basic MDO formulations: MDF, IDF and bi-level formulations.
- Some optimization history views for convergence monitoring of the algorithm.
- Optimization algorithms: SciPy, OpenOPT, NLopt.
- Possible export of the optimization history to disk.
- Complex-step and finite-difference derivatives for optimization.
- Benchmark cases:
  - Sobieski's Supersonic Business Jet MDO case
  - Sellar
  - Propane
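The complex-step derivatives listed above evaluate the function at a complex-perturbed point. Unlike finite differences there is no subtraction of nearly equal numbers, so an extremely small step gives machine-precision accuracy. A generic sketch of the technique, not the GEMS implementation:

```python
import cmath

def complex_step_derivative(f, x: float, h: float = 1e-30) -> float:
    """Approximate f'(x) as Im(f(x + i*h)) / h.

    The imaginary part carries the first-order term of the Taylor
    expansion, with no cancellation error, so h can be tiny; f must
    accept complex arguments.
    """
    return f(complex(x, h)).imag / h
```

For example, `complex_step_derivative(cmath.sin, 1.0)` reproduces cos(1) to full double precision, whereas a finite difference with the same step would return 0 due to round-off.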