As GEMS has been renamed to GEMSEO, upgrading from version 2 to version 3 requires changing all the import statements in your code from

import gems
from gems.x.y import z


to

import gemseo
from gemseo.x.y import z


The API of GEMS 2 was slightly modified with respect to GEMS 1. In particular, for all the supported Python versions, strings must be encoded in unicode, while they were previously encoded in ASCII.

This kind of error:

ERROR - 17:11:09 : Invalid data in : MDOScenario_input, error : data.algo must be string
Traceback (most recent call last):
File "plot_mdo_scenario.py", line 85, in <module>
scenario.execute({"algo": "L-BFGS-B", "max_iter": 100})
File "/home/distracted_user/workspace/gemseo/src/gemseo/core/discipline.py", line 586, in execute
self.check_input_data(input_data)
File "/home/distracted_user/workspace/gemseo/src/gemseo/core/discipline.py", line 1243, in check_input_data
raise InvalidDataException("Invalid input data for: " + self.name)
gemseo.core.grammar.InvalidDataException: Invalid input data for: MDOScenario


is most likely due to the fact that your code has not been migrated to comply with GEMSEO 2. To migrate your code, add the following import at the beginning of every module defining literal strings:

from __future__ import unicode_literals


## Create a simple DOE on a single discipline

Use the DisciplinaryOpt formulation and a DOEScenario. Even for simple DOEs, GEMSEO formulates an optimization problem, so it requires an MDO formulation. The DisciplinaryOpt formulation executes the MDODiscipline alone, or a list of MDODisciplines in the order given by the user. This means that you must specify an objective function. The DOE will not try to minimize it, but it will be reported as the objective in the visualizations.

For more details, we invite you to read Tutorial: How to carry out a trade-off study.

## Create a simple optimization on a single discipline

Use the DisciplinaryOpt formulation and an MDOScenario. The DisciplinaryOpt formulation executes the MDODiscipline alone, or a list of MDODisciplines in the order given by the user.

## Available options for DOE/Optimization

Look at the JSON schema named after the library or algorithm, in the gemseo/algos/doe/options or gemseo/algos/opt/options packages. The options and their meanings are also documented in the library wrappers (for instance gemseo.algos.opt.lib_scipy.ScipyOpt._get_options()).

## Coupling a simulation software to GEMSEO

We invite you to discover all the steps in Tutorial: How to solve an MDO problem.

## What are JSON schemas?

JSON schemas describe the format (i.e. the structure) of JSON files, in a similar way to how XML schemas define the format of XML files. JSON schemas come along with validators that check whether a JSON data structure is valid against a JSON schema; this is what GEMSEO's grammars use.

We invite you to read our documentation: Input and output description: grammars.

All details about the JSON schema specification can be found here: Understanding JSON schemas.
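For illustration, here is a hypothetical schema in the spirit of the MDOScenario input grammar quoted earlier; it declares algo as a required string, which is exactly the kind of constraint behind the "data.algo must be string" message:

```json
{
  "$schema": "http://json-schema.org/draft-04/schema#",
  "type": "object",
  "properties": {
    "algo": {"type": "string"},
    "max_iter": {"type": "integer", "minimum": 1}
  },
  "required": ["algo"]
}
```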

## Store persistent data produced by disciplines

Use HDF5 caches to persist the discipline outputs to disk.

We invite you to read our documentation: Caching and recording discipline data.

## Error when using a HDF5 cache

In GEMSEO 3.2.0, the storage of the data hashes in the HDF5 cache has been fixed, and previous cache files are no longer valid. If you get an error like The file cache.h5 cannot be used because it has no file format version: see HDF5Cache.update_file_format for converting it., please use HDF5Cache.update_file_format() to update the format of the file and fix the data hashes.

## GEMSEO fails with openturns

OpenTURNS implicitly requires the library libnsl, which may not be installed by default on recent Linux distributions. Under CentOS, for instance, install it with:

sudo yum install libnsl


## Some GEMSEO tests fail under Windows without any reason

The user may face some issues with the latest version of Windows 10, build 2004, while running the tests. The errors occur deep in either NumPy or SciPy, while performing low-level linear algebra operations. The root cause of this issue is well known: an incompatibility between Windows 10, build 2004, and some versions of OpenBLAS. GEMSEO users shall not encounter any issue in production. Otherwise, please contact us to get mitigation instructions.

## Parallel execution limitations on Windows

When running parallel execution tasks on Windows, the MemoryFullCache and HDF5Cache features do not work properly. This is due to the way subprocesses are spawned on this platform. The method DOEScenario.set_optimization_history_backup() is recommended as an alternative.

The progress bar may show duplicated instances during the initialization of each subprocess, and in some cases it may print the conclusion of an iteration ahead of another one that actually finished first. This is a consequence of the pickling process and does not affect the computations of the scenario.