report module

Generation of a benchmarking report.

class gemseo_benchmark.report.report.DirectoryName(value)[source]

Bases: Enum

The name of a report directory.

BUILD = '_build'
IMAGES = 'images'
PROBLEMS = 'problems'
RESULTS = 'results'
class gemseo_benchmark.report.report.FileName(value)[source]

Bases: Enum

The name of a report file.

ALGORITHMS = 'algorithms.rst'
ALGORITHMS_CONFIGURATIONS_GROUP = 'algorithms_configurations_group.rst'
DATA_PROFILE = 'data_profile.png'
HISTORIES = 'histories.png'
INDEX = 'index.rst'
PROBLEM = 'problem.rst'
PROBLEMS_LIST = 'problems_list.rst'
RESULTS = 'results.rst'
SUB_RESULTS = 'sub_results.rst'
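
The members of these enumerations expose the literal directory and file names through their value attribute. A minimal sketch, using only the names listed above:

    from gemseo_benchmark.report.report import DirectoryName, FileName

    # The value of each member is the literal name used in the report tree.
    assert DirectoryName.RESULTS.value == "results"
    assert FileName.INDEX.value == "index.rst"
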
class gemseo_benchmark.report.report.Report(root_directory_path, algos_configurations_groups, problems_groups, histories_paths, custom_algos_descriptions=None, max_eval_number_per_group=None)[source]

Bases: object

A benchmarking report.

Parameters:
  • root_directory_path (str | Path) – The path to the root directory of the report.

  • algos_configurations_groups (Iterable[AlgorithmsConfigurations]) – The groups of algorithms configurations.

  • problems_groups (Iterable[ProblemsGroup]) – The groups of reference problems.

  • histories_paths (Results) – The paths to the reference histories for each algorithm and reference problem.

  • custom_algos_descriptions (Mapping[str, str] | None) – Custom descriptions of the algorithms, to be printed in the report instead of the default descriptions provided by GEMSEO.

  • max_eval_number_per_group (dict[str, int] | None) – The maximum numbers of evaluations to be displayed on the graphs of each group. The keys are the group names and the values are the maximum numbers of evaluations for the graphs of the group. If None, all the evaluations are displayed. If the key of a group is missing, all the evaluations are displayed for that group.

Raises:

ValueError – If an algorithm has no associated histories.
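
A minimal construction sketch. The pre-built algorithms_configurations, problems_group and results objects (instances of AlgorithmsConfigurations, ProblemsGroup and Results) and the group name used as a key are assumptions for illustration only:

    from gemseo_benchmark.report.report import Report

    # Assumed to be built elsewhere (hypothetical objects):
    #   algorithms_configurations: AlgorithmsConfigurations
    #   problems_group: ProblemsGroup
    #   results: Results
    report = Report(
        "benchmark_report",           # root directory of the report
        [algorithms_configurations],  # groups of algorithms configurations
        [problems_group],             # groups of reference problems
        results,                      # reference histories for each algorithm and problem
        max_eval_number_per_group={"Rosenbrock problems": 100},  # hypothetical group name
    )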

generate(to_html=True, to_pdf=False, infeasibility_tolerance=0.0, plot_all_histories=True, use_log_scale=False)[source]

Generate the benchmarking report.

Parameters:
  • to_html (bool) –

    Whether to generate the report in HTML format.

    By default it is set to True.

  • to_pdf (bool) –

    Whether to generate the report in PDF format.

    By default it is set to False.

  • infeasibility_tolerance (float) –

    The tolerance on the infeasibility measure.

    By default it is set to 0.0.

  • plot_all_histories (bool) –

    Whether to plot all the performance histories.

    By default it is set to True.

  • use_log_scale (bool) –

    Whether to use a logarithmic scale on the value axis.

    By default it is set to False.

Return type:

None
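
A minimal call sketch, reusing the report object built above; the keyword values shown are the documented defaults:

    # Generate the report in HTML only, with the documented default settings.
    report.generate(
        to_html=True,
        to_pdf=False,
        infeasibility_tolerance=0.0,
        plot_all_histories=True,
        use_log_scale=False,
    )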