error_estimators module¶
Error estimators for computing derivatives.
- gemseo.utils.derivatives.error_estimators.compute_best_step(f_p, f_x, f_m, step, epsilon_mach=2.220446049250313e-16)[source]¶
Compute the optimal step for finite differentiation.
This applies to a forward first-order finite-difference gradient approximation.
It requires a prior evaluation of the perturbed function values.
The optimal step is reached when the truncation error (from cutting off the Taylor expansion) and the numerical cancellation error (round-off when computing \(f(x+\delta_x)-f(x)\)) are equal.
See also
https://en.wikipedia.org/wiki/Numerical_differentiation and Numerical Algorithms and Digital Representation, Knut Morken, Chapter 11, “Numerical Differentiation”
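The trade-off between the two error sources can be observed numerically. The sketch below is a standalone illustration (not part of this module): it differentiates \(f(x)=e^x\) at \(x=1\) with a forward difference and shows the error first shrinking with the step (truncation dominates) and then growing again (cancellation dominates):

```python
import numpy as np

f = np.exp  # f'(x) = exp(x), so the exact derivative at x = 1 is e
x = 1.0
exact = np.exp(x)

# Forward first-order finite difference for a range of steps.
for step in (1e-1, 1e-5, 1e-8, 1e-12):
    approx = (f(x + step) - f(x)) / step
    print(f"step={step:.0e}  error={abs(approx - exact):.2e}")
```

For a very large step the truncation error dominates; for a very small step the subtraction \(f(x+\delta_x)-f(x)\) loses most of its significant digits and the error grows again.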
- Parameters:
f_p (ndarray) – The value of the function \(f\) at the next step \(x+\delta_x\).
f_x (ndarray) – The value of the function \(f\) at the current step \(x\).
f_m (ndarray) – The value of the function \(f\) at the previous step \(x-\delta_x\).
step (float) – The differentiation step \(\delta_x\).
epsilon_mach (float) – The machine epsilon.
By default it is set to 2.220446049250313e-16.
- Returns:
The estimation of the truncation error, or None if the Hessian approximation is too small to compute the optimal step; the estimation of the cancellation error, or None under the same condition; and the optimal step.
- Return type:
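The estimate can be sketched from the balance described above (an illustration, not the library's exact implementation; the function name and the smallness threshold are assumptions): equating the truncation error \(|f''|\,\delta_x/2\) with the cancellation error \(2\epsilon_{mach}|f(x)|/\delta_x\) gives \(\delta_x^{opt} = 2\sqrt{\epsilon_{mach}|f(x)|/|f''|}\):

```python
import numpy as np

EPS = 2.220446049250313e-16  # double-precision machine epsilon


def best_step_sketch(f_p, f_x, f_m, step, epsilon_mach=EPS):
    """Illustrative optimal-step estimate for forward finite differences.

    Balances the truncation error |f''| * step / 2 against the
    cancellation error 2 * epsilon_mach * |f(x)| / step.
    """
    # Central second-difference approximation of f''(x).
    hess = (f_p - 2.0 * f_x + f_m) / step**2
    if abs(hess) < 1e-24:  # illustrative threshold: Hessian too small
        return None
    return 2.0 * np.sqrt(epsilon_mach * abs(f_x) / abs(hess))
```

For \(f(x)=e^x\) at \(x=1\), where \(|f''| = |f(x)|\), the estimate reduces to \(2\sqrt{\epsilon_{mach}} \approx 3\times 10^{-8}\).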
- gemseo.utils.derivatives.error_estimators.compute_cancellation_error(f_x, step, epsilon_mach=2.220446049250313e-16)[source]¶
Compute the cancellation error.
This is the round-off when doing \(f(x+\delta_x)-f(x)\).
- Parameters:
f_x (ndarray) – The value of the function \(f\) at the current step \(x\).
step (float) – The differentiation step \(\delta_x\).
epsilon_mach (float) – The machine epsilon. By default it is set to 2.220446049250313e-16.
- Returns:
The cancellation error.
- Return type:
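The estimate follows the standard round-off bound (the exact constant used by the library is an assumption): the subtraction \(f(x+\delta_x)-f(x)\) loses roughly \(\epsilon_{mach}|f(x)|\) in absolute terms, so dividing by the step gives an error of order \(2\epsilon_{mach}|f(x)|/\delta_x\) on the derivative:

```python
EPS = 2.220446049250313e-16  # double-precision machine epsilon


def cancellation_error_sketch(f_x, step, epsilon_mach=EPS):
    """Round-off error on (f(x+dx) - f(x)) / dx, up to a constant factor."""
    return 2.0 * epsilon_mach * abs(f_x) / step
```

Note that the bound grows as the step shrinks, which is why an arbitrarily small step does not improve the derivative estimate.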
- gemseo.utils.derivatives.error_estimators.compute_hessian_approximation(f_p, f_x, f_m, step)[source]¶
Compute the second-order approximation of the Hessian matrix \(d^2f/dx^2\).
- Parameters:
f_p (ndarray) – The value of the function \(f\) at the next step \(x+\delta_x\).
f_x (ndarray) – The value of the function \(f\) at the current step \(x\).
f_m (ndarray) – The value of the function \(f\) at the previous step \(x-\delta_x\).
step (float) – The differentiation step \(\delta_x\).
- Returns:
The approximation of the Hessian matrix at the current step \(x\).
- Return type:
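The second-order estimate is the standard central second difference; a minimal sketch under that assumption (the function name is illustrative):

```python
def hessian_approximation_sketch(f_p, f_x, f_m, step):
    """Central second difference: f''(x) ~ (f(x+dx) - 2 f(x) + f(x-dx)) / dx**2."""
    return (f_p - 2.0 * f_x + f_m) / step**2
```

For example, for \(f(x)=x^3\) at \(x=1\) with \(\delta_x=0.01\), the sketch recovers \(f''(1)=6\) up to floating-point rounding, since the leading error term of the central second difference vanishes for a cubic.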