# Regression¶

## Regression model¶

The regression module implements regression algorithms, whose goal is to find relationships between continuous input and output variables. Once fitted to a learning set, a regression algorithm can predict the output values of new input data.

A regression algorithm identifies a function $$f: \mathbb{R}^{n_{\textrm{inputs}}} \to \mathbb{R}^{n_{\textrm{outputs}}}$$. Given an input point $$x \in \mathbb{R}^{n_{\textrm{inputs}}}$$, the predict method of the regression algorithm returns the output point $$y = f(x) \in \mathbb{R}^{n_{\textrm{outputs}}}$$. See supervised for more information.
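As a minimal illustration of the fit/predict workflow described above (a generic sketch, not the library's implementation; the learning set and the linear model are hypothetical), the following fits a least-squares linear regression to a small learning set and evaluates it at a new input point:

```python
import numpy as np

# Hypothetical learning set: 20 samples, 2 inputs, 1 output,
# generated from a known linear function y = 3*x1 - 2*x2 + 1.
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(20, 2))       # inputs in R^2
y = 3.0 * X[:, [0]] - 2.0 * X[:, [1]] + 1.0    # outputs in R^1

# "Fit": solve the least-squares problem for the weights and intercept.
A = np.hstack([X, np.ones((20, 1))])           # append an intercept column
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

def predict(x):
    """Return y = f(x) for a new input point x in R^2."""
    return np.hstack([x, 1.0]) @ coef

value = predict(np.array([0.5, 0.5]))          # close to 3*0.5 - 2*0.5 + 1 = 1.5
```

Here the "learning" step is an exact least-squares solve; actual regression models replace it with their own fitting procedure, but expose the same fit-then-predict interface.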

Wherever possible, a regression algorithm should also be able to compute the Jacobian matrix of the function it has learned to represent. Given an input point $$x \in \mathbb{R}^{n_{\textrm{inputs}}}$$, the Jacobian predict method of the regression algorithm should thus return the matrix

$$J_f(x) = \frac{\partial f}{\partial x} = \begin{pmatrix} \frac{\partial f_1}{\partial x_1} & \cdots & \frac{\partial f_1}{\partial x_{n_{\textrm{inputs}}}}\\ \vdots & \ddots & \vdots\\ \frac{\partial f_{n_{\textrm{outputs}}}}{\partial x_1} & \cdots & \frac{\partial f_{n_{\textrm{outputs}}}}{\partial x_{n_{\textrm{inputs}}}} \end{pmatrix} \in \mathbb{R}^{n_{\textrm{outputs}}\times n_{\textrm{inputs}}}.$$
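To make the shape convention concrete (a sketch only; the function `f` and the finite-difference helper are hypothetical, and real models typically return an analytic Jacobian rather than a numerical one), the following approximates $$J_f(x)$$ by central differences for a toy function $$f: \mathbb{R}^2 \to \mathbb{R}^2$$:

```python
import numpy as np

def f(x):
    # Hypothetical learned function f: R^2 -> R^2.
    return np.array([3.0 * x[0] - 2.0 * x[1], x[0] * x[1]])

def predict_jacobian(x, eps=1e-6):
    """Approximate J_f(x), of shape (n_outputs, n_inputs), by central differences."""
    cols = []
    for i in range(x.size):
        step = np.zeros(x.size)
        step[i] = eps
        # Column i holds the partial derivatives of all outputs w.r.t. input i.
        cols.append((f(x + step) - f(x - step)) / (2.0 * eps))
    return np.column_stack(cols)

x = np.array([1.0, 2.0])
J = predict_jacobian(x)   # analytically: [[3, -2], [x[1], x[0]]] = [[3, -2], [2, 1]]
```

Row $$i$$ of the returned matrix differentiates output $$f_i$$, and column $$j$$ differentiates with respect to input $$x_j$$, matching the $$n_{\textrm{outputs}} \times n_{\textrm{inputs}}$$ layout above.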

This concept is implemented through the MLRegressionAlgo class, which inherits from the MLSupervisedAlgo class.

Available regression models are: