gemseo.mlearning.classification.algos.knn module#

The k-nearest neighbors algorithm for classification.

The k-nearest neighbor classification algorithm predicts the output class of a new input point as the majority class among its k nearest neighbors in the training set (majority voting). The algorithm can also estimate the probability of belonging to each class by counting the occurrences of that class within the k nearest neighbors.

Let \((x_i)_{i=1,\cdots,n_{\text{samples}}}\in \mathbb{R}^{n_{\text{samples}}\times n_{\text{inputs}}}\) and \((y_i)_{i=1,\cdots,n_{\text{samples}}}\in \{1,\cdots,n_{\text{classes}}\}^{n_{\text{samples}}}\) denote the input and output training data respectively.

The procedure for predicting the class of a new input point \(x\in \mathbb{R}^{n_{\text{inputs}}}\) is the following:

Let \(i_1(x), \cdots, i_{n_{\text{samples}}}(x)\) be the indices of the input training points sorted by distance to the prediction point \(x\), i.e.

\[\|x-x_{i_1(x)}\| \leq \cdots \leq \|x-x_{i_{n_{\text{samples}}}(x)}\|.\]

The ordered indices may be formally determined through the inductive formula

\[i_p(x) = \underset{i\in I_p(x)}{\operatorname{argmin}}\|x-x_i\|,\quad p=1,\cdots,n_{\text{samples}}\]

where

\[\begin{split}I_1(x) = \{1,\cdots,n_{\text{samples}}\}\\ I_{p+1}(x) = I_p(x)\setminus \{i_p(x)\},\quad p=1,\cdots,n_{\text{samples}}-1,\end{split}\]

that is

\[I_p(x) = \{1,\cdots,n_{\text{samples}}\}\setminus \{i_1(x),\cdots,i_{p-1}(x)\}.\]

Then, denoting by \(\operatorname{mode}(\cdot)\) the mode operator, i.e. the operator returning the most frequent element, we may define the prediction operator as the mode of the output classes associated with the first \(k\) indices, i.e. the classes of the \(k\) nearest neighbors of \(x\):

\[f(x) = \operatorname{mode}(y_{i_1(x)}, \cdots, y_{i_k(x)}).\]
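As an illustration, the following is a minimal NumPy sketch of this prediction rule, written from scratch for clarity; it is not the gemseo implementation (which delegates to scikit-learn, as noted below), and the function and variable names are illustrative only:

    import numpy as np


    def knn_predict(x, x_train, y_train, k):
        """Predict the class of x as the mode of its k nearest neighbors."""
        # Sort the training points by Euclidean distance to x: i_1(x), ..., i_n(x).
        distances = np.linalg.norm(x_train - x, axis=1)
        nearest = np.argsort(distances)[:k]  # keep the first k indices
        # f(x) = mode(y_{i_1(x)}, ..., y_{i_k(x)}): the most frequent class wins.
        classes, counts = np.unique(y_train[nearest], return_counts=True)
        return classes[np.argmax(counts)]


    def knn_predict_proba(x, x_train, y_train, k):
        """Estimate class probabilities as occurrence frequencies among the k neighbors."""
        distances = np.linalg.norm(x_train - x, axis=1)
        nearest = np.argsort(distances)[:k]
        return np.array([(y_train[nearest] == c).mean() for c in np.unique(y_train)])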

This concept is implemented through the KNNClassifier class, which inherits from the BaseClassifier class.

Dependence#

The classifier relies on the KNeighborsClassifier class of the scikit-learn library.
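For reference, the wrapped estimator can be used directly as follows; this is a standalone scikit-learn sketch with toy placeholder data, not the gemseo wrapper itself:

    import numpy as np
    from sklearn.neighbors import KNeighborsClassifier

    # Toy training data: six 2-dimensional input points and their classes (placeholder values).
    x_train = np.array([[0.0, 0.0], [0.1, 0.2], [0.2, 0.1], [1.0, 1.0], [1.1, 0.9], [0.9, 1.1]])
    y_train = np.array([0, 0, 0, 1, 1, 1])

    # n_neighbors plays the role of k.
    estimator = KNeighborsClassifier(n_neighbors=3)
    estimator.fit(x_train, y_train)

    x_new = np.array([[0.15, 0.15], [1.05, 0.95]])
    print(estimator.predict(x_new))        # majority class among the 3 nearest neighbors
    print(estimator.predict_proba(x_new))  # occurrence counts divided by k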

class KNNClassifier(data, settings_model=None, **settings)[source]#

Bases: BaseClassifier

The k-nearest neighbors classification algorithm.

Parameters:
  • data (Dataset) -- The training dataset.

  • settings_model (BaseMLAlgoSettings | None) -- The machine learning algorithm settings as a Pydantic model. If None, use **settings.

  • **settings (Any) -- The machine learning algorithm settings. These arguments are ignored when settings_model is not None.

Raises:

ValueError -- When both the variable and the group it belongs to have a transformer.
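A minimal usage sketch follows; it assumes that a training dataset is already available as a gemseo Dataset and that scikit-learn options such as n_neighbors are forwarded through **settings (both are assumptions, not confirmed by this page):

    from gemseo.mlearning.classification.algos.knn import KNNClassifier

    # "training_dataset" is assumed to be an existing Dataset with input and
    # output groups; its construction is outside the scope of this sketch.
    model = KNNClassifier(training_dataset, n_neighbors=5)  # n_neighbors: assumed pass-through setting
    model.learn()  # fit the wrapped scikit-learn KNeighborsClassifier
    predicted_class = model.predict(new_input_point)  # "new_input_point": a new input array (placeholder)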

Settings#

alias of KNNClassifier_Settings

LIBRARY: ClassVar[str] = 'scikit-learn'#

The name of the library of the wrapped machine learning algorithm.

SHORT_ALGO_NAME: ClassVar[str] = 'KNN'#

The short name of the machine learning algorithm, often an acronym.

Typically used for composite names, e.g. f"{algo.SHORT_ALGO_NAME}_{dataset.name}" or f"{algo.SHORT_ALGO_NAME}_{discipline.name}".
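For instance, these class attributes can be read without instantiating the model; the dataset name "iris" below is only a placeholder:

    from gemseo.mlearning.classification.algos.knn import KNNClassifier

    print(KNNClassifier.LIBRARY)          # "scikit-learn"
    print(KNNClassifier.SHORT_ALGO_NAME)  # "KNN"

    # Composite name as described above, for a hypothetical dataset named "iris".
    composite_name = f"{KNNClassifier.SHORT_ALGO_NAME}_iris"  # "KNN_iris"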