Tutorial About Probabilistic Learning Models
Benjamin Adrian, Gunnar Grimnes, Jörn Hees, Matthias Sperber
Abstract
Introduction
Classification is, in general, the problem of deciding, for a given input, which class it belongs to. It is usually subdivided into a learning phase (also called the training phase), in which a model is built from labeled samples, and a classification phase (also called the test phase), in which the model assigns classes to new inputs. (TODO: offline, online, reinforcement, ... learning)
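As a rough sketch of these two phases, a classifier can be modeled as an object with a training method and a classification method. The interface below, including the names `fit` and `predict`, is illustrative and not part of this tutorial:

```python
# Minimal sketch of the two phases of classification (illustrative names).

class Classifier:
    def fit(self, samples, labels):
        """Learning phase: record or estimate a model from labeled samples."""
        raise NotImplementedError

    def predict(self, query):
        """Classification phase: return the predicted class for a query."""
        raise NotImplementedError
```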
Relational Classification
Nearest Neighbor (1-NN or NN)
Nearest Neighbor classifiers are the simplest kind of classifier. In the training phase they simply record the class of each sample. In the classification phase they compute the distance from the query to every recorded sample and return the class of the sample that is closest to the query.
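A minimal 1-NN sketch in Python (illustrative only; Euclidean distance over numeric feature vectors is assumed, and the class and method names are not part of this tutorial):

```python
import math

class NearestNeighbor:
    def fit(self, samples, labels):
        # Learning phase: just store the labeled samples.
        self.samples = list(samples)
        self.labels = list(labels)

    def predict(self, query):
        # Classification phase: return the label of the closest stored sample.
        distances = [math.dist(query, s) for s in self.samples]
        best = min(range(len(distances)), key=distances.__getitem__)
        return self.labels[best]

# Usage example
nn = NearestNeighbor()
nn.fit([(0.0, 0.0), (1.0, 1.0), (5.0, 5.0)], ["a", "a", "b"])
print(nn.predict((4.0, 4.5)))  # -> "b"
```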
$k$NN
The $k$-Nearest Neighbor classifier is a generalization of the simple NN: instead of immediately returning the class of the single best-matching sample, it inspects the $k$ samples nearest to the query and returns a class chosen by a merging function (both variants are sketched after this list), such as:
- the most frequently observed class among the $k$ neighbors
- a vote over classes weighted by the inverse of each neighbor's distance
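Both merging functions can be sketched as follows. This is an illustrative Python sketch, not part of the original tutorial; Euclidean distance and the function name `knn_predict` are assumptions:

```python
import math
from collections import Counter

def knn_predict(samples, labels, query, k=3, weighted=False):
    # Sort stored samples by distance to the query and keep the k nearest.
    nearest = sorted(
        ((math.dist(query, s), label) for s, label in zip(samples, labels)),
        key=lambda t: t[0],
    )[:k]

    if not weighted:
        # Merging function 1: most frequently observed class among the k neighbors.
        return Counter(label for _, label in nearest).most_common(1)[0][0]

    # Merging function 2: each neighbor votes with weight 1/distance
    # (a small epsilon avoids division by zero for exact matches).
    votes = {}
    for dist, label in nearest:
        votes[label] = votes.get(label, 0.0) + 1.0 / (dist + 1e-9)
    return max(votes, key=votes.get)

# Usage example
X = [(0.0, 0.0), (0.5, 0.5), (1.0, 1.0), (5.0, 5.0), (5.5, 5.0)]
y = ["a", "a", "a", "b", "b"]
print(knn_predict(X, y, (4.5, 4.5), k=3))                  # majority vote -> "b"
print(knn_predict(X, y, (4.5, 4.5), k=3, weighted=True))   # weighted vote -> "b"
```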
Naive Bayes
Naive Bayes classifiers are probabilistic classifiers that apply Bayes' theorem under the "naive" assumption that all features are conditionally independent given the class.
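Under this independence assumption, the class posterior factorizes into the class prior and the per-feature likelihoods; this is the standard Bayes-rule formulation, not specific to this tutorial:

$$P(C \mid x_1, \dots, x_n) \propto P(C) \prod_{i=1}^{n} P(x_i \mid C)$$

The classifier then returns the class $C$ that maximizes this product.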