= Tutorial About Probabilistic Learning Models =

Benjamin Adrian, Gunnar Grimnes, Jörn Hees, Matthias Sperber

== Abstract ==

== Introduction ==

Classification in general is the problem of deciding, for a given input, which class it belongs to. Classification can usually be subdivided into a learning phase (also known as the training phase) and a classification phase (also known as the test phase).

(TODO: offline, online, reinforcement, ... learning)

=== Relational Classification ===

==== Nearest Neighbor (1-NN or NN) ====

Nearest Neighbor classifiers are classifiers of the simplest kind. In the training phase they simply record the class of each sample. Later, in the classification phase, they compute the distance from the query to every recorded sample and return the class of the sample that is closest to the query.
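The following is a minimal sketch of this two-phase scheme, assuming numeric feature vectors and Euclidean distance (the class and method names here are our own, not taken from any particular library):

{{{
#!python
import math

class NearestNeighbor:
    """Minimal 1-NN sketch: memorize all samples, answer with the closest one's class."""

    def __init__(self):
        self.samples = []  # list of (feature_vector, class_label) pairs

    def train(self, vector, label):
        # Training phase: simply record the sample together with its class.
        self.samples.append((vector, label))

    def classify(self, query):
        # Classification phase: compute the distance to every recorded sample
        # and return the class of the closest one.
        def distance(v, w):
            return math.sqrt(sum((a - b) ** 2 for a, b in zip(v, w)))
        _, label = min(self.samples, key=lambda s: distance(s[0], query))
        return label

nn = NearestNeighbor()
nn.train((1.0, 1.0), "A")
nn.train((5.0, 5.0), "B")
print(nn.classify((1.5, 0.5)))  # -> "A", since (1.0, 1.0) is closest
}}}

Euclidean distance is just one choice here; any distance (or similarity) measure appropriate for the data can be used instead.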
===== $k$NN =====

The $k$-Nearest Neighbor classifier is a generalization of the simple NN classifier: instead of immediately returning the class of the single best-matching sample, it inspects the $k$ samples nearest to the query and derives a class from them with a merging function, such as:
* the most frequently observed class among the $k$ neighbors
* a vote over the neighbors' classes, weighted by their inverse distances to the query
Both merging functions are sketched below.
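Again a hypothetical sketch, reusing the Euclidean distance from above; the `weighted` flag switches between the two merging functions:

{{{
#!python
from collections import Counter

def knn_classify(samples, query, k, weighted=False):
    """k-NN sketch; samples is a list of (feature_vector, class_label) pairs."""
    def distance(v, w):
        return sum((a - b) ** 2 for a, b in zip(v, w)) ** 0.5

    # Inspect only the k samples nearest to the query.
    neighbors = sorted(samples, key=lambda s: distance(s[0], query))[:k]

    votes = Counter()
    for vector, label in neighbors:
        if weighted:
            # Merge by inverse distance: closer neighbors get a stronger vote.
            votes[label] += 1.0 / (distance(vector, query) + 1e-9)
        else:
            # Merge by majority: every neighbor counts once.
            votes[label] += 1
    return votes.most_common(1)[0][0]
}}}

For $k = 1$ both variants reduce to the simple NN classifier. The small constant added to the distance merely avoids a division by zero when the query coincides with a recorded sample.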
==== Naive Bayes ====

Naive Bayes classifiers are probabilistic classifiers that apply Bayes' rule (see the appendix) under the "naive" assumption that the features $F_1, \dots, F_n$ are conditionally independent given the class $C$. A query with observed feature values $f_1, \dots, f_n$ is assigned the class

{{{
#!latex
\begin{equation}
\mathrm{classify}(f_1,\dots,f_n) = \operatorname*{argmax}_c \; p(C=c) \prod_{i=1}^n p(F_i=f_i \vert C=c).
\end{equation}
}}}
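The class prior $p(C=c)$ and the per-feature likelihoods $p(F_i=f_i \vert C=c)$ can be estimated by counting over the training samples. A minimal sketch for categorical features, assuming plain maximum-likelihood estimates without smoothing:

{{{
#!python
from collections import Counter, defaultdict

class NaiveBayes:
    """Naive Bayes sketch for categorical features, trained by counting."""

    def __init__(self):
        self.class_counts = Counter()               # observations per class c
        self.feature_counts = defaultdict(Counter)  # observations of f_i per (i, c)
        self.total = 0

    def train(self, features, label):
        self.total += 1
        self.class_counts[label] += 1
        for i, value in enumerate(features):
            self.feature_counts[(i, label)][value] += 1

    def classify(self, features):
        # Return argmax_c p(C=c) * prod_i p(F_i=f_i | C=c).
        def score(label):
            p = self.class_counts[label] / self.total
            for i, value in enumerate(features):
                p *= self.feature_counts[(i, label)][value] / self.class_counts[label]
            return p
        return max(self.class_counts, key=score)
}}}

In practice the counts are usually smoothed (e.g. Laplace smoothing) so that a single unseen feature value does not force the whole product to zero; with many features the product is also typically computed as a sum of log probabilities to avoid numerical underflow.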
==== Maximum Entropy ====

==== Multi Layer Perceptrons ====

==== Support Vector Machines ====

=== Sequential Classification ===

==== Hidden Markov Model ====

==== Conditional Random Field ====

== Appendix ==

=== Mathematical Foundations ===

==== Bayes Rule ====

{{{
#!latex
$P(A|B) = \frac{P(B|A) \cdot P(A)}{P(B)}$
}}}
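The rule follows directly from the two equivalent factorizations of the joint probability:

{{{
#!latex
$P(A \cap B) = P(A|B) \cdot P(B) = P(B|A) \cdot P(A)$
}}}

Dividing both sides by $P(B)$ (assuming $P(B) > 0$) yields Bayes' rule as stated above.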