Introduction and Bayesian Decision Theory – pattern recognition systems – the design cycle – learning and adaptation – Bayesian decision theory – continuous features – minimum error rate classification – discriminant functions and decision surfaces – discriminant functions for the normal density.
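
As a concrete illustration of minimum error rate classification with normal-density discriminant functions, the following is a minimal Python sketch: a sample is assigned to the class whose discriminant g_i(x) = ln p(x | w_i) + ln P(w_i) is largest. The class means, covariances, and priors are assumed values for the example, not part of the syllabus.

import numpy as np

def gaussian_discriminant(x, mean, cov, prior):
    """Quadratic discriminant g_i(x) for a multivariate normal class density."""
    d = len(mean)
    diff = x - mean
    inv = np.linalg.inv(cov)
    return (-0.5 * diff @ inv @ diff
            - 0.5 * np.log(np.linalg.det(cov))
            - 0.5 * d * np.log(2 * np.pi)
            + np.log(prior))

def classify(x, params):
    """Return the index of the class with the largest discriminant value."""
    scores = [gaussian_discriminant(x, m, c, p) for m, c, p in params]
    return int(np.argmax(scores))

# Two illustrative classes with equal priors (assumed parameters).
params = [
    (np.array([0.0, 0.0]), np.eye(2), 0.5),
    (np.array([3.0, 3.0]), np.eye(2) * 2.0, 0.5),
]
print(classify(np.array([0.5, 0.2]), params))   # expected: 0
print(classify(np.array([2.8, 3.1]), params))   # expected: 1
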
Maximum likelihood estimation – Bayesian estimation – Bayesian parameter estimation: Gaussian case and general theory – problems of dimensionality – component analysis and discriminants – hidden Markov models.
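
To contrast maximum likelihood estimation with Bayesian parameter estimation in the Gaussian case, the following minimal sketch compares the sample mean (the ML estimate) with the posterior mean of the class mean under a normal prior, with the variance treated as known; the prior parameters mu0 and sigma0 are assumed for illustration.

import numpy as np

rng = np.random.default_rng(0)
sigma = 1.0                                      # known data standard deviation
x = rng.normal(loc=2.0, scale=sigma, size=30)    # samples from the true density
n = len(x)

# Maximum-likelihood estimate of the mean: the sample mean.
mu_ml = x.mean()

# Bayesian estimate: posterior mean under a N(mu0, sigma0^2) prior on the mean,
# mu_n = (n*sigma0^2 / (n*sigma0^2 + sigma^2)) * x_bar
#      + (sigma^2   / (n*sigma0^2 + sigma^2)) * mu0
mu0, sigma0 = 0.0, 1.0                           # assumed prior parameters
w = n * sigma0**2 / (n * sigma0**2 + sigma**2)
mu_bayes = w * mu_ml + (1 - w) * mu0

print(f"ML estimate: {mu_ml:.3f}, Bayesian posterior mean: {mu_bayes:.3f}")
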
Nonparametric techniques and linear discriminant functions – density estimation – Parzen windows – nearest-neighbour estimation – rules and metrics – linear discriminant functions and decision surfaces – generalized linear discriminant functions – two-category linearly separable case – minimizing the perceptron criterion function.
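
The closing topic, minimizing the perceptron criterion function, can be illustrated with a short sketch: batch gradient descent on J_p(a) = sum over misclassified samples of (-a^T y) for a two-category linearly separable set; the data below are assumed for the example.

import numpy as np

def perceptron(X, labels, eta=1.0, max_epochs=100):
    """Return an augmented weight vector a found by batch gradient descent on J_p."""
    # Augment with a bias component and sign-normalize class-1 samples,
    # so a solution vector satisfies a^T y > 0 for every sample y.
    Y = np.hstack([X, np.ones((len(X), 1))])
    Y[labels == 1] *= -1.0
    a = np.zeros(Y.shape[1])
    for _ in range(max_epochs):
        misclassified = Y[Y @ a <= 0]
        if len(misclassified) == 0:
            return a                                  # all samples correctly classified
        a = a + eta * misclassified.sum(axis=0)       # gradient step on J_p
    return a

# Illustrative linearly separable data (assumed).
X = np.array([[1.0, 1.0], [2.0, 2.0], [2.0, 0.5],          # class 0
              [-1.0, -1.0], [-2.0, -1.5], [-1.5, -2.0]])   # class 1
labels = np.array([0, 0, 0, 1, 1, 1])
print("solution vector:", perceptron(X, labels))
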
Nonmetric methods and algorithm-independent machine learning – decision trees – CART methods – algorithm-independent machine learning – lack of inherent superiority of any classifier – bias and variance for regression and classification – resampling for estimating statistics – estimating and comparing classifiers.
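
A CART-style decision tree grows by repeatedly choosing the split that most reduces node impurity. The sketch below, using assumed one-dimensional data, searches candidate thresholds on a single feature and scores them by the drop in Gini impurity.

import numpy as np

def gini(labels):
    """Gini impurity 1 - sum_k p_k^2 of a set of class labels."""
    if len(labels) == 0:
        return 0.0
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return 1.0 - np.sum(p ** 2)

def best_split(x, y):
    """Return (threshold, impurity drop) of the best binary split x <= t."""
    parent = gini(y)
    best_t, best_gain = None, 0.0
    for t in np.unique(x)[:-1]:                    # candidate thresholds
        left, right = y[x <= t], y[x > t]
        child = (len(left) * gini(left) + len(right) * gini(right)) / len(y)
        gain = parent - child
        if gain > best_gain:
            best_t, best_gain = t, gain
    return best_t, best_gain

# Assumed toy data: a threshold near 2.0 separates the classes perfectly.
x = np.array([1.0, 1.5, 2.0, 5.0, 5.5, 6.0])
y = np.array([0, 0, 0, 1, 1, 1])
print(best_split(x, y))
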
Unsupervised learning and clustering – mixture densities and identifiability – maximum likelihood estimates – application to normal mixtures – unsupervised Bayesian learning – data description and clustering – criterion functions for clustering – hierarchical clustering – component analysis – low-dimensional representations and multidimensional scaling.
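
Among criterion functions for clustering, the sum-of-squared-error criterion J_e = sum_i sum_{x in D_i} ||x - m_i||^2 is the one minimized by basic k-means. The following sketch, with assumed synthetic data and k = 2, alternates nearest-mean assignment and mean recomputation.

import numpy as np

def kmeans(X, k, iters=50, seed=0):
    """Basic k-means: minimize the sum-of-squared-error clustering criterion."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)]   # initial means
    labels = np.zeros(len(X), dtype=int)
    for _ in range(iters):
        # Assign each sample to its nearest mean.
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        # Recompute each mean from its assigned samples.
        new_centers = np.array([X[labels == j].mean(axis=0) for j in range(k)])
        if np.allclose(new_centers, centers):
            break
        centers = new_centers
    return centers, labels

# Two assumed synthetic blobs around (0, 0) and (4, 4).
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 0.5, size=(20, 2)),
               rng.normal(4, 0.5, size=(20, 2))])
centers, labels = kmeans(X, k=2)
print(centers)
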