This is the first comprehensive treatment of feed-forward neural networks from the perspective of statistical
pattern recognition. After introducing the basic concepts, the book examines techniques for modeling probability
density functions and the properties and merits of the multi-layer perceptron and radial basis function network
models. Also covered are various forms of error functions, principal algorithms for error function minimalization,
learning and generalization in neural networks, and Bayesian techniques and their applications. Designed as a text,
with over 100 exercises, this fully up-to-date work will benefit anyone involved in the fields of neural computation
and pattern recognition.
Table of Contents
1. Statistical pattern recognition
2. Probability density estimation
3. Single-layer networks
4. The multi-layer perceptron
5. Radial basis functions
6. Error functions
7. Parameter optimization algorithms
8. Pre-processing and feature extraction
9. Learning and generalization
10. Bayesian techniques