2.1 Multilayer Neural Networks

The multilayer perceptron (MLP) is one of the most widely used neural network architectures. Given a set of input/output pairs, an MLP can learn a specific nonlinear mapping between them by adjusting its network weights with a learning algorithm. It has been shown that a two-layer MLP (i.e., one with a single hidden layer) can approximate any continuous nonlinear mapping to arbitrary accuracy, provided the hidden layer contains sufficiently many units. The most widely used MLP training method is the backpropagation (BP) algorithm, in which the output error is propagated backward through the network and the weights are corrected by steepest-descent gradient updates derived via the chain rule.

Considerable effort has been devoted to improving the convergence speed, generalization performance, and discriminative ability of the MLP. To accelerate the BP algorithm, several heuristic rules have been proposed that adapt the learning rates or modify the error functions [17]. Training can also be accelerated through other modifications of the standard BP algorithm, such as conjugate gradient BP, recursive least-squares-based BP, and the Levenberg-Marquardt algorithm. To assess the generalization ability of an MLP, the independent validation method can be used, dividing the available data set into separate subsets for training, validation, and testing [16]. To improve the discriminative capability of the MLP in classification tasks, a discriminative MLP learning rule better suited to pattern classification has been proposed [13].
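To make the standard BP procedure concrete, the following is a minimal sketch of a two-layer MLP trained by backpropagation: the forward pass computes hidden and output activations, the error is propagated backward from the output layer via the chain rule, and the weights are corrected by steepest-descent updates. The architecture (4 sigmoid hidden units, one sigmoid output), the fixed learning rate, and the XOR data set are illustrative assumptions, not taken from the text.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)

# XOR: a classic example of a nonlinear input/output mapping
# that a single-layer perceptron cannot represent.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# Two-layer MLP: one hidden layer of 4 units (an assumed size).
n_in, n_hidden, n_out = 2, 4, 1
W1 = rng.normal(0.0, 1.0, (n_in, n_hidden))
b1 = np.zeros(n_hidden)
W2 = rng.normal(0.0, 1.0, (n_hidden, n_out))
b2 = np.zeros(n_out)
lr = 0.5  # fixed learning rate (an assumed value)

losses = []
for epoch in range(5000):
    # Forward pass.
    h = sigmoid(X @ W1 + b1)       # hidden-layer activations
    out = sigmoid(h @ W2 + b2)     # network output
    losses.append(float(np.mean((out - y) ** 2)))

    # Backward pass: error terms propagated from the output
    # layer to the hidden layer via the chain rule.
    d_out = out - y                      # output-layer error term
    d_h = (d_out @ W2.T) * h * (1 - h)   # hidden-layer error term

    # Steepest-descent weight corrections.
    W2 -= lr * h.T @ d_out
    b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h
    b1 -= lr * d_h.sum(axis=0)

pred = sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2)
print("final MSE:", losses[-1])
print("predictions:", np.round(pred.ravel(), 2))
```

The heuristic accelerations cited above (adaptive learning rates, conjugate gradient, Levenberg-Marquardt) replace the fixed-rate steepest-descent update in the inner loop; the forward and backward passes are unchanged.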