A hybrid BP and HSA for enhancing a multilayer perceptron learning

Bibliographic Details
Main Author: Salad Nur, Abdirashid
Format: Thesis
Language: English
Published: 2014
Subjects:
Online Access:http://eprints.utm.my/id/eprint/47957/25/AbdirashidSaladNurFC2014.pdf
Description
Summary: Training neural networks is a significant yet difficult task in the field of supervised learning, because performance depends on the underlying training algorithm as well as on the outcome of the training process. In this study, three training algorithms, namely the Back-Propagation (BP) algorithm, the Harmony Search Algorithm (HSA), and a hybrid of BP and HSA called BPHSA, are employed for the supervised training of MLP feed-forward neural networks (NNs), with special attention given to the hybrid BPHSA. A suitable data-representation structure for NNs is implemented for BPHSA, HSA and BP. The proposed model is empirically tested and verified on five benchmark classification problems, namely the Iris, Glass, Cancer, Wine and Thyroid datasets. The MSE, training time and classification accuracy of the hybrid BPHSA are compared with those of the standard BP and the meta-heuristic HSA. The experiments showed that the proposed model (BPHSA) achieves better results in terms of convergence error and classification accuracy than BP and HSA, which makes BPHSA a promising algorithm for neural network training.
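
The record gives only this summary of the method. As a rough illustration of the general idea, the sketch below shows one plausible way to hybridise Harmony Search with back-propagation when searching for MLP weights, using the Iris dataset named in the abstract. The network size, the HSA parameter values (HMS, HMCR, PAR, bandwidth) and the specific hybridisation scheme (refining each improvised harmony with a few gradient steps) are illustrative assumptions, not details taken from the thesis.

# A minimal, illustrative hybrid Harmony Search + Back-Propagation weight search
# for a one-hidden-layer MLP on the Iris data mentioned in the abstract.
# The hybridisation scheme, network size and all parameter values below are
# assumptions for illustration, not the exact BPHSA procedure of the thesis.
import numpy as np
from sklearn.datasets import load_iris

rng = np.random.default_rng(0)
X, y = load_iris(return_X_y=True)
X = (X - X.mean(axis=0)) / X.std(axis=0)            # standardise inputs
T = np.eye(3)[y]                                     # one-hot targets

n_in, n_hid, n_out = 4, 6, 3
n_w = n_in * n_hid + n_hid + n_hid * n_out + n_out   # flat weight-vector length

def unpack(w):
    # Split a flat weight vector into layer matrices and bias vectors.
    i = 0
    W1 = w[i:i + n_in * n_hid].reshape(n_in, n_hid); i += n_in * n_hid
    b1 = w[i:i + n_hid]; i += n_hid
    W2 = w[i:i + n_hid * n_out].reshape(n_hid, n_out); i += n_hid * n_out
    return W1, b1, W2, w[i:]

def forward(w):
    W1, b1, W2, b2 = unpack(w)
    H = np.tanh(X @ W1 + b1)                         # hidden layer
    return H, H @ W2 + b2                            # linear output layer

def mse(w):
    return np.mean((forward(w)[1] - T) ** 2)

def grad(w):
    # Analytic gradient of the MSE, i.e. plain back-propagation.
    W1, b1, W2, b2 = unpack(w)
    H, Y = forward(w)
    dY = 2.0 * (Y - T) / Y.size
    dW2, db2 = H.T @ dY, dY.sum(axis=0)
    dH = (dY @ W2.T) * (1.0 - H ** 2)
    dW1, db1 = X.T @ dH, dH.sum(axis=0)
    return np.concatenate([dW1.ravel(), db1, dW2.ravel(), db2])

# Typical textbook HSA parameters (assumed, not taken from the thesis).
HMS, HMCR, PAR, BW = 20, 0.9, 0.3, 0.05
ITERS, BP_STEPS, LR = 2000, 5, 0.05

memory = rng.uniform(-1, 1, size=(HMS, n_w))         # harmony memory
fitness = np.array([mse(w) for w in memory])

for _ in range(ITERS):
    # Improvise a new harmony component-wise from memory or at random.
    new = np.where(rng.random(n_w) < HMCR,
                   memory[rng.integers(HMS, size=n_w), np.arange(n_w)],
                   rng.uniform(-1, 1, n_w))
    adjust = rng.random(n_w) < PAR                    # pitch adjustment
    new[adjust] += BW * rng.uniform(-1, 1, adjust.sum())
    for _ in range(BP_STEPS):                         # hybrid step: BP refinement
        new -= LR * grad(new)
    f = mse(new)
    worst = fitness.argmax()
    if f < fitness[worst]:                            # replace the worst harmony
        memory[worst], fitness[worst] = new, f

best = memory[fitness.argmin()]
pred = forward(best)[1].argmax(axis=1)
print(f"MSE: {fitness.min():.4f}  training accuracy: {(pred == y).mean():.3f}")

The design follows the abstract's framing: Harmony Search explores the weight space globally, while the embedded gradient steps act as the local BP refinement, and candidates are compared by the MSE criterion reported in the study.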