A hybrid BP and HSA for enhancing a multilayer perceptron learning


Bibliographic Details
Main Author: Salad Nur, Abdirashid
Format: Thesis
Published: 2014
id my-utm-ep.48316
record_format uketd_dc
spelling my-utm-ep.483162017-07-26T04:36:13Z A hybrid BP and HSA for enhancing a multilayer perceptron learning 2014 Salad Nur, Abdirashid LB Theory and practice of education Training neural networks is a difficult task of great significance in the field of supervised learning, because performance depends on the underlying training algorithm as well as on the outcome of the training process. In this study, three training algorithms, namely the Back-Propagation (BP) algorithm, the Harmony Search Algorithm (HSA), and a hybrid of BP and HSA called BPHSA, are employed for the supervised training of multilayer perceptron (MLP) feed-forward neural networks (NNs), with special attention given to the hybrid BPHSA. A suitable data-representation structure for the NNs is implemented for BPHSA, HSA, and BP. The proposed model is empirically tested and verified on five benchmark classification problems, namely the Iris, Glass, Cancer, Wine, and Thyroid datasets. The MSE, training time, and classification accuracy of the hybrid BPHSA are compared with those of standard BP and the meta-heuristic HSA. The experiments showed that the proposed model (BPHSA) yields better results in terms of convergence error and classification accuracy than BP and HSA, which makes BPHSA a promising algorithm for neural network training. 2014 Thesis http://eprints.utm.my/id/eprint/48316/ masters Universiti Teknologi Malaysia, Faculty of Computing Faculty of Computing
institution Universiti Teknologi Malaysia
collection UTM Institutional Repository
topic LB Theory and practice of education
spellingShingle LB Theory and practice of education
Salad Nur, Abdirashid
A hybrid BP and HSA for enhancing a multilayer perceptron learning
description Training neural networks is a difficult task of great significance in the field of supervised learning, because performance depends on the underlying training algorithm as well as on the outcome of the training process. In this study, three training algorithms, namely the Back-Propagation (BP) algorithm, the Harmony Search Algorithm (HSA), and a hybrid of BP and HSA called BPHSA, are employed for the supervised training of multilayer perceptron (MLP) feed-forward neural networks (NNs), with special attention given to the hybrid BPHSA. A suitable data-representation structure for the NNs is implemented for BPHSA, HSA, and BP. The proposed model is empirically tested and verified on five benchmark classification problems, namely the Iris, Glass, Cancer, Wine, and Thyroid datasets. The MSE, training time, and classification accuracy of the hybrid BPHSA are compared with those of standard BP and the meta-heuristic HSA. The experiments showed that the proposed model (BPHSA) yields better results in terms of convergence error and classification accuracy than BP and HSA, which makes BPHSA a promising algorithm for neural network training.
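The thesis itself is not reproduced in this record, but the hybrid scheme the abstract describes, Harmony Search exploring the weight space globally and Back-Propagation then refining the best harmony, can be sketched minimally as follows. This is an illustrative assumption of how such a hybrid might look, not the thesis's actual method: the tiny 2-4-1 network, the XOR stand-in data (in place of Iris, Glass, Cancer, Wine, Thyroid), and all parameter values (HMCR, PAR, bandwidth, learning rate, iteration counts) are hypothetical choices.

```python
import math
import random

random.seed(0)

# Toy XOR dataset as a hypothetical stand-in for the benchmark problems.
DATA = [([0.0, 0.0], 0.0), ([0.0, 1.0], 1.0),
        ([1.0, 0.0], 1.0), ([1.1 - 0.1, 1.0], 0.0)]
DATA[3] = ([1.0, 1.0], 0.0)

N_IN, N_HID = 2, 4
# Flat weight vector: hidden weights, hidden biases, output weights, output bias.
DIM = N_IN * N_HID + N_HID + N_HID + 1

def forward(w, x):
    """Evaluate the 2-4-1 MLP (tanh hidden, sigmoid output) for flat weights w."""
    h = []
    for j in range(N_HID):
        s = sum(w[j * N_IN + k] * x[k] for k in range(N_IN)) + w[N_IN * N_HID + j]
        h.append(math.tanh(s))
    base = N_IN * N_HID + N_HID
    s = sum(w[base + j] * h[j] for j in range(N_HID)) + w[-1]
    return 1.0 / (1.0 + math.exp(-s)), h

def mse(w):
    return sum((forward(w, x)[0] - t) ** 2 for x, t in DATA) / len(DATA)

def harmony_search(iters=2000, hms=10, hmcr=0.9, par=0.3, bw=0.1):
    """Global exploration: evolve a harmony memory of candidate weight vectors."""
    hm = [[random.uniform(-1, 1) for _ in range(DIM)] for _ in range(hms)]
    for _ in range(iters):
        new = []
        for d in range(DIM):
            if random.random() < hmcr:        # memory consideration
                v = random.choice(hm)[d]
                if random.random() < par:     # pitch adjustment
                    v += random.uniform(-bw, bw)
            else:                             # random selection
                v = random.uniform(-1, 1)
            new.append(v)
        worst = max(range(hms), key=lambda i: mse(hm[i]))
        if mse(new) < mse(hm[worst]):
            hm[worst] = new                   # replace the worst harmony
    return min(hm, key=mse)

def backprop_step(w, lr=0.5):
    """One batch gradient-descent step of standard BP on the MSE loss."""
    g = [0.0] * DIM
    base = N_IN * N_HID + N_HID
    for x, t in DATA:
        y, h = forward(w, x)
        do = (y - t) * y * (1.0 - y)          # output-layer delta
        for j in range(N_HID):
            g[base + j] += do * h[j]
            dh = do * w[base + j] * (1.0 - h[j] ** 2)  # hidden-layer delta
            for k in range(N_IN):
                g[j * N_IN + k] += dh * x[k]
            g[N_IN * N_HID + j] += dh
        g[-1] += do
    return [wi - lr * gi / len(DATA) for wi, gi in zip(w, g)]

w_hsa = harmony_search()          # HSA locates a promising weight region
err_hsa = mse(w_hsa)
w_final = w_hsa
for _ in range(500):              # BP then refines the solution locally
    w_final = backprop_step(w_final)
err_final = mse(w_final)
print(f"HSA error: {err_hsa:.4f} -> after BP: {err_final:.4f}")
```

The division of labour follows the abstract's comparison: HSA alone avoids BP's sensitivity to initial weights but converges slowly near an optimum, while BP alone refines quickly but can stall in poor regions, so the hybrid hands HSA's best harmony to BP as the starting point.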
format Thesis
qualification_level Master's degree
author Salad Nur, Abdirashid
author_facet Salad Nur, Abdirashid
author_sort Salad Nur, Abdirashid
title A hybrid BP and HSA for enhancing a multilayer perceptron learning
title_short A hybrid BP and HSA for enhancing a multilayer perceptron learning
title_full A hybrid BP and HSA for enhancing a multilayer perceptron learning
title_fullStr A hybrid BP and HSA for enhancing a multilayer perceptron learning
title_full_unstemmed A hybrid BP and HSA for enhancing a multilayer perceptron learning
title_sort hybrid bp and hsa for enhancing a multilayer perceptron learning
granting_institution Universiti Teknologi Malaysia, Faculty of Computing
granting_department Faculty of Computing
publishDate 2014
_version_ 1747817360728784896