The effect of adaptive parameters on the performance of back propagation

The Back Propagation algorithm, or one of its variants, on multilayered feedforward networks is widely used in many applications. However, this algorithm is well known to suffer from the local minima problem, particularly when caused by neuron saturation in the hidden layer. Most existing approaches modify the learning model by adding a random factor, which counteracts the tendency to sink into local minima. However, random perturbations of the search direction and various kinds of stochastic adjustment to the current set of weights are not effective in enabling a network to escape from local minima, causing the network to fail to converge to a global minimum within a reasonable number of iterations. Thus, this research proposes a new method, Back Propagation Gradient Descent with Adaptive Gain, Adaptive Momentum and Adaptive Learning Rate (BPGD-AGAMAL), which modifies the existing Back Propagation Gradient Descent algorithm by adaptively changing the gain, momentum coefficient and learning rate. In this method, each training pattern has its own activation functions for the neurons in the hidden layer. The activation functions are adjusted through the adaptation of the gain parameter together with adaptive momentum and learning rate values during the learning process. The efficiency of the proposed algorithm is compared with conventional Back Propagation Gradient Descent and Back Propagation Gradient Descent with Adaptive Gain by simulation on six benchmark problems, namely breast cancer, card, glass, iris, soybean, and thyroid. The results show that the proposed algorithm substantially improves the learning process of the conventional Back Propagation algorithm.
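The abstract describes adapting the gain of the hidden-layer activation functions together with the momentum coefficient and learning rate during training. As a rough illustration only, the pure-Python sketch below trains a tiny network on XOR using a gain-parameterised sigmoid; the specific adaptation rules used here (a gradient step on each neuron's gain, an error-driven learning-rate schedule, a fixed momentum coefficient) are illustrative assumptions, not the formulas from the thesis.

```python
import math
import random

def sigmoid(z):
    z = max(min(z, 60.0), -60.0)   # clamp to avoid math.exp overflow
    return 1.0 / (1.0 + math.exp(-z))

def train_xor(epochs=4000, seed=1):
    rng = random.Random(seed)
    n_in, n_hid = 2, 3
    # each weight row carries a trailing bias weight
    w1 = [[rng.uniform(-1.0, 1.0) for _ in range(n_in + 1)] for _ in range(n_hid)]
    w2 = [rng.uniform(-1.0, 1.0) for _ in range(n_hid + 1)]
    gains = [1.0] * n_hid          # one adaptive gain per hidden neuron
    lr, mom = 0.5, 0.8             # adaptive learning rate, momentum coefficient
    v1 = [[0.0] * (n_in + 1) for _ in range(n_hid)]
    v2 = [0.0] * (n_hid + 1)
    data = [([0.0, 0.0], 0.0), ([0.0, 1.0], 1.0),
            ([1.0, 0.0], 1.0), ([1.0, 1.0], 0.0)]

    def forward(x):
        xb = x + [1.0]
        net_h = [sum(w * v for w, v in zip(row, xb)) for row in w1]
        h = [sigmoid(g * n) for g, n in zip(gains, net_h)] + [1.0]
        y = sigmoid(sum(w * v for w, v in zip(w2, h)))
        return xb, net_h, h, y

    prev_err = float('inf')
    err = 0.0
    for _ in range(epochs):
        err = 0.0
        for x, t in data:
            xb, net_h, h, y = forward(x)
            e = t - y
            err += e * e
            d_o = e * y * (1.0 - y)                    # output-layer delta
            # hidden deltas include the gain factor in the sigmoid derivative
            d_h = [d_o * w2[j] * gains[j] * h[j] * (1.0 - h[j])
                   for j in range(n_hid)]
            for j in range(n_hid + 1):                 # momentum-smoothed updates
                v2[j] = mom * v2[j] + lr * d_o * h[j]
                w2[j] += v2[j]
            for j in range(n_hid):
                for i in range(n_in + 1):
                    v1[j][i] = mom * v1[j][i] + lr * d_h[j] * xb[i]
                    w1[j][i] += v1[j][i]
                # assumed rule: gradient step on the gain, kept in a safe range
                g_grad = d_o * w2[j] * net_h[j] * h[j] * (1.0 - h[j])
                gains[j] = min(max(gains[j] + 0.1 * lr * g_grad, 0.1), 5.0)
        # assumed schedule: grow the rate on improvement, shrink it otherwise
        lr = min(lr * 1.01, 1.0) if err < prev_err else max(lr * 0.5, 0.01)
        prev_err = err
    preds = [round(forward(x)[3]) for x, _ in data]
    return preds, err

preds, err = train_xor()
print(preds)   # rounded outputs for the four XOR patterns
```

The gain multiplies the net input inside the sigmoid, so a larger gain steepens the activation and a smaller one flattens it, which is one way to pull saturated hidden neurons back into the responsive region of the function.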


Bibliographic Details
Main Author: Abdul Hamid, Norhamreeza
Format: Thesis
Language: English
Published: 2012
Subjects:
Online Access:http://eprints.uthm.edu.my/2344/1/24p%20NORHAMREEZA%20ABDUL%20HAMID.pdf
http://eprints.uthm.edu.my/2344/2/NORHAMREEZA%20ABDUL%20HAMID%20COPYRIGHT%20DECLARATION.pdf
http://eprints.uthm.edu.my/2344/3/NORHAMREEZA%20ABDUL%20HAMID%20WATERMARK.pdf
id my-uthm-ep.2344
record_format uketd_dc
spelling my-uthm-ep.2344 2021-10-31T06:53:49Z The effect of adaptive parameters on the performance of back propagation 2012-04 Abdul Hamid, Norhamreeza Q Science (General) Q300-390 Cybernetics The Back Propagation algorithm, or one of its variants, on multilayered feedforward networks is widely used in many applications. However, this algorithm is well known to suffer from the local minima problem, particularly when caused by neuron saturation in the hidden layer. Most existing approaches modify the learning model by adding a random factor, which counteracts the tendency to sink into local minima. However, random perturbations of the search direction and various kinds of stochastic adjustment to the current set of weights are not effective in enabling a network to escape from local minima, causing the network to fail to converge to a global minimum within a reasonable number of iterations. Thus, this research proposes a new method, Back Propagation Gradient Descent with Adaptive Gain, Adaptive Momentum and Adaptive Learning Rate (BPGD-AGAMAL), which modifies the existing Back Propagation Gradient Descent algorithm by adaptively changing the gain, momentum coefficient and learning rate. In this method, each training pattern has its own activation functions for the neurons in the hidden layer. The activation functions are adjusted through the adaptation of the gain parameter together with adaptive momentum and learning rate values during the learning process. The efficiency of the proposed algorithm is compared with conventional Back Propagation Gradient Descent and Back Propagation Gradient Descent with Adaptive Gain by simulation on six benchmark problems, namely breast cancer, card, glass, iris, soybean, and thyroid. The results show that the proposed algorithm substantially improves the learning process of the conventional Back Propagation algorithm.
2012-04 Thesis http://eprints.uthm.edu.my/2344/ http://eprints.uthm.edu.my/2344/1/24p%20NORHAMREEZA%20ABDUL%20HAMID.pdf text en public http://eprints.uthm.edu.my/2344/2/NORHAMREEZA%20ABDUL%20HAMID%20COPYRIGHT%20DECLARATION.pdf text en staffonly http://eprints.uthm.edu.my/2344/3/NORHAMREEZA%20ABDUL%20HAMID%20WATERMARK.pdf text en validuser mphil masters Universiti Tun Hussein Onn Malaysia Fakulti Sains Komputer dan Teknologi Maklumat
institution Universiti Tun Hussein Onn Malaysia
collection UTHM Institutional Repository
language English
topic Q Science (General)
Q300-390 Cybernetics
spellingShingle Q Science (General)
Q300-390 Cybernetics
Abdul Hamid, Norhamreeza
The effect of adaptive parameters on the performance of back propagation
description The Back Propagation algorithm, or one of its variants, on multilayered feedforward networks is widely used in many applications. However, this algorithm is well known to suffer from the local minima problem, particularly when caused by neuron saturation in the hidden layer. Most existing approaches modify the learning model by adding a random factor, which counteracts the tendency to sink into local minima. However, random perturbations of the search direction and various kinds of stochastic adjustment to the current set of weights are not effective in enabling a network to escape from local minima, causing the network to fail to converge to a global minimum within a reasonable number of iterations. Thus, this research proposes a new method, Back Propagation Gradient Descent with Adaptive Gain, Adaptive Momentum and Adaptive Learning Rate (BPGD-AGAMAL), which modifies the existing Back Propagation Gradient Descent algorithm by adaptively changing the gain, momentum coefficient and learning rate. In this method, each training pattern has its own activation functions for the neurons in the hidden layer. The activation functions are adjusted through the adaptation of the gain parameter together with adaptive momentum and learning rate values during the learning process. The efficiency of the proposed algorithm is compared with conventional Back Propagation Gradient Descent and Back Propagation Gradient Descent with Adaptive Gain by simulation on six benchmark problems, namely breast cancer, card, glass, iris, soybean, and thyroid. The results show that the proposed algorithm substantially improves the learning process of the conventional Back Propagation algorithm.
format Thesis
qualification_name Master of Philosophy (M.Phil.)
qualification_level Master's degree
author Abdul Hamid, Norhamreeza
author_facet Abdul Hamid, Norhamreeza
author_sort Abdul Hamid, Norhamreeza
title The effect of adaptive parameters on the performance of back propagation
title_short The effect of adaptive parameters on the performance of back propagation
title_full The effect of adaptive parameters on the performance of back propagation
title_fullStr The effect of adaptive parameters on the performance of back propagation
title_full_unstemmed The effect of adaptive parameters on the performance of back propagation
title_sort effect of adaptive parameters on the performance of back propagation
granting_institution Universiti Tun Hussein Onn Malaysia
granting_department Fakulti Sains Komputer dan Teknologi Maklumat
publishDate 2012
url http://eprints.uthm.edu.my/2344/1/24p%20NORHAMREEZA%20ABDUL%20HAMID.pdf
http://eprints.uthm.edu.my/2344/2/NORHAMREEZA%20ABDUL%20HAMID%20COPYRIGHT%20DECLARATION.pdf
http://eprints.uthm.edu.my/2344/3/NORHAMREEZA%20ABDUL%20HAMID%20WATERMARK.pdf
_version_ 1747830942676811776