The investigation of gradient method namely Steepest Descent and extending of Barzilai Borwein for solving unconstrained optimization problem / Nur Intan Syahirah Ismail & Nur Atikah Aziz

Bibliographic Details
Main Authors: Ismail, Nur Intan Syahirah, Aziz, Nur Atikah
Format: Thesis
Language:English
Published: 2019
Subjects:
Online Access:https://ir.uitm.edu.my/id/eprint/38918/1/38918.pdf
Description
Summary:Steepest Descent (SD) is one of the pioneering methods for solving optimization problems because it is globally convergent. Even though it is globally convergent, its convergence rate is slow. Thus, this project focuses on the SD method and its modifications in order to obtain a better convergence rate. This research investigates the behaviour of the gradient methods, namely Steepest Descent (SD), Barzilai-Borwein 1 (BB1), Barzilai-Borwein 2 (BB2) and Jaafar Mohamed (JM). The project analyses the performance of these four methods based on CPU time and number of iterations, and shows their global convergence. Eight test functions with several initial points from the four geometrical quadrants were chosen to test the SD, BB1, BB2 and JM methods. The data were collected as the number of iterations and the CPU time using exact line search. The results show that all of the methods possess global convergence. To analyse the best method geometrically, the performance profile of Dolan and Moré is used, and the best method is determined from this profile. Lastly, the numerical results generated in this research serve as numerical evidence of the behaviour of the methods listed above.
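For illustration only, the sketch below is not the code from the thesis: it shows the Steepest Descent iteration with exact line search and the two standard Barzilai-Borwein step sizes on a convex quadratic test function. The quadratic, the tolerance, the starting point and all function and variable names are illustrative assumptions, and the Jaafar Mohamed (JM) step is omitted because its formula is defined in the thesis itself.

```python
# Minimal sketch (assumed, not the authors' code) of SD, BB1 and BB2 on a
# convex quadratic f(x) = 0.5 x^T A x - b^T x, whose gradient is A x - b and
# whose exact line-search step along -g is alpha = (g^T g) / (g^T A g).
import numpy as np

def gradient_method(A, b, x0, rule="SD", tol=1e-6, max_iter=10_000):
    x = x0.astype(float)
    g = A @ x - b                                  # gradient of the quadratic
    x_prev, g_prev = None, None
    for k in range(max_iter):
        if np.linalg.norm(g) < tol:                # stop when the gradient is small
            return x, k
        if rule == "SD" or x_prev is None:
            alpha = (g @ g) / (g @ (A @ g))        # exact line search step
        else:
            s, y = x - x_prev, g - g_prev          # BB difference vectors
            alpha = (s @ s) / (s @ y) if rule == "BB1" else (s @ y) / (y @ y)
        x_prev, g_prev = x, g
        x = x - alpha * g
        g = A @ x - b
    return x, max_iter

# Illustrative 2-D test problem with a starting point taken from one quadrant.
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
for rule in ("SD", "BB1", "BB2"):
    _, iters = gradient_method(A, b, np.array([10.0, -10.0]), rule=rule)
    print(rule, "iterations:", iters)
```

Counting iterations and timing such runs over several test functions and starting points is the kind of data that a Dolan and Moré performance profile then summarises.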