On solving unconstrained optimization problem using three term conjugate gradient method / Nor Amila Sofiya Abdullah and Nurul Nadia Mohd Jalil

Bibliographic Details
Main Authors: Abdullah, Nor Amila Sofiya; Mohd Jalil, Nurul Nadia
Format: Thesis
Language: English
Published: 2019
Subjects:
Online Access: https://ir.uitm.edu.my/id/eprint/41324/1/41324.pdf
Summary: The Conjugate Gradient method is commonly used to solve large-scale unconstrained optimization problems because it does not require the storage of matrices. Specifically, this project investigates three-term conjugate gradient methods. An inexact line search, namely the strong Wolfe line search, together with a modified parameter was used in this project. The methods considered are those of Liu (2018), Norddin et al. (2018), and the modified three-term Hestenes-Stiefel method (2007). These methods were tested on several optimization test functions: the Extended Rosenbrock, Himmelblau, Beale, and White & Holst functions. The results are analysed based on the number of iterations and the CPU time. The expected outcome of this research is to identify the best method under the strong Wolfe line search for solving large-scale unconstrained optimization problems.
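
To make the procedure concrete, the sketch below implements a generic three-term Hestenes-Stiefel-type conjugate gradient iteration with a strong Wolfe line search on the Extended Rosenbrock test function, reporting the iteration count and CPU time as in the study's comparison criteria. The direction-update formula, line-search parameters, starting point, and stopping tolerance are illustrative assumptions, not the exact formulations compared in the thesis.

```python
# Hypothetical sketch of a three-term Hestenes-Stiefel-type conjugate gradient
# method with a strong Wolfe line search; the update formula and parameters
# are illustrative assumptions, not the thesis's exact methods.
import time
import numpy as np
from scipy.optimize import line_search  # satisfies the strong Wolfe conditions


def extended_rosenbrock(x):
    """Extended Rosenbrock test function (dimension must be even)."""
    odd, even = x[::2], x[1::2]
    return np.sum(100.0 * (even - odd**2) ** 2 + (1.0 - odd) ** 2)


def extended_rosenbrock_grad(x):
    """Analytic gradient of the Extended Rosenbrock function."""
    g = np.zeros_like(x)
    odd, even = x[::2], x[1::2]
    g[::2] = -400.0 * odd * (even - odd**2) - 2.0 * (1.0 - odd)
    g[1::2] = 200.0 * (even - odd**2)
    return g


def three_term_cg(f, grad, x0, tol=1e-6, max_iter=10_000):
    """Three-term HS-type CG: d = -g + beta*d_old - theta*y (assumed form)."""
    x = x0.copy()
    g = grad(x)
    d = -g                                  # first direction: steepest descent
    for k in range(max_iter):
        if np.linalg.norm(g) <= tol:
            return x, k
        # Strong Wolfe line search along d (c1, c2 are assumed values).
        alpha, *_ = line_search(f, grad, x, d, gfk=g, c1=1e-4, c2=0.1)
        if alpha is None:                   # line search failed: tiny fallback step
            alpha = 1e-4
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g
        dy = d @ y
        if abs(dy) > 1e-12:
            beta = (g_new @ y) / dy         # Hestenes-Stiefel coefficient
            theta = (g_new @ d) / dy        # third-term coefficient (assumed form)
            d = -g_new + beta * d - theta * y
        else:
            d = -g_new                      # restart with steepest descent
        x, g = x_new, g_new
    return x, max_iter


if __name__ == "__main__":
    x0 = np.full(1000, -1.2)                # standard starting point, n = 1000
    x0[1::2] = 1.0
    start = time.perf_counter()
    x_star, iters = three_term_cg(extended_rosenbrock, extended_rosenbrock_grad, x0)
    cpu = time.perf_counter() - start
    print(f"iterations = {iters}, CPU time = {cpu:.3f} s, "
          f"f(x*) = {extended_rosenbrock(x_star):.3e}")
```

The same driver can be pointed at the other test functions (Himmelblau, Beale, White & Holst) by swapping in their function and gradient definitions, which is how iteration counts and CPU times would be compared across methods.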