On solving unconstrained optimization problem using three term conjugate gradient method / Nor Amila Sofiya Abdullah and Nurul Nadia Mohd Jalil

Bibliographic Details
Main Authors: Abdullah, Nor Amila Sofiya, Mohd Jalil, Nurul Nadia
Format: Thesis
Language: English
Published: 2019
Online Access: https://ir.uitm.edu.my/id/eprint/41324/1/41324.pdf
Description
Abstract: The Conjugate Gradient method is commonly used to solve large-scale unconstrained optimization problems because it does not require the storage of matrices. Specifically, this project investigates three-term conjugate gradient methods. An inexact line search, namely the strong Wolfe line search with a modified parameter, was used in this project. The methods used in this project are Liu (2018), Norddin et al. (2018), and the modified three-term Hestenes-Stiefel method (2007). These methods have been tested on several optimization test functions: the Extended Rosenbrock, Himmelblau, Beale, and White & Holst functions. The results are analysed based on the number of iterations and the CPU time. The expected outcome of this research is to identify the best method under the strong Wolfe line search for solving large-scale unconstrained optimization problems.
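
To illustrate the general shape of such a method (not the thesis's specific formulas), the sketch below implements a common Hestenes-Stiefel-type three-term conjugate gradient iteration with SciPy's strong Wolfe line search. The coefficients beta and theta, the Wolfe constants c1 and c2, the stopping tolerance, and the restart rule are illustrative assumptions, and SciPy's Rosenbrock function stands in for the extended variant used in the thesis.

```python
import numpy as np
from scipy.optimize import line_search, rosen, rosen_der


def three_term_cg(f, grad, x0, tol=1e-6, max_iter=5000):
    """Generic three-term CG sketch with a strong Wolfe line search (illustrative)."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                  # initial direction: steepest descent
    for k in range(max_iter):
        if np.linalg.norm(g) < tol:         # gradient-norm stopping test (assumed)
            break
        # SciPy's line_search enforces the strong Wolfe conditions
        alpha, *_ = line_search(f, grad, x, d, gfk=g, c1=1e-4, c2=0.1)
        if alpha is None:                   # line search failed: restart along -g
            d = -g
            alpha, *_ = line_search(f, grad, x, d, gfk=g, c1=1e-4, c2=0.1)
            if alpha is None:
                break
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g
        denom = d @ y
        if abs(denom) < 1e-12:              # safeguard against a near-zero denominator
            beta, theta = 0.0, 0.0
        else:
            beta = (g_new @ y) / denom      # Hestenes-Stiefel-type beta
            theta = (g_new @ d) / denom     # coefficient of the third term
        d = -g_new + beta * d - theta * y   # three-term direction update
        x, g = x_new, g_new
    return x, k


# Example: minimize the Rosenbrock test function in 10 variables
x_star, iters = three_term_cg(rosen, rosen_der, np.zeros(10))
print(iters, rosen(x_star))
```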