Efficiency of the subgradient method in solving nonsmooth optimization problems
| Main Author: | |
|---|---|
| Format: | Thesis |
| Language: | English |
| Published: | 2014 |
| Subjects: | |
| Online Access: | http://eprints.utm.my/id/eprint/38863/1/NurAziraAbdullahMFS2014.pdf |
Summary: | Nonsmooth optimization problems are among the hardest to solve in optimization because the objective function is not differentiable everywhere. The subgradient method, first proposed by Shor [17], is the classical method for solving such problems. The main purpose of this study is to analyze the efficiency of the subgradient method by applying it to two types of nonsmooth problems: Shor's piecewise quadratic function and L1-regularized problems. The objectives of this study are to apply the subgradient method to nonsmooth optimization problems, to develop MATLAB code for the method, and to compare its performance across various step sizes and matrix dimensions. The numerical experiments are carried out in MATLAB. The study concludes that the subgradient method can solve nonsmooth optimization problems and that its rate of convergence varies with the choice of step size. |
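To illustrate the method the summary describes, the following is a minimal sketch of the subgradient method in Python (the thesis itself uses MATLAB). The objective function, starting point, and the diminishing step size a/sqrt(k+1) below are illustrative assumptions, not taken from the thesis.

```python
# Minimal sketch of the subgradient method on a toy nonsmooth objective.
# Assumption: f(x) = |x0 - 1| + 2*|x1 + 1|, minimized at x* = (1, -1).
import math

def f(x):
    return abs(x[0] - 1.0) + 2.0 * abs(x[1] + 1.0)

def subgrad(x):
    # One valid subgradient of f at x (sign(0) taken as 0).
    sgn = lambda t: (t > 0) - (t < 0)
    return [sgn(x[0] - 1.0), 2.0 * sgn(x[1] + 1.0)]

def subgradient_method(x0, a=1.0, iters=2000):
    # The subgradient method is not a descent method: f can increase
    # on individual steps, so we track the best iterate seen so far.
    x, best_x, best_f = list(x0), list(x0), f(x0)
    for k in range(iters):
        g = subgrad(x)
        step = a / math.sqrt(k + 1)  # diminishing step size
        x = [xi - step * gi for xi, gi in zip(x, g)]
        fx = f(x)
        if fx < best_f:
            best_f, best_x = fx, list(x)
    return best_x, best_f

best_x, best_f = subgradient_method([5.0, 5.0])
```

Because the step size is diminishing but non-summable, the best objective value converges to the minimum; a fixed step size would instead stall within a neighborhood of the optimum whose radius scales with the step, which matches the summary's observation that the convergence rate depends on the step size used.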