The proof of the sufficient descent condition of the Wei–Yao–Liu conjugate gradient method under the strong Wolfe–Powell line search

2007 · Vol. 189 (2) · pp. 1241-1245
Author(s): Hai Huang, Zengxin Wei, Shengwei Yao
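
For orientation, the quantities named in the title above are commonly written as follows; this is a sketch of the standard definitions, not text quoted from the paper itself. The Wei–Yao–Liu (WYL) coefficient and search direction are

\[
\beta_k^{\mathrm{WYL}} = \frac{g_k^{T}\left(g_k - \frac{\|g_k\|}{\|g_{k-1}\|}\, g_{k-1}\right)}{\|g_{k-1}\|^{2}},
\qquad
d_k = -g_k + \beta_k^{\mathrm{WYL}} d_{k-1},
\]

the sufficient descent condition is \(g_k^{T} d_k \le -c\,\|g_k\|^{2}\) for some constant \(c > 0\), and the strong Wolfe–Powell line search requires

\[
f(x_k + \alpha_k d_k) \le f(x_k) + \delta\,\alpha_k\, g_k^{T} d_k,
\qquad
|g(x_k + \alpha_k d_k)^{T} d_k| \le \sigma\, |g_k^{T} d_k|,
\qquad 0 < \delta < \sigma < 1.
\]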


2021 · Vol. 5 (1) · pp. 47
Author(s): Sindy Devila, Maulana Malik, Wed Giyarti

In this paper, we propose a new hybrid coefficient for the conjugate gradient (CG) method for solving unconstrained optimization models. The new coefficient combines parts of the MMSIS (Malik et al., 2020) and PRP (Polak, Ribière & Polyak, 1969) coefficients. Under exact line search, the search direction of the new method satisfies the sufficient descent condition, and under certain assumptions we establish its global convergence properties. Numerical results on a set of test functions show that the proposed method is more efficient than the MMSIS method. In addition, the new method can be used to solve the problem of minimizing portfolio selection risk.
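
To illustrate the kind of iteration described above, here is a minimal Python sketch of a CG loop using the classical PRP coefficient. The function names (cg_prp, prp_beta) are illustrative, the exact line search is approximated by a one-dimensional scalar minimizer, and the PRP+ nonnegativity safeguard is a common practice; the MMSIS part and the paper's specific hybrid rule are not reproduced here.

import numpy as np
from scipy.optimize import minimize_scalar

def prp_beta(g_new, g_old):
    # Polak-Ribiere-Polyak coefficient: g_{k+1}^T (g_{k+1} - g_k) / ||g_k||^2
    return float(g_new @ (g_new - g_old)) / float(g_old @ g_old)

def cg_prp(f, grad, x0, tol=1e-6, max_iter=500):
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                    # initial direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        # "exact" line search along d, approximated here by 1-D minimization
        alpha = minimize_scalar(lambda a: f(x + a * d)).x
        x_next = x + alpha * d
        g_next = grad(x_next)
        beta = max(prp_beta(g_next, g), 0.0)  # PRP+ safeguard (not the paper's hybrid rule)
        d = -g_next + beta * d
        x, g = x_next, g_next
    return x

# Example usage on a simple quadratic f(x) = 0.5 x^T A x - b^T x:
# A = np.diag([1.0, 10.0]); b = np.ones(2)
# x_star = cg_prp(lambda x: 0.5 * x @ A @ x - b @ x, lambda x: A @ x - b, np.zeros(2))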


2019 · Vol. 38 (7) · pp. 227-231
Author(s): Huda Younus Najm, Eman T. Hamed, Huda I. Ahmed

In this study, we propose a new parameter for the conjugate gradient method. It is shown that the new method fulfils the sufficient descent condition under the strong Wolfe conditions when an inexact line search is used. The numerical results also show that the suggested method outperforms other standard conjugate gradient methods.
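
For context, the following is a minimal Python check of the strong Wolfe conditions mentioned above. The function name is illustrative, f and grad are user-supplied callables, d is assumed to be a descent direction at x, and the tolerances delta and sigma are typical textbook choices (0 < delta < sigma < 1), not values taken from the paper.

import numpy as np

def satisfies_strong_wolfe(f, grad, x, d, alpha, delta=1e-4, sigma=0.1):
    """Return True if step length alpha along direction d meets the strong Wolfe conditions."""
    g_dot_d = float(grad(x) @ d)  # directional derivative at x (negative for a descent direction)
    sufficient_decrease = f(x + alpha * d) <= f(x) + delta * alpha * g_dot_d
    curvature = abs(float(grad(x + alpha * d) @ d)) <= sigma * abs(g_dot_d)
    return sufficient_decrease and curvature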


2011 · Vol. 18 (9) · pp. 1249-1253
Author(s): Mehdi Dehghan, Masoud Hajarian

The conjugate gradient method is one of the most useful and the earliest-discovered techniques for solving large-scale nonlinear optimization problems. Many variants of this method have been proposed, and some are widely used in practice. In this article, we study the descent Dai–Yuan conjugate gradient method which guarantees the sufficient descent condition for any line search. With exact line search, the introduced conjugate gradient method reduces to the Dai–Yuan conjugate gradient method. Finally, a global convergence result is established when the line search fulfils the Goldstein conditions.
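
For reference, the Dai–Yuan coefficient and the Goldstein conditions referred to above are usually stated as follows; these are the standard formulas, not text quoted from the article:

\[
\beta_k^{\mathrm{DY}} = \frac{\|g_k\|^{2}}{d_{k-1}^{T}(g_k - g_{k-1})},
\qquad
d_k = -g_k + \beta_k^{\mathrm{DY}} d_{k-1},
\]

and the Goldstein conditions on the step length \(\alpha_k\):

\[
f(x_k) + (1-\delta)\,\alpha_k\, g_k^{T} d_k \;\le\; f(x_k + \alpha_k d_k) \;\le\; f(x_k) + \delta\,\alpha_k\, g_k^{T} d_k,
\qquad 0 < \delta < \tfrac{1}{2}.
\]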


2019 · Vol. 7 (1) · pp. 34-36
Author(s): Alaa L. Ibrahim, Muhammad A. Sadiq, Salah G. Shareef

This paper proposes a new conjugate gradient method for unconstrained optimization based on the Dai-Liao (DL) formula; the descent condition and the sufficient descent condition for our method are provided. The numerical results and comparison show that the proposed algorithm is potentially efficient compared with the (PR) method in terms of the number of iterations (NOI) and the number of function evaluations (NOF).
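
For reference, the Dai-Liao (DL) formula on which such methods are based is commonly written as follows; the paper's modified parameter is not reproduced here:

\[
\beta_k^{\mathrm{DL}} = \frac{g_k^{T}(y_{k-1} - t\, s_{k-1})}{d_{k-1}^{T} y_{k-1}},
\qquad
s_{k-1} = x_k - x_{k-1}, \quad y_{k-1} = g_k - g_{k-1}, \quad t > 0,
\]

with search direction \(d_k = -g_k + \beta_k^{\mathrm{DL}} d_{k-1}\).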


2019 · Vol. 8 (4) · pp. 11464-11467

The spectral conjugate gradient method is often used as an alternative to the conjugate gradient (CG) method for solving nonlinear unconstrained problems. In this paper, we introduce a spectral parameter for the HS conjugate gradient method derived from the classical CG search direction, prove its sufficient descent and global convergence properties, and verify the numerical results on standard test functions with many variables under exact line search procedures.
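
For reference, a spectral conjugate gradient direction built on the Hestenes–Stiefel (HS) coefficient typically has the following form; the specific spectral parameter proposed in the paper is not reproduced here:

\[
d_{k+1} = -\theta_{k+1}\, g_{k+1} + \beta_k^{\mathrm{HS}} d_k,
\qquad
\beta_k^{\mathrm{HS}} = \frac{g_{k+1}^{T} y_k}{d_k^{T} y_k},
\quad y_k = g_{k+1} - g_k,
\]

where \(\theta_{k+1} > 0\) is the spectral parameter.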

