A new family of Dai-Liao conjugate gradient methods with modified secant equation for unconstrained optimization

Author(s):  
Yutao Zheng

In this paper, a new family of Dai-Liao-type conjugate gradient methods is proposed for unconstrained optimization problems. In the new methods, the modified secant equation used in [H. Yabe and M. Takano, Comput. Optim. Appl., 28: 203--225, 2004] is incorporated into Dai and Liao's conjugacy condition. Under certain assumptions, we show that our methods are globally convergent for general functions with the strong Wolfe line search. Numerical results illustrate that the proposed methods can outperform some existing ones.
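For reference, the two ingredients this abstract combines can be written out. Below is a sketch of Dai and Liao's extended conjugacy condition and the modified secant equation of Yabe and Takano, as they appear in the cited literature; the family's specific parameter choices are the paper's own and are not reproduced here.

```latex
% Dai--Liao conjugacy condition and CG parameter (t >= 0),
% with s_k = x_{k+1} - x_k and y_k = g_{k+1} - g_k:
\[
  d_{k+1}^{T} y_k = -t\, g_{k+1}^{T} s_k, \qquad
  \beta_k^{\mathrm{DL}} = \frac{g_{k+1}^{T} y_k - t\, g_{k+1}^{T} s_k}{d_k^{T} y_k}.
\]
% Yabe--Takano modified secant equation: y_k is replaced by z_k,
% which also uses function values (u_k is any vector with s_k^T u_k != 0):
\[
  z_k = y_k + \frac{\rho\, \theta_k}{s_k^{T} u_k}\, u_k, \qquad
  \theta_k = 6\,(f_k - f_{k+1}) + 3\,(g_k + g_{k+1})^{T} s_k.
\]
```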

Complexity, 2020, Vol. 2020, pp. 1-13
Author(s):  
Meixing Liu ◽  
Guodong Ma ◽  
Jianghua Yin

The conjugate gradient method is very effective for solving large-scale unconstrained optimization problems. In this paper, on the basis of the conjugate parameter of the conjugate descent (CD) method and the second inequality of the strong Wolfe line search, two new conjugate parameters are devised. Using the strong Wolfe line search to obtain the step lengths, two modified conjugate gradient methods are proposed for general unconstrained optimization. Under standard assumptions, the two methods are proved to satisfy the sufficient descent condition and to be globally convergent. Finally, preliminary numerical results are reported to show that the proposed methods are promising.
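To make the baseline concrete, here is a minimal Python sketch of the classical CD parameter and the second strong Wolfe inequality that the two new parameters build on; the function names and the parameter sigma are illustrative assumptions, not the paper's notation.

```python
import numpy as np

def beta_cd(g_new, g_old, d_old):
    """Fletcher's conjugate descent (CD) parameter:
    beta_k = ||g_{k+1}||^2 / (-d_k^T g_k)."""
    return (g_new @ g_new) / (-(d_old @ g_old))

def strong_wolfe_curvature(g_trial, g_old, d_old, sigma=0.1):
    """Second strong Wolfe inequality:
    |g(x_k + alpha_k d_k)^T d_k| <= -sigma * g_k^T d_k."""
    return abs(g_trial @ d_old) <= -sigma * (g_old @ d_old)

def next_direction(g_new, g_old, d_old):
    """One CG direction update: d_{k+1} = -g_{k+1} + beta_k d_k."""
    return -g_new + beta_cd(g_new, g_old, d_old) * d_old
```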


Author(s):  
Ladan Arman ◽  
Yuanming Xu ◽  
Long Liping

In this paper, based on the efficient conjugate descent (CD) method, two generalized CD algorithms are proposed for solving unconstrained optimization problems. They are three-term conjugate gradient methods whose search directions, generated from the conjugate gradient parameters, satisfy the sufficient descent condition independently of the line search. Furthermore, under the strong Wolfe line search, the global convergence of the proposed methods is proved. Preliminary numerical results on the CUTEst collection are also presented to show the effectiveness of our methods.
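A standard way to obtain a three-term direction whose sufficient descent holds independently of the line search is the construction sketched below in Python; it illustrates the property the abstract claims, not the authors' exact generalized CD parameters.

```python
import numpy as np

def three_term_direction(g_new, d_old, beta):
    """Three-term CG direction
        d_{k+1} = -g_{k+1} + beta_k d_k - theta_k g_{k+1},
    with theta_k chosen so that g_{k+1}^T d_{k+1} = -||g_{k+1}||^2
    holds regardless of the step length used."""
    gg = g_new @ g_new
    theta = beta * (g_new @ d_old) / gg
    d_new = -g_new + beta * d_old - theta * g_new
    # Sufficient descent holds by construction:
    assert np.isclose(g_new @ d_new, -gg)
    return d_new
```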


2014, Vol. 2014, pp. 1-14
Author(s):  
San-Yang Liu ◽  
Yuan-Yuan Huang

This paper investigates a general form of guaranteed descent conjugate gradient methods which satisfy the descent condition $g_k^T d_k \le -\left(1 - \frac{1}{4\theta_k}\right)\|g_k\|^2$ ($\theta_k > 1/4$) and which are strongly convergent whenever the weak Wolfe line search is fulfilled. Moreover, we present several specific guaranteed descent conjugate gradient methods and give their numerical results for large-scale unconstrained optimization.
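The descent condition above is easy to test numerically; a small Python helper (hypothetical names, just to make the inequality concrete):

```python
import numpy as np

def guaranteed_descent(g, d, theta):
    """Check g_k^T d_k <= -(1 - 1/(4*theta_k)) * ||g_k||^2
    for a candidate direction d, with theta_k > 1/4 required."""
    assert theta > 0.25, "condition requires theta_k > 1/4"
    return g @ d <= -(1.0 - 1.0 / (4.0 * theta)) * (g @ g)
```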


2014, Vol. 11 (04), pp. 1350092
Author(s):  
Saman Babaie-Kafaki

In an attempt to find a reasonable solution for an open problem propounded by Andrei in nonlinear conjugate gradient methods, an adaptive conjugacy condition is proposed. The suggested condition is designed based on an implicit switch from a conjugacy condition to the standard secant equation, using an extended conjugacy condition proposed by Dai and Liao. Following the approach of Dai and Liao, two adaptive nonlinear conjugate gradient methods are proposed based on the suggested adaptive conjugacy condition. An interesting feature of one of the proposed methods is the adaptive switch between the nonlinear conjugate gradient methods proposed by Hestenes and Stiefel, and Perry. Under proper conditions, it is shown that one of the proposed methods is globally convergent for uniformly convex functions and the other is globally convergent for general functions. Numerical results demonstrating the effectiveness of the proposed adaptive approach in the sense of the performance profile introduced by Dolan and Moré are reported.
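The switch the abstract mentions is visible in the Dai-Liao parameter itself, which contains the Hestenes-Stiefel and Perry parameters as endpoints; the Python sketch below shows that interpolation (the paper's adaptive rule for choosing t is data-driven and is not reproduced here):

```python
import numpy as np

def beta_dai_liao(g_new, d_old, s, y, t):
    """Dai-Liao parameter:
        beta_k = (g_{k+1}^T y_k - t * g_{k+1}^T s_k) / (d_k^T y_k),
    where s_k = x_{k+1} - x_k and y_k = g_{k+1} - g_k.
    t = 0 recovers Hestenes-Stiefel; t = 1 recovers Perry."""
    return (g_new @ y - t * (g_new @ s)) / (d_old @ y)
```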

