A New Conjugate Gradient Method with Sufficient Descent Property

Author(s):  
O.B. Akinduko

In this paper, by linearly combining the numerator and denominator terms of the Dai-Liao (DL) and Bamigbola-Ali-Nwaeze (BAN) conjugate gradient methods (CGMs), a general form of the DL-BAN method is proposed. From this general form, a new hybrid CGM, which was found to possess the sufficient descent property, is generated. Numerical experiments were carried out on the new CGM in comparison with four existing CGMs, using a set of large-scale unconstrained optimization problems. The results showed superior performance of the new method over the majority of the existing methods.
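
To fix ideas, here is a minimal sketch of a generic nonlinear CG loop with a pluggable coefficient. The paper's DL-BAN combination is not reproduced; as a stand-in, `dai_liao_beta` implements the classical Dai-Liao coefficient (with its free parameter `t`), and the Armijo backtracking step is an assumption, not necessarily the line search used in the paper.

```python
import numpy as np

def dai_liao_beta(g_new, g_old, d_old, s_old, t=0.1):
    # Classical Dai-Liao coefficient; t >= 0 is its free parameter.
    y = g_new - g_old
    denom = d_old @ y
    if abs(denom) < 1e-12:
        return 0.0
    return g_new @ (y - t * s_old) / denom

def cg_minimize(f, grad, x0, beta_fn=dai_liao_beta, tol=1e-6, max_iter=1000):
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                   # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        if g @ d >= 0:                       # safeguard: restart on non-descent
            d = -g
        alpha, f0 = 1.0, f(x)                # simple Armijo backtracking
        while f(x + alpha * d) > f0 + 1e-4 * alpha * (g @ d) and alpha > 1e-16:
            alpha *= 0.5
        s = alpha * d
        x_new = x + s
        g_new = grad(x_new)
        d = -g_new + beta_fn(g_new, g, d, s) * d
        x, g = x_new, g_new
    return x
```

For example, `cg_minimize(lambda x: x @ x, lambda x: 2 * x, np.ones(5))` converges to the origin; a hybrid such as the DL-BAN coefficient would simply replace `beta_fn`.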

2021
Vol 6 (10)
pp. 10742-10764
Author(s):
Ibtisam A. Masmali,
Zabidin Salleh,
Ahmad Alhawarat,
...

The conjugate gradient (CG) method is a method for solving unconstrained optimization problems. Moreover, the CG method can be applied in medical science, industry, neural networks, and many other fields. In this paper, a new three-term CG method is proposed. The new CG formula is constructed from the DL and WYL CG formulas so as to be non-negative, and it inherits the properties of the HS formula. The new modification satisfies the convergence properties and the sufficient descent property. The numerical results show that the new modification is more efficient than the DL, WYL, and CG-Descent formulas. We use more than 200 functions from the CUTEst library to compare these methods in terms of the number of iterations, function evaluations, gradient evaluations, and CPU time.
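
As a rough illustration of the ingredients (not the paper's actual combination or third term), the sketch below shows the standard HS and WYL coefficients and a generic three-term direction; the specific `beta` and `theta` of the new formula differ.

```python
import numpy as np

def beta_hs(g_new, g_old, d_old):
    # Hestenes-Stiefel coefficient
    y = g_new - g_old
    return (g_new @ y) / (d_old @ y)

def beta_wyl(g_new, g_old):
    # Wei-Yao-Liu coefficient (non-negative by construction)
    scale = np.linalg.norm(g_new) / np.linalg.norm(g_old)
    return (g_new @ (g_new - scale * g_old)) / (g_old @ g_old)

def three_term_direction(g_new, g_old, d_old, beta, theta):
    # Generic three-term form d_k = -g_k + beta*d_{k-1} + theta*y_{k-1};
    # the paper's choices of beta and theta are not reproduced here.
    return -g_new + beta * d_old + theta * (g_new - g_old)
```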


2014
Vol 2014
pp. 1-7
Author(s):
Min Sun,
Jing Liu

Recently, Zhang et al. proposed a sufficient descent Polak-Ribière-Polyak (SDPRP) conjugate gradient method for large-scale unconstrained optimization problems and proved its global convergence in the sense that $\liminf_{k\to\infty}\|\nabla f(x_k)\| = 0$ when an Armijo-type line search is used. In this paper, motivated by the line searches proposed by Shi et al. and Zhang et al., we propose two new Armijo-type line searches and show that the SDPRP method has strong convergence in the sense that $\lim_{k\to\infty}\|\nabla f(x_k)\| = 0$ under the two new line searches. Numerical results are reported to show the efficiency of the SDPRP method with the new Armijo-type line searches in practical computation.
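
For orientation, the sketch below shows the sufficient-descent three-term PRP direction of Zhang et al. as it is commonly stated (it satisfies $g_k^\top d_k = -\|g_k\|^2$ by construction) together with a standard Armijo backtracking rule; the paper's two new Armijo-type conditions add extra terms not reproduced here.

```python
import numpy as np

def sdprp_direction(g_new, g_old, d_old):
    y = g_new - g_old
    gg = g_old @ g_old
    beta = (g_new @ y) / gg                 # PRP coefficient
    theta = (g_new @ d_old) / gg
    d = -g_new + beta * d_old - theta * y
    # by construction g_new @ d == -||g_new||^2  (sufficient descent)
    return d

def armijo(f, x, d, g, delta=1e-4, rho=0.5, alpha=1.0):
    # Standard Armijo backtracking; assumes g @ d < 0 (descent direction).
    f0, slope = f(x), g @ d
    while f(x + alpha * d) > f0 + delta * alpha * slope:
        alpha *= rho
    return alpha
```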


2013
Vol 2013
pp. 1-6
Author(s):  
Can Li

We are concerned with optimization problems subject to nonnegativity constraints. It is well known that conjugate gradient methods are efficient for solving large-scale unconstrained optimization problems due to their simplicity and low storage. Combining the modified Polak-Ribière-Polyak method proposed by Zhang, Zhou, and Li with the Zoutendijk feasible direction method, we propose a conjugate gradient type method for solving optimization problems with nonnegativity constraints. If the current iterate is a feasible point, the direction generated by the proposed method is always a feasible descent direction at that iterate. Under appropriate conditions, we show that the proposed method is globally convergent. We also present numerical results to show the efficiency of the proposed method.
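
A hypothetical sketch, not the paper's exact rule: for bound constraints $x \ge 0$, a direction $d$ is feasible at $x$ when $d_i \ge 0$ wherever $x_i = 0$. One simple way to enforce this, in the spirit of Zoutendijk-type feasible directions, is to clip the offending components of a CG direction and cap the step length at the feasibility boundary.

```python
import numpy as np

def make_feasible(d, x, tol=1e-12):
    # Force d_i >= 0 on the active set {i : x_i = 0} so x + alpha*d stays
    # feasible for small alpha > 0.
    d = d.copy()
    active = x <= tol
    d[active] = np.maximum(d[active], 0.0)
    return d

def max_feasible_step(x, d):
    # Largest alpha with x + alpha*d >= 0 componentwise.
    neg = d < 0
    if not np.any(neg):
        return np.inf
    return np.min(-x[neg] / d[neg])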


2021
Vol 2021
pp. 1-9
Author(s):
Ahmad Alhawarat,
Thoi Trung Nguyen,
Ramadan Sabra,
Zabidin Salleh

To find a solution of unconstrained optimization problems, we normally use a conjugate gradient (CG) method since it does not cost memory or storage of second derivative like Newton’s method or Broyden–Fletcher–Goldfarb–Shanno (BFGS) method. Recently, a new modification of Polak and Ribiere method was proposed with new restart condition to give a so-call AZPRP method. In this paper, we propose a new modification of AZPRP CG method to solve large-scale unconstrained optimization problems based on a modification of restart condition. The new parameter satisfies the descent property and the global convergence analysis with the strong Wolfe-Powell line search. The numerical results prove that the new CG method is strongly aggressive compared with CG_Descent method. The comparisons are made under a set of more than 140 standard functions from the CUTEst library. The comparison includes number of iterations and CPU time.
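
Here is a minimal sketch of one CG iteration under the strong Wolfe–Powell conditions, using SciPy's line search; the AZPRP coefficient and its restart condition are the paper's and are represented only by the `beta_fn` placeholder.

```python
import numpy as np
from scipy.optimize import line_search

def cg_step(f, grad, x, g, d, beta_fn):
    # SciPy's line_search enforces the strong Wolfe conditions; c2 = 0.1 is a
    # common choice for CG methods.
    res = line_search(f, grad, x, d, gfk=g, c1=1e-4, c2=0.1)
    alpha = res[0] if res[0] is not None else 1e-4   # crude fallback step
    x_new = x + alpha * d
    g_new = grad(x_new)
    beta = beta_fn(g_new, g, d)        # e.g. the paper's AZPRP formula
    d_new = -g_new + beta * d
    return x_new, g_new, d_new
```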


2014
Vol 2014
pp. 1-6
Author(s):
Mohd Asrul Hery Ibrahim,
Mustafa Mamat,
Wah June Leong

In solving large-scale unconstrained optimization problems, quasi-Newton methods are known to be among the most efficient approaches. Hence, a new hybrid method, known as the BFGS-CG method, has been created based on these properties, combining the search directions of conjugate gradient methods and quasi-Newton methods. In comparison to the standard BFGS method and conjugate gradient methods, the BFGS-CG method shows significant improvement in the total number of iterations and CPU time required to solve large-scale unconstrained optimization problems. We also prove that the hybrid method is globally convergent.
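
A hedged sketch of the two ingredients: the standard BFGS inverse-Hessian update, plus one plausible reading of a BFGS-CG hybrid direction (a quasi-Newton step with a CG-style correction). The paper's exact combination may differ; `hybrid_direction` and its PRP-style term are assumptions for illustration.

```python
import numpy as np

def bfgs_update(H, s, y):
    # Standard BFGS update of the inverse Hessian approximation H_{k+1}.
    rho = 1.0 / (y @ s)
    I = np.eye(len(s))
    V = I - rho * np.outer(s, y)
    return V @ H @ V.T + rho * np.outer(s, s)

def hybrid_direction(H, g_new, g_old, d_old):
    # Assumption: quasi-Newton step plus a CG correction with a PRP-style beta.
    beta = (g_new @ (g_new - g_old)) / (g_old @ g_old)
    return -H @ g_new + beta * d_old
```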


Lately, many large-scale unconstrained optimization problems have been tackled with nonlinear conjugate gradient (CG) methods. Many areas, such as engineering and computer science, have benefited from their simplicity, speed, and low memory requirements. Many modified coefficients have appeared recently, all aiming to improve these methods. This paper considers an extension of the Polak–Ribière–Polyak conjugate gradient method using an exact line search and shows that it retains properties such as sufficient descent and global convergence. A set of 113 test problems is used to evaluate the performance of the proposed method in comparison with other existing methods using the same line search.
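
The following sketch shows a PRP-type CG loop where the exact line search is approximated by one-dimensional minimization along the search direction; the paper's extended PRP coefficient is not reproduced, and the PRP+ non-negativity safeguard and step bounds here are assumptions.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def prp_exact(f, grad, x0, tol=1e-6, max_iter=500):
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # "Exact" line search: minimize f along d (bounds are an assumption).
        alpha = minimize_scalar(lambda a: f(x + a * d),
                                bounds=(0.0, 10.0), method='bounded').x
        x_new = x + alpha * d
        g_new = grad(x_new)
        beta = max(0.0, (g_new @ (g_new - g)) / (g @ g))   # PRP+ safeguard
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x
```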


Author(s):  
Hawraz N. Jabbar,
Basim A. Hassan

Conjugate gradient methods are exceedingly valuable for solving large-scale unconstrained optimization problems, since they do not require the storage of matrices. The conjugate parameter is usually the focus of conjugate gradient methods. The current paper proposes new conjugate-gradient-type parameters for solving large-scale unconstrained optimization problems. A Hessian approximation in diagonal matrix form, based on second- and third-order Taylor series expansions, was employed in this study. The sufficient descent property of the proposed algorithm is proved, and the new method converges globally. The new algorithm is found to be competitive with the Fletcher–Reeves (FR) algorithm in a number of numerical experiments.
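
A hypothetical sketch of the diagonal-matrix idea: one common way to build a diagonal Hessian approximation is a componentwise secant estimate. The paper instead derives its diagonal entries from second- and third-order Taylor terms, so the formula below is an assumption for illustration only.

```python
import numpy as np

def diagonal_hessian(s, y, eps=1e-8, lo=1e-4, hi=1e4):
    # Componentwise secant estimate B_ii ~ y_i / s_i, clipped to keep the
    # approximation positive and bounded.
    B = y / np.where(np.abs(s) > eps, s, eps)
    return np.clip(B, lo, hi)

def scaled_direction(g, B):
    # Quasi-Newton-like step with a diagonal inverse: d = -B^{-1} g.
    return -g / B
```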


2018
Vol 2018
pp. 1-13
Author(s):
Bakhtawar Baluch,
Zabidin Salleh,
Ahmad Alhawarat

This paper describes a modified three-term Hestenes–Stiefel (HS) method. The original HS method is the earliest conjugate gradient method. Although the HS method achieves global convergence using an exact line search, this is not guaranteed in the case of an inexact line search. In addition, the HS method does not usually satisfy the descent property. Our modified three-term conjugate gradient method possesses a sufficient descent property regardless of the type of line search and guarantees global convergence using the inexact Wolfe–Powell line search. The numerical efficiency of the modified three-term HS method is checked using 75 standard test functions. It is known that three-term conjugate gradient methods are numerically more efficient than two-term conjugate gradient methods. Importantly, this paper quantifies how much better the three-term performance is compared with two-term methods. Thus, in the numerical results, we compare our new modification with an efficient two-term conjugate gradient method. We also compare our modification with a state-of-the-art three-term HS method. Finally, we conclude that our proposed modification is globally convergent and numerically efficient.
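
For reference, the sketch below shows a classical three-term HS direction that satisfies sufficient descent independently of the line search, since the second and third terms cancel in $g_k^\top d_k$; the paper's modification uses different coefficients but follows the same three-term pattern.

```python
import numpy as np

def three_term_hs(g_new, g_old, d_old):
    y = g_new - g_old
    dy = d_old @ y
    beta = (g_new @ y) / dy                 # Hestenes-Stiefel coefficient
    theta = (g_new @ d_old) / dy
    d = -g_new + beta * d_old - theta * y
    # identity: g_new @ d == -||g_new||^2, regardless of the step length used
    return d
```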

