New Hybrid Conjugate Gradient Method with Global Convergence Properties under Exact Line Search

2018 ◽  
Vol 7 (3.28) ◽  
pp. 54
Author(s):  
Yasir Salih ◽  
Mustafa Mamat ◽  
Mohd Rivaie ◽  
Abdelrhaman Abashar ◽  
Mohamad Afendee Mohamed

The Conjugate Gradient (CG) method is a very useful technique for solving large-scale nonlinear optimization problems. In this paper, we propose a new formula for βk, which is a hybrid of the PRP and WYL methods. The method possesses the sufficient descent and global convergence properties when used with an exact line search. Numerical results indicate that the new formula is more efficient than other classical CG methods.
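The generic CG iteration behind methods like this one can be sketched as follows. The abstract does not reproduce the proposed hybrid βk, so this sketch uses the classical PRP and WYL coefficients on a convex quadratic, where the exact line-search step has a closed form; it is an illustration of the framework, not the paper's method.

```python
import numpy as np

def prp_beta(g_new, g_old):
    # Polak-Ribiere-Polyak: beta = g_new^T (g_new - g_old) / ||g_old||^2
    return g_new @ (g_new - g_old) / (g_old @ g_old)

def wyl_beta(g_new, g_old):
    # Wei-Yao-Liu: rescales g_old by ||g_new|| / ||g_old|| in the numerator
    t = np.linalg.norm(g_new) / np.linalg.norm(g_old)
    return g_new @ (g_new - t * g_old) / (g_old @ g_old)

def cg_quadratic(A, b, x0, beta_fn, tol=1e-8, max_iter=200):
    # Minimize f(x) = 0.5 x^T A x - b^T x (A symmetric positive definite).
    # For quadratics the exact line search has the closed form
    # alpha = -g^T d / (d^T A d).
    x = x0.astype(float)
    g = A @ x - b          # gradient of f
    d = -g                 # initial direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha = -(g @ d) / (d @ A @ d)   # exact step length
        x = x + alpha * d
        g_new = A @ x - b
        beta = max(beta_fn(g_new, g), 0.0)  # nonnegativity restart safeguard
        d = -g_new + beta * d
        g = g_new
    return x
```

With either coefficient the iteration solves A x = b, since minimizing the quadratic is equivalent to solving the normal equations.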

Author(s):  
Amina Boumediene ◽  
Tahar Bechouat ◽  
Rachid Benzine ◽  
Ghania Hadji

The nonlinear conjugate gradient method (CGM) is a very effective way of solving large-scale optimization problems. Zhang et al. proposed a new CG coefficient, defined by [Formula: see text], and proved the sufficient descent condition and global convergence for nonconvex minimization under the strong Wolfe line search. In this paper, we prove that this CG coefficient possesses the sufficient descent condition and global convergence properties under the exact line search.


2011 ◽  
Vol 18 (9) ◽  
pp. 1249-1253 ◽  
Author(s):  
Mehdi Dehghan ◽  
Masoud Hajarian

The conjugate gradient method is one of the most useful and the earliest-discovered techniques for solving large-scale nonlinear optimization problems. Many variants of this method have been proposed, and some are widely used in practice. In this article, we study the descent Dai–Yuan conjugate gradient method which guarantees the sufficient descent condition for any line search. With exact line search, the introduced conjugate gradient method reduces to the Dai–Yuan conjugate gradient method. Finally, a global convergence result is established when the line search fulfils the Goldstein conditions.
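For reference, the Dai–Yuan coefficient itself has a simple closed form, sketched below from the current gradient, previous gradient, and previous direction. This is the classical formula only; the descent variant studied in the article modifies the search direction, which the abstract does not reproduce.

```python
import numpy as np

def dy_beta(g_new, g_old, d_old):
    # Dai-Yuan coefficient: beta = ||g_new||^2 / (d_old^T (g_new - g_old)).
    # Under a Wolfe line search the denominator is positive whenever
    # d_old is a descent direction, which keeps beta well defined.
    return (g_new @ g_new) / (d_old @ (g_new - g_old))
```

Under an exact line search, g_new is orthogonal to d_old; if additionally d_old = -g_old, the formula reduces to the Fletcher–Reeves ratio ||g_new||² / ||g_old||².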


2017 ◽  
Vol 2017 ◽  
pp. 1-12 ◽  
Author(s):  
Bakhtawar Baluch ◽  
Zabidin Salleh ◽  
Ahmad Alhawarat ◽  
U. A. M. Roslan

A new modified three-term conjugate gradient (CG) method is presented for solving large-scale optimization problems. The idea is related to the famous Polak-Ribière-Polyak (PRP) formula. Although the numerator of the PRP formula plays a vital role in numerical performance and does not suffer from the jamming issue, the PRP method is not globally convergent. The idea behind the new three-term CG method is therefore to keep the PRP numerator and combine it with the denominator of any well-performing CG formula. The new modification possesses the sufficient descent condition independently of any line search. The novelty is that, under the Wolfe-Powell line search, the new modification possesses global convergence properties for both convex and nonconvex functions. Numerical computations with the Wolfe-Powell line search on standard optimization test functions show the efficiency and robustness of the new modification.
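A well-known example of a three-term PRP direction is the Zhang–Zhou–Li construction, sketched below; it is illustrative of the three-term idea, not necessarily the modification proposed in this abstract. The third term is chosen so that gᵀd = -||g||² holds identically, i.e. sufficient descent independent of any line search.

```python
import numpy as np

def three_term_prp_direction(g, g_prev, d_prev):
    # Zhang-Zhou-Li style three-term PRP direction (illustrative):
    #   d = -g + beta * d_prev - theta * y,   y = g - g_prev,
    # with beta = g^T y / ||g_prev||^2 and theta = g^T d_prev / ||g_prev||^2.
    # The beta and theta terms cancel in g^T d, so g^T d = -||g||^2
    # for ANY step length produced by the line search.
    y = g - g_prev
    denom = g_prev @ g_prev
    beta = (g @ y) / denom
    theta = (g @ d_prev) / denom
    return -g + beta * d_prev - theta * y
```

Expanding gᵀd gives -||g||² + β(gᵀd_prev) - θ(gᵀy), and the last two terms cancel by the choice of β and θ.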


2015 ◽  
Vol 2015 ◽  
pp. 1-7 ◽  
Author(s):  
Ahmad Alhawarat ◽  
Mustafa Mamat ◽  
Mohd Rivaie ◽  
Zabidin Salleh

The conjugate gradient (CG) method is an interesting tool for solving optimization problems in many fields, such as design, economics, physics, and engineering. In this paper, we present a new hybrid CG method related to the famous Polak-Ribière-Polyak (PRP) formula. It offers a remedy for the case in which the PRP method is not globally convergent under the strong Wolfe-Powell (SWP) line search. The new formula possesses the sufficient descent condition and the global convergence properties. In addition, we further explain the cases in which the PRP method fails with the SWP line search. Furthermore, we provide numerical computations showing that the new hybrid CG method outperforms other related PRP formulas in both the number of iterations and the CPU time on some standard test functions.


Lately, many large-scale unconstrained optimization problems have been tackled with nonlinear conjugate gradient (CG) methods. Many areas, such as engineering and computer science, have benefited from their simplicity, speed, and low memory requirements. Many modified coefficients have appeared recently, all aiming to improve these methods. This paper considers an extension of the Polak-Ribière-Polyak conjugate gradient method with an exact line search and shows that it satisfies properties such as sufficient descent and global convergence. A set of 113 test problems is used to evaluate the performance of the proposed method, which is compared with other existing methods using the same line search.


2018 ◽  
Vol 7 (2.14) ◽  
pp. 21
Author(s):  
Omar Alshorman ◽  
Mustafa Mamat ◽  
Ahmad Alhawarat ◽  
Mohd Revaie

The conjugate gradient (CG) methods play an important role in solving large-scale unconstrained optimization problems. Several studies have recently been devoted to improving and modifying these methods with respect to efficiency and robustness. In this paper, a new parameter for the CG method is proposed. The new parameter possesses global convergence properties under the strong Wolfe-Powell (SWP) line search. The numerical results show that the proposed formula is more efficient and robust than the Polak-Ribière-Polyak (PRP), Fletcher-Reeves (FR), and Wei-Yao-Liu (WYL) parameters.
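The strong Wolfe-Powell conditions mentioned throughout these abstracts are a pair of inequalities on the step length: sufficient decrease (Armijo) plus a bound on the magnitude of the directional derivative at the new point. A minimal one-dimensional check, with conventional constants c1 and c2, can be sketched as:

```python
def satisfies_strong_wolfe(f, df, x, d, alpha, c1=1e-4, c2=0.1):
    # Strong Wolfe-Powell conditions for a step alpha along direction d:
    #  (i)  sufficient decrease: f(x + a*d) <= f(x) + c1*a*(f'(x)*d)
    #  (ii) curvature:           |f'(x + a*d)*d| <= c2*|f'(x)*d|
    # f and df are a scalar function and its derivative; x, d, alpha floats.
    g0 = df(x) * d                      # initial directional derivative
    x_new = x + alpha * d
    armijo = f(x_new) <= f(x) + c1 * alpha * g0
    curvature = abs(df(x_new) * d) <= c2 * abs(g0)
    return armijo and curvature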


2021 ◽  
Vol 5 (1) ◽  
pp. 47
Author(s):  
Sindy Devila ◽  
Maulana Malik ◽  
Wed Giyarti

In this paper, we propose a new hybrid coefficient for the conjugate gradient (CG) method for solving unconstrained optimization models. The new coefficient combines parts of the MMSIS (Malik et al., 2020) and PRP (Polak, Ribière & Polyak, 1969) coefficients. Under the exact line search, the search direction of the new method satisfies the sufficient descent condition, and under certain assumptions we establish the global convergence properties. Numerical results on a set of test functions show that the proposed method is more efficient than the MMSIS method. Moreover, the new method can be used to minimize the risk in portfolio selection.


Author(s):  
Amira Hamdi ◽  
Badreddine Sellami ◽  
Mohammed Belloufi

In this paper, a new hybrid conjugate gradient algorithm is proposed for solving unconstrained optimization problems; the conjugate gradient parameter [Formula: see text] is computed as a convex combination of [Formula: see text] and [Formula: see text]. Under the Wolfe line search, we prove the sufficient descent and the global convergence. Numerical results are reported to show the effectiveness of our procedure.
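The convex-combination construction itself is simple to sketch. The two coefficients combined in this paper are not shown in the abstract, so the sketch below uses the classical FR and PRP coefficients purely as placeholders; only the combination pattern β = (1-θ)βA + θβB with θ ∈ [0, 1] is the point.

```python
import numpy as np

def fr_beta(g_new, g_old):
    # Fletcher-Reeves (placeholder choice, not the paper's coefficient)
    return (g_new @ g_new) / (g_old @ g_old)

def prp_beta(g_new, g_old):
    # Polak-Ribiere-Polyak (placeholder choice, not the paper's coefficient)
    return g_new @ (g_new - g_old) / (g_old @ g_old)

def hybrid_beta(g_new, g_old, theta):
    # Convex combination: beta = (1-theta)*beta_FR + theta*beta_PRP,
    # with the mixing parameter theta restricted to [0, 1].
    if not 0.0 <= theta <= 1.0:
        raise ValueError("theta must lie in [0, 1]")
    return (1.0 - theta) * fr_beta(g_new, g_old) + theta * prp_beta(g_new, g_old)
```

At θ = 0 the hybrid reduces to the first coefficient and at θ = 1 to the second; papers of this kind typically derive θ so that the resulting direction inherits the descent and convergence properties of both endpoints.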


2018 ◽  
Vol 7 (3.28) ◽  
pp. 92
Author(s):  
Talat Alkouli ◽  
Mustafa Mamat ◽  
Mohd Rivaie ◽  
Puspa Liza Ghazali

In this paper, an efficient modification of the nonlinear conjugate gradient method, and an associated implementation based on an exact line search, are proposed and analyzed for solving large-scale unconstrained optimization problems. The method satisfies the sufficient descent property, and a global convergence result is proved. Computational results for a set of unconstrained optimization test problems, some from the CUTE library, show that the new conjugate gradient algorithm converges more stably and outperforms other similar methods in many situations.


2018 ◽  
Vol 13 (03) ◽  
pp. 2050059
Author(s):  
Amina Boumediene ◽  
Rachid Benzine ◽  
Mohammed Belloufi

Nonlinear conjugate gradient (CG) methods are widely used for solving large-scale unconstrained optimization problems. Many studies have been devoted to developing and improving these methods. In this paper, we aim to study the global convergence of the BBB conjugate gradient method with an exact line search.

