Research Article | Open Access

Conjugate Gradient Method: A Developed Version to Resolve Unconstrained Optimization Problems

Ahmad Alhawarat1, Nguyen-Thoi Trung1 and Zabidin Salleh2
  • 1 Ton Duc Thang University, Vietnam
  • 2 Universiti Malaysia Terengganu, Malaysia

Abstract

The Conjugate Gradient (CG) method is one of the most widely used methods for solving unconstrained optimization problems. This paper proposes a new version of the CG method based on the Weak Wolfe-Powell (WWP) line search, under the assumption that the objective function is bounded below and has a Lipschitz continuous gradient. The new CG parameter yields global convergence when the WWP line search is used, and the descent condition holds without requiring any line search. The performance of the proposed method is evaluated on a set of unconstrained optimization problems from the CUTEst library. The numerical results show that the proposed version outperforms CG-DESCENT version 5.3 in terms of CPU time, number of function evaluations, number of gradient evaluations and number of iterations.
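For readers unfamiliar with the notation used in the abstract, the general CG iteration and the WWP line-search conditions take the standard form shown below. This is background only; the specific formula for the proposed CG parameter \(\beta_k\) is given in the full paper and is not reproduced here.

\[
x_{k+1} = x_k + \alpha_k d_k, \qquad d_0 = -g_0, \qquad d_{k+1} = -g_{k+1} + \beta_k d_k,
\]

where \(g_k = \nabla f(x_k)\) and the step size \(\alpha_k\) is chosen to satisfy the WWP conditions

\[
f(x_k + \alpha_k d_k) \le f(x_k) + \delta \alpha_k g_k^{T} d_k, \qquad
g(x_k + \alpha_k d_k)^{T} d_k \ge \sigma g_k^{T} d_k,
\]

with \(0 < \delta < \sigma < 1\).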

Journal of Computer Science
Volume 16 No. 9, 2020, 1220-1228

DOI: https://doi.org/10.3844/jcssp.2020.1220.1228

Submitted On: 12 April 2018 | Published On: 18 September 2020

How to Cite: Alhawarat, A., Trung, N. & Salleh, Z. (2020). Conjugate Gradient Method: A Developed Version to Resolve Unconstrained Optimization Problems. Journal of Computer Science, 16(9), 1220-1228. https://doi.org/10.3844/jcssp.2020.1220.1228



Keywords

  • Unconstrained Optimization
  • Conjugate Gradient
  • Line Search
  • Convergence Analysis