Research Article Open Access

A Modified Conjugate Gradient Formula for Back Propagation Neural Network Algorithm

Abbas Y. Al Bayati, Najmaddin A. Sulaiman and Gulnar W. Sadiq

Abstract

Problem statement: The Conjugate Gradient (CG) algorithm, which is usually used for solving nonlinear functions, is presented and combined with a modified Back Propagation (BP) algorithm, yielding a new fast-training multilayer algorithm. Approach: This study determined new search directions by exploiting the information calculated by gradient descent as well as the previous search direction. The proposed algorithm improved the training efficiency of the BP algorithm by adaptively modifying the initial search direction. Results: The performance of the proposed algorithm was demonstrated by comparing it with the Neural Network (NN) algorithm on the chosen test functions. Conclusion: The numerical results showed that the number of iterations required by the proposed algorithm to converge was less than that of both the standard CG and NN algorithms. The proposed algorithm improved the training efficiency of BP-NN algorithms by adaptively modifying the initial search direction.
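
As a rough illustration of how such a combined scheme operates, the following minimal sketch builds each new search direction from the current back-propagated gradient plus a multiple of the previous direction. The small sigmoid network, the XOR test patterns, the Fletcher-Reeves beta, the fixed step size and the periodic restart are all assumptions made for illustration; they are not the modified CG formula or the experimental setup of the paper.

    import numpy as np

    # Minimal sketch: back propagation driven by a conjugate-gradient-style
    # direction update. The Fletcher-Reeves beta, fixed step size and periodic
    # restart are illustrative assumptions, not the paper's modified formula.

    rng = np.random.default_rng(0)

    # XOR patterns stand in for the paper's test functions.
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    T = np.array([[0], [1], [1], [0]], dtype=float)

    # One hidden layer of 4 sigmoid units; all weights kept in one flat vector.
    shapes = [(2, 4), (1, 4), (4, 1), (1, 1)]      # W1, b1, W2, b2
    w = rng.normal(scale=0.5, size=sum(r * c for r, c in shapes))

    def unpack(w):
        parts, i = [], 0
        for r, c in shapes:
            parts.append(w[i:i + r * c].reshape(r, c))
            i += r * c
        return parts

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    def loss_and_grad(w):
        W1, b1, W2, b2 = unpack(w)
        h = sigmoid(X @ W1 + b1)                   # forward pass
        y = sigmoid(h @ W2 + b2)
        err = y - T
        loss = 0.5 * np.mean(np.sum(err ** 2, axis=1))
        dy = err * y * (1 - y) / X.shape[0]        # back propagation of the error
        dW2, db2 = h.T @ dy, dy.sum(axis=0, keepdims=True)
        dh = (dy @ W2.T) * h * (1 - h)
        dW1, db1 = X.T @ dh, dh.sum(axis=0, keepdims=True)
        return loss, np.concatenate([g.ravel() for g in (dW1, db1, dW2, db2)])

    eta = 0.5                                      # fixed step size (an assumption; a line search is more usual)
    loss, g = loss_and_grad(w)
    d = -g                                         # initial direction: steepest descent
    for k in range(1, 2001):
        w = w + eta * d                            # step along the current search direction
        loss, g_new = loss_and_grad(w)
        beta = (g_new @ g_new) / (g @ g)           # Fletcher-Reeves beta (illustrative choice)
        d = -g_new + beta * d                      # mix gradient with the previous direction
        if k % 10 == 0:
            d = -g_new                             # periodic restart, common in nonlinear CG
        g = g_new
        if k % 500 == 0:
            print(f"iteration {k:4d}  loss = {loss:.5f}")

In the paper's method the coefficient beta would be computed with the proposed modified CG formula and the step length chosen by a line search rather than held fixed, so the sketch only conveys how the search direction reuses the previous direction on top of the back-propagated gradient.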

Journal of Computer Science
Volume 5 No. 11, 2009, 849-856

DOI: https://doi.org/10.3844/jcssp.2009.849.856

Submitted On: 24 August 2009
Published On: 30 December 2009

How to Cite: Al Bayati, A. Y., Sulaiman, N. A. & Sadiq, G. W. (2009). A Modified Conjugate Gradient Formula for Back Propagation Neural Network Algorithm. Journal of Computer Science, 5(11), 849-856. https://doi.org/10.3844/jcssp.2009.849.856



Keywords

  • Back-propagation algorithm
  • Conjugate gradient algorithm
  • Search directions
  • Neural network algorithm