Title
Accelerating the Convergence and Improving the Learning Accuracy of the Back-Propagation Method
Abstract
In this paper, the convergence and the learning accuracy of the back-propagation (BP) method in neural networks are investigated by 1) analyzing the causes of slow convergence in the BP method, in particular the sharp deceleration that occurs when learning takes place on the part of the sigmoid activation function where the first derivative is very small, and 2) proposing a modified logistic activation function by defining a convergence factor based on this analysis. Learning on output patterns of both binary and analog form is tested with the proposed method. For binary output patterns, the test results show that convergence is accelerated and learning accuracy is improved, and that the weights and thresholds converge so that the stability of the neural network is enhanced. For analog output patterns, the results show that, despite extensive initial transient phenomena, the learning error decreases according to the convergence factor, and the learning accuracy is consequently improved.
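The paper's exact definition of the convergence factor is given in the body, not in this abstract. As a rough illustration only, the sketch below shows one common way such a factor can enter the learning rule: adding a small offset c to the sigmoid derivative so that the gradient does not vanish in the saturated regions of the logistic function, where BP learning otherwise stalls. The function names and the parameter c are illustrative assumptions, not the authors' notation.

```python
import numpy as np

def logistic(x):
    """Standard logistic (sigmoid) activation."""
    return 1.0 / (1.0 + np.exp(-x))

def logistic_grad(x, c=0.0):
    """Derivative of the logistic function, offset by a convergence
    factor c (illustrative assumption, not the paper's definition).

    With c = 0 this is the usual derivative f(x) * (1 - f(x)), which
    approaches zero in the saturated regions of the sigmoid and slows
    back-propagation there.  A small c > 0 keeps the gradient bounded
    away from zero so weight updates do not stall.
    """
    f = logistic(x)
    return f * (1.0 - f) + c

# Gradient magnitude deep in a saturated region of the sigmoid:
x = 8.0
print(logistic_grad(x))       # ~3.4e-4: near zero, so learning stalls
print(logistic_grad(x, 0.1))  # ~0.1: the update retains a useful size
```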