Incremental communication for multilayer neural networks: error analysis
Artificial neural networks (ANNs) involve a large amount of inter-node communication. To reduce the communication cost, as well as the learning time, in ANNs, we earlier proposed an incremental inter-node communication method. In the incremental communication method, instead of communicating the full magnitude of a node's output value, only the increment or decrement to its previous value is sent on a communication link. In this paper, the effects of the limited-precision incremental communication method on the convergence behavior and performance of multilayer neural networks are investigated. The nonlinear effects of representing the incremental values with reduced (limited) precision on the commonly used error backpropagation training algorithm are analyzed. It is shown that the nonlinear effect of small perturbations in the input(s)/output of a node does not cause instability. The analysis is supported by simulation studies of two problems.
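The scheme described above can be illustrated with a minimal sketch. The class and fixed-point format below are assumptions for illustration, not the paper's implementation: the sender transmits only the quantized increment of its output, and tracks the receiver's reconstructed value so that quantization error does not accumulate across transmissions.

```python
def quantize(x, bits=8, scale=1.0):
    """Round x to the nearest step of an assumed fixed-point format
    with the given word length and dynamic range [-scale, scale)."""
    step = scale / (2 ** (bits - 1))
    return round(x / step) * step

class IncrementalLink:
    """Hypothetical model of one communication link using the
    incremental method with limited-precision increments."""
    def __init__(self, bits=8):
        self.bits = bits
        self.sent_total = 0.0  # sender's copy of the receiver's state
        self.received = 0.0    # receiver's reconstructed output value

    def send(self, value):
        # Transmit only the quantized change from the last sent value.
        inc = quantize(value - self.sent_total, self.bits)
        self.sent_total += inc  # keep sender and receiver in lockstep
        self.received += inc    # receiver accumulates the increment
        return self.received
```

Because the sender quantizes against its copy of the receiver's state rather than against the true previous output, the reconstruction error after each transmission stays bounded by half a quantization step, consistent with the bounded-perturbation argument in the analysis.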