Incremental communication for multilayer neural networks: error analysis

dc.contributor.author: Ghorbani, Ali A.
dc.contributor.author: Bhavsar, Virendrakumar C.
dc.date.accessioned: 2023-03-01T18:28:58Z
dc.date.available: 2023-03-01T18:28:58Z
dc.date.issued: 1996
dc.description.abstract: Artificial neural networks (ANNs) involve a large amount of inter-node communication. To reduce the communication cost, as well as the learning time, in ANNs, we have earlier proposed an incremental inter-node communication method. In the incremental communication method, instead of communicating the full magnitude of a node's output value, only the increment or decrement to its previous value is sent on a communication link. In this paper, the effects of the limited-precision incremental communication method on the convergence behavior and performance of multilayer neural networks are investigated. The nonlinear effects of representing the incremental values with reduced (limited) precision on the commonly used error backpropagation training algorithm are analyzed. It is shown that the nonlinear effect of small perturbations in the inputs/output of a node does not induce instability. The analysis is supported by simulation studies of two problems.
dc.description.copyright: Copyright © Ali A. Ghorbani and Virendrakumar C. Bhavsar, 1996.
dc.identifier.uri: https://unbscholar.lib.unb.ca/handle/1882/14889
dc.rights: http://purl.org/coar/access_right/c_abf2
dc.subject.discipline: Computer Science
dc.title: Incremental communication for multilayer neural networks: error analysis
dc.type: Technical report
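The incremental communication scheme described in the abstract can be sketched as follows. This is an illustrative Python model, not the authors' implementation: it assumes a simple fixed-point quantizer for the limited-precision channel, and a sender that tracks the receiver's accumulated state so that quantization errors do not build up across steps.

```python
import random

def quantize(x, bits, max_val=1.0):
    """Round x to the nearest value representable with `bits`
    fractional bits (a hypothetical limited-precision link)."""
    step = max_val / (1 << bits)
    return round(x / step) * step

def incremental_send(outputs, bits):
    """Transmit only the quantized increment/decrement of each new
    node output; the receiver reconstructs the value by accumulating
    the received increments."""
    last_sent = 0.0   # sender's copy of what the receiver holds
    received = 0.0    # receiver's reconstructed output value
    trace = []
    for y in outputs:
        delta = quantize(y - last_sent, bits)  # small delta -> few bits needed
        last_sent += delta
        received += delta
        trace.append(received)
    return trace

# A slowly varying node output, as assumed between training iterations.
random.seed(0)
outputs = [0.5 + 0.01 * random.uniform(-1, 1) for _ in range(100)]
recon = incremental_send(outputs, bits=8)
max_err = max(abs(a - b) for a, b in zip(outputs, recon))
```

Because the sender quantizes the difference against the receiver's accumulated value rather than the raw output, the reconstruction error stays bounded by half a quantization step instead of growing over time, which mirrors the paper's claim that small limited-precision perturbations need not destabilize training.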

Files

Original bundle: item.pdf (1.5 MB, Adobe Portable Document Format)
