Authors:
D. F. Specht
Keywords:
backpropagation; function approximation; mathematics computing; neural nets; random noise; general regression neural networks
Abstract:
A memory-based network that provides estimates of continuous variables and converges to the underlying (linear or nonlinear) regression surface is described. The general regression neural network (GRNN) is a one-pass learning algorithm with a highly parallel structure. It is shown that, even with sparse data in a multidimensional measurement space, the algorithm provides smooth transitions from one observed value to another. The algorithmic form can be used for any regression problem in which an assumption of linearity is not justified.
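The abstract describes the GRNN as a memory-based, one-pass estimator of a regression surface. A minimal sketch of the standard GRNN prediction rule (a Gaussian-kernel weighted average of stored training targets, i.e. the Nadaraya-Watson form the GRNN reduces to) is given below; the smoothing parameter `sigma` and the toy data are illustrative assumptions, not values from the paper:

```python
import numpy as np

def grnn_predict(x_train, y_train, x_query, sigma=0.1):
    """GRNN point estimate for a single query vector.

    Each stored training sample contributes a Gaussian kernel weight
    based on its squared distance to the query; the estimate is the
    weight-normalized average of the training targets. There is no
    iterative training: the samples themselves are the "network".
    sigma is the smoothing (bandwidth) parameter, assumed here.
    """
    # Squared Euclidean distance from the query to every stored sample.
    d2 = np.sum((x_train - x_query) ** 2, axis=1)
    w = np.exp(-d2 / (2.0 * sigma**2))
    return np.sum(w * y_train) / np.sum(w)

# Illustrative use: sparse samples of y = x^2 on [-1, 1].
x_train = np.linspace(-1.0, 1.0, 11).reshape(-1, 1)
y_train = x_train.ravel() ** 2
print(grnn_predict(x_train, y_train, np.array([0.35])))
```

Because the estimate is a convex combination of observed targets, it transitions smoothly between stored values even when the samples are sparse, which is the behavior the abstract highlights. Larger `sigma` gives a smoother surface; smaller `sigma` follows the data more closely.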