IEEE Transactions on Neural Networks

A general regression neural network

Author:
D. F. Specht

Keywords:
backpropagation; function approximation; mathematics computing; neural nets; random noise; general regression neural networks

Abstract:
A memory-based network that provides estimates of continuous variables and converges to the underlying (linear or nonlinear) regression surface is described. The general regression neural network (GRNN) is a one-pass learning algorithm with a highly parallel structure. It is shown that, even with sparse data in a multidimensional measurement space, the algorithm provides smooth transitions from one observed value to another. The algorithmic form can be used for any regression problem in which an assumption of linearity is not justified.
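The estimator described in the abstract is a kernel-weighted average: each stored training sample contributes to the output in proportion to a Gaussian weight that decays with its distance from the query point, so a single pass over the data (simply storing the samples) suffices for "training." The following is a minimal sketch of that idea, assuming Gaussian kernels and a single smoothing parameter sigma; the function name grnn_predict and the toy data are illustrative, not taken from the paper.

import numpy as np

def grnn_predict(X_train, y_train, X_query, sigma=0.5):
    """Kernel-weighted average estimate of a continuous output (GRNN-style)."""
    X_train = np.asarray(X_train, dtype=float)
    y_train = np.asarray(y_train, dtype=float)
    X_query = np.atleast_2d(np.asarray(X_query, dtype=float))

    preds = []
    for x in X_query:
        # Squared Euclidean distance from the query to every stored sample.
        d2 = np.sum((X_train - x) ** 2, axis=1)
        # Gaussian kernel weights; sigma controls the smoothness of the fit.
        w = np.exp(-d2 / (2.0 * sigma ** 2))
        # Weighted average of the stored targets.
        preds.append(np.dot(w, y_train) / np.sum(w))
    return np.array(preds)

if __name__ == "__main__":
    # "One-pass learning": store noisy samples of a nonlinear function.
    rng = np.random.default_rng(0)
    X = rng.uniform(-3, 3, size=(50, 1))
    y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(50)

    X_new = np.linspace(-3, 3, 5).reshape(-1, 1)
    print(grnn_predict(X, y, X_new, sigma=0.3))

With a small sigma the estimate follows individual samples closely; with a larger sigma it smooths across them, which is what gives the smooth transitions between sparse observations mentioned in the abstract.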
