The increasingly popular radial basis function (RBF) networks are smoothed piecewise-constant universal approximators. This (smoothed) piecewise-constant property, however, limits their effectiveness in extrapolation and in learning "trends." This paper extends the RBF network model, in a natural manner, to smoothed piecewise-linear approximators, referred to as extended radial basis function (ERBF) networks. This extension is significant in (at least) the following respects: (1) it can function as a global nonlinear model that smoothly links together the various local linear models; (2) it extends the RBF network's ability to extrapolate and generalize meaningfully; (3) it serves as a unifying model that brings together various approximators, including splines and CMAC neural network models; and (4) it makes possible the application of statistical modeling and experiment-design techniques to the study of general neural network approximation models. Simulation results of learning various response surfaces are included for discussion and comparison.
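To make the piecewise-linear idea concrete, the following is a minimal sketch of one common way to realize it: a normalized Gaussian RBF network whose output blends local linear models (a_i + b_i x) rather than constants, fit here by ordinary least squares. The function names, the choice of normalized Gaussians, the center placement, and the test function are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def erbf_design(x, centers, width):
    """Design matrix for a 1-D ERBF-style model: each Gaussian basis,
    normalized to a smooth partition of unity, weights a local linear
    model a_i + b_i * x (assumed formulation for illustration)."""
    phi = np.exp(-((x[:, None] - centers[None, :]) ** 2) / (2 * width ** 2))
    phi = phi / phi.sum(axis=1, keepdims=True)   # normalized activations
    # Columns: [phi_i, phi_i * x] -> coefficients (a_i, b_i) per basis.
    return np.hstack([phi, phi * x[:, None]])

# Target: a linear trend plus a local bump. A constant-valued RBF net
# flattens out when extrapolating; linear local models keep the trend.
x = np.linspace(-3.0, 3.0, 200)
y = 0.8 * x + np.exp(-x ** 2)

centers = np.linspace(-3.0, 3.0, 7)
A = erbf_design(x, centers, width=1.0)
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

# Extrapolate beyond the training interval: the dominant outermost basis
# contributes its local linear model, so the trend continues.
x_new = np.array([5.0, 6.0])
y_hat = erbf_design(x_new, centers, width=1.0) @ coef
print(y_hat)
```

Because the normalized activations sum to one, any globally linear target is represented exactly, and outside the data range the prediction follows the nearest local linear model instead of decaying toward a constant, which is the extrapolation behavior the abstract highlights.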