Feedforward neural networks with a single hidden layer of normalized Gaussian units have been proven to be universal approximators. Back-propagation neural networks with Gaussian-function synapses converge faster than those with linear multiplying synapses. This paper presents a compact analog Gaussian synapse whose standard deviation and magnitude can be programmed externally.
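The abstract does not give the synapse equation; a minimal sketch of the behavior it describes, assuming the standard Gaussian form with an externally programmable magnitude `A` and standard deviation `sigma` (the parameter names and the `mean` argument are illustrative, not from the paper):

```python
import math

def gaussian_synapse(x, mean=0.0, sigma=1.0, magnitude=1.0):
    """Gaussian synapse transfer function:
    magnitude * exp(-(x - mean)^2 / (2 * sigma^2)).
    sigma and magnitude model the externally programmable
    standard deviation and peak amplitude of the analog circuit."""
    return magnitude * math.exp(-((x - mean) ** 2) / (2.0 * sigma ** 2))

# Peak response at the center, falling off symmetrically on either side.
print(gaussian_synapse(0.0))              # center of the Gaussian
print(gaussian_synapse(1.0, sigma=2.0))   # wider sigma -> slower falloff
```

In a network, each such synapse would replace the usual linear weight multiplication, and training adjusts `mean`, `sigma`, and `magnitude` per synapse.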