We investigate the approximation of real-valued functions of d Boolean variables by one-hidden-layer perceptron networks. We show that each function f: {0,1}^d → R can be approximated within an error epsilon by a network having ⌈(2d+1)^2 H / epsilon^2⌉ perceptrons with any sigmoidal activation function, where H > B_f^2 − |f|^2 and B_f is a constant depending on the Fourier transform of f. We derive a rate of approximation for f: {0,1}^d → [0,1] with finite support that is only quadratic in d.
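As a minimal numerical sketch of the stated size bound, the function below evaluates ⌈(2d+1)^2 H / epsilon^2⌉ for given d, epsilon, and H. The value of H itself depends on the Fourier-transform constant B_f of the target function, which is not specified here, so H is taken as an input.

```python
import math

def network_size_bound(d: int, eps: float, H: float) -> int:
    """Number of perceptrons sufficient for approximation error eps,
    per the bound ceil((2d+1)^2 * H / eps^2).

    H is assumed to satisfy H > B_f^2 - |f|^2, where B_f depends on
    the Fourier transform of f (not computed here).
    """
    return math.ceil((2 * d + 1) ** 2 * H / eps ** 2)

# Example: d = 10 Boolean variables, error 0.1, H = 1
# gives a bound of (21^2 * 1) / 0.01 = 44100 perceptrons.
print(network_size_bound(10, 0.1, 1.0))
```

Note that the bound grows only polynomially in d (for fixed H), which is the point of the abstract's final claim.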