A neural network model with polynomial synapses and product contacts is investigated. The model further generalizes the sigma-pi and product-unit models. All coefficients and exponents of the polynomial terms, as well as the degrees of the polynomials (the number of polynomial terms), are learned rather than predetermined. Polynomial synapses combined with product contacts can produce any polynomial term. Because the number of learnable parameters is itself determined during training, the present network resembles growth networks in this respect; however, several mechanisms in the present network yield better generalization than growth networks, which typically generalize poorly. Gradient descent algorithms for training feedforward networks with polynomial synapses and product contacts are developed, and experimental results are presented.
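As an illustration of the idea, the following is a minimal sketch (not the paper's actual algorithm) of gradient descent on a single product-contact neuron whose synapses are one-term polynomials, y_hat = prod_i c_i * x_i^{e_i}, with both the coefficients c_i and the exponents e_i learned. The toy target, initialization, and learning rate are assumptions for the example; inputs are kept strictly positive so that x^e and its gradient (which involves ln x) are well defined.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: target y = x1^2 * x2, with strictly positive inputs
# so that x**e and ln(x) in the exponent gradient are well defined.
X = rng.uniform(0.5, 2.0, size=(200, 2))
y = X[:, 0] ** 2 * X[:, 1]

# One product-contact neuron over one-term polynomial synapses:
#   y_hat = prod_i c_i * x_i**e_i
# Both the coefficients c and the exponents e are learnable.
c = np.ones(2)
e = np.ones(2)
lr = 0.02

for step in range(5000):
    y_hat = (c * X ** e).prod(axis=1)   # product contact over synapse outputs
    err = y_hat - y
    # Gradients of 0.5 * mean(err**2):
    #   d y_hat / d c_i = y_hat / c_i,   d y_hat / d e_i = y_hat * ln(x_i)
    g = (err * y_hat)[:, None]
    c -= lr * (g / c).mean(axis=0)
    e -= lr * (g * np.log(X)).mean(axis=0)

mse = np.mean((y_hat - y) ** 2)
```

In this sketch the exponents drift toward the target monomial's degrees (here roughly e ≈ [2, 1]), while the split of the overall scale between c_1 and c_2 is not identifiable, only their product is.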