In this article we present an algorithm for training networks that include Gaussian-type higher-order synapses. The algorithm is an extension of the classical backpropagation algorithm. Higher-order synapses make it possible to carry out tasks with simpler networks than those traditionally employed. The key to this simplicity lies in the structure of the synapse: a Gaussian with three trainable parameters. Because the synapse is a function, its output varies with its input, and because it has more than one trainable parameter it can implement nonlinear processing of its inputs; together these properties endow the networks with a large capacity for learning and generalization. We present two examples that demonstrate these capacities: the first is a target-tracking module for the visual system of a real robot, and the second is an image classification system working on real images.
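To make the idea concrete, the following is a minimal sketch of a Gaussian synapse with three trainable parameters, here taken to be an amplitude, a center, and a width, trained by gradient descent in the style of a backpropagation extension. The parameter names (`A`, `mu`, `sigma`) and the single-input setting are illustrative assumptions, not the paper's actual formulation.

```python
import numpy as np

class GaussianSynapse:
    """Synapse whose transfer is a Gaussian of its input, with three
    trainable parameters (amplitude A, center mu, width sigma).
    Names and parameterization are illustrative assumptions."""

    def __init__(self, A=1.0, mu=0.0, sigma=1.0):
        self.A, self.mu, self.sigma = A, mu, sigma

    def forward(self, x):
        self.x = x
        self.g = np.exp(-((x - self.mu) ** 2) / (2.0 * self.sigma ** 2))
        return self.A * self.g

    def backward(self, grad_out, lr=0.1):
        """Update the three parameters from the upstream error signal,
        as a backpropagation-style rule would; return the gradient
        with respect to the input for layers further upstream."""
        x, A, mu, s, g = self.x, self.A, self.mu, self.sigma, self.g
        dA = g                                   # dy/dA
        dmu = A * g * (x - mu) / s ** 2          # dy/dmu
        dsigma = A * g * (x - mu) ** 2 / s ** 3  # dy/dsigma
        dx = -A * g * (x - mu) / s ** 2          # dy/dx
        self.A -= lr * grad_out * dA
        self.mu -= lr * grad_out * dmu
        self.sigma -= lr * grad_out * dsigma
        return grad_out * dx

# Toy usage: fit the synapse output toward a target for a fixed input.
syn = GaussianSynapse(A=0.5, mu=0.2, sigma=1.0)
target = 0.9
for _ in range(500):
    y = syn.forward(1.0)
    syn.backward(y - target)  # d(0.5*(y - t)^2)/dy = y - t
print(syn.forward(1.0))  # output approaches the target 0.9
```

Because all three parameters receive gradients, the synapse can shift, widen, and rescale its response, which is the source of the nonlinear processing capacity described above.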