In this paper, we describe a random neural network model (RNNM) with positive and negative neurons which can efficiently behave like a two-layer auto-associative memory. We use the Hebb rule to compute the connection weights. To exploit the RNNM, it is necessary to know the rates of the positive and negative flows entering each neuron from outside the network. We present here a new learning algorithm for choosing these parameters which ensures good performance in pattern recognition and pattern reconstruction.
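As a point of reference for the Hebb rule mentioned above, the sketch below shows the classical outer-product Hebbian weight computation and sign-threshold recall for a conventional auto-associative memory. It is not the RNNM of this paper (which works with positive and negative signal flow rates rather than bipolar states); the function names `hebb_weights` and `recall` are illustrative, not from the source.

```python
import numpy as np

def hebb_weights(patterns):
    """Hebbian outer-product weights for an auto-associative memory.

    patterns: array of shape (P, N) with entries in {-1, +1}.
    Returns an N x N weight matrix with zero self-connections.
    """
    _, n = patterns.shape
    # Hebb rule: each stored pattern adds its outer product to the weights.
    w = patterns.T @ patterns / n
    np.fill_diagonal(w, 0.0)
    return w

def recall(w, x, steps=5):
    """Synchronous sign-threshold recall from a (possibly noisy) probe."""
    for _ in range(steps):
        x = np.sign(w @ x)
        x[x == 0] = 1  # break ties toward +1
    return x

# Store two orthogonal bipolar patterns, then recall one from a corrupted probe.
patterns = np.array([[1, -1, 1, -1, 1, -1, 1, -1],
                     [1, 1, -1, -1, 1, 1, -1, -1]], dtype=float)
W = hebb_weights(patterns)
probe = patterns[0].copy()
probe[0] = -probe[0]  # flip one bit
restored = recall(W, probe)
```

With well-separated patterns, the corrupted probe is driven back to the stored pattern in a few synchronous updates, which is the reconstruction behavior the abstract refers to.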