Recurrent neural networks are increasingly used to model aspects of human sequence processing; however, many of the statistical properties relevant for psychological models have not been well specified. In this work I take a geometric view toward understanding how the probabilities in a training set relate to gang effects in a simple recurrent network (SRN), and how the network learns to build trajectories in hidden-unit phase space in a prediction task. In particular, the amount of overlap shared by mappings influences the network's ability to generalize to novel items of a class, and the transition from bigram to trigram predictions provides qualitative insight into where the network trajectories will end up in hidden-unit space, as well as a heuristic for the relative complexity of short-term sequence dependencies. This work is a first step toward a better understanding of the relationship between the statistical properties of a training set and the dynamics of an SRN in psychological modeling.
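For readers unfamiliar with the setup, the following is a minimal sketch, not the author's implementation, of an Elman-style SRN performing next-symbol prediction while recording hidden-unit activations, so that trajectories through hidden-unit phase space can be inspected. The toy alphabet, layer sizes, learning rate, and the output-layer-only update (a simplification of full backpropagation through time) are illustrative assumptions.

```python
import numpy as np


class SimpleRecurrentNetwork:
    def __init__(self, vocab, hidden=8, seed=0):
        rng = np.random.default_rng(seed)
        self.idx = {s: i for i, s in enumerate(vocab)}
        v, h = len(vocab), hidden
        self.W_xh = rng.normal(0, 0.1, (h, v))   # input -> hidden weights
        self.W_hh = rng.normal(0, 0.1, (h, h))   # context (previous hidden) -> hidden
        self.W_hy = rng.normal(0, 0.1, (v, h))   # hidden -> output weights
        self.h = np.zeros(h)                     # current hidden state

    def one_hot(self, s):
        x = np.zeros(len(self.idx))
        x[self.idx[s]] = 1.0
        return x

    def step(self, symbol):
        """Advance one time step; return the predicted next-symbol distribution."""
        x = self.one_hot(symbol)
        self.h = np.tanh(self.W_xh @ x + self.W_hh @ self.h)
        z = self.W_hy @ self.h
        e = np.exp(z - z.max())
        return e / e.sum()

    def train_sequence(self, seq, lr=0.1):
        """Predict each next symbol; delta-rule update on the output layer only.
        Returns the hidden-unit trajectory as a list of state vectors."""
        self.h = np.zeros_like(self.h)
        trajectory = []
        for cur, nxt in zip(seq[:-1], seq[1:]):
            y = self.step(cur)
            self.W_hy -= lr * np.outer(y - self.one_hot(nxt), self.h)
            trajectory.append(self.h.copy())
        return trajectory


# Example: train on a toy repeating sequence and inspect the trajectory's endpoint.
srn = SimpleRecurrentNetwork(vocab=["b", "a", "d", "i", "g"])
for _ in range(50):
    path = srn.train_sequence(list("badigbadig"))
print("final hidden state:", np.round(path[-1], 3))
```

Plotting successive vectors in `path` (e.g., after projecting to two principal components) is one way to visualize how sequences with shared bigram or trigram statistics are pulled toward nearby regions of hidden-unit space.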