Comprehending a sentence requires the construction of a mental representation of the situation the sentence describes. Many researchers assume that, apart from such a situational representation, there is a level of representation at which the propositional structure of the sentence is encoded. This paper presents a simple sentence comprehension model, consisting of a neural network that transforms sentences into representations of the events they describe. During training, the network develops internal representations of the sentences. An investigation of these representations reveals that they can encode propositional information without implementing propositional structure.
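To make the setup concrete, the sketch below shows a toy recurrent encoder in the spirit described: a network that reads a sentence word by word, builds an internal sentence representation, and maps it to a fixed-size event representation. This is not the paper's model; all layer sizes, weights, and names are illustrative assumptions.

```python
import numpy as np

# Illustrative sketch only: a simple recurrent network mapping a sentence
# (a sequence of word indices) to a vector meant to stand for the event
# the sentence describes. Sizes and initialization are arbitrary.

VOCAB_SIZE = 10      # hypothetical vocabulary size
HIDDEN_SIZE = 8      # size of the internal sentence representation
EVENT_SIZE = 5       # size of the output event representation

rng = np.random.default_rng(0)
W_in = rng.normal(scale=0.1, size=(VOCAB_SIZE, HIDDEN_SIZE))    # word -> hidden
W_rec = rng.normal(scale=0.1, size=(HIDDEN_SIZE, HIDDEN_SIZE))  # hidden -> hidden
W_out = rng.normal(scale=0.1, size=(HIDDEN_SIZE, EVENT_SIZE))   # hidden -> event

def encode(sentence):
    """Read word indices one at a time, updating the internal state."""
    h = np.zeros(HIDDEN_SIZE)
    for w in sentence:
        h = np.tanh(W_in[w] + W_rec @ h)  # internal sentence representation
    return h

def comprehend(sentence):
    """Map the internal representation to an event representation."""
    return np.tanh(encode(sentence) @ W_out)

event = comprehend([3, 1, 4])   # a toy three-word "sentence"
print(event.shape)              # -> (5,)
```

In such a model, the question the paper raises can be studied directly: the hidden state `h` is the internal representation that training shapes, and one can probe whether it carries propositional information without any explicit propositional structure being built in.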