For the prospective design of human-machine interfaces in an engineering process, core principles of human performance need to be formulated as a computational theory. This paper presents experimental results demonstrating that even such a core task of visual perception as the detection of symmetry can be decomposed into individual, well-defined information-processing steps. To detect symmetry, the visual system needs to compare lengths and distances. Classical signal detection theory describes this process on an abstract level as the comparison of two stimuli of different intensity. The theory proposed in this paper emphasizes the spatial nature of the task and describes it as a sequence of attention shifts during which inter-object relations are represented as noisy values with a specific variance. By Bayesian inference, the visual system is able to guide attention to locations matching an inter-object relation encoded in a previous step. To test the theory, we performed experiments measuring latencies in symmetry perception in an attention-guided paradigm and compared the theory's predictions with the experimental results. The proposed theory facilitates the modeling of information processing in visual perception in a rule-based framework and is therefore suitable for analyzing graphical displays in a cognitive-engineering approach.
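The attention-guidance mechanism described above can be sketched computationally. The following is a minimal illustration, not the paper's actual model: it assumes the encoded inter-object relation is a distance corrupted by Gaussian noise with a known variance, assumes a flat prior over candidate locations, and uses hypothetical names and parameter values throughout.

```python
import math

def gaussian_likelihood(x, mu, sigma):
    # Density of observing distance x given a remembered distance mu
    # encoded with noise standard deviation sigma (an assumption here).
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def guide_attention(encoded_distance, sigma, candidate_distances):
    # Bayesian inference with a flat prior: the posterior over candidate
    # locations is proportional to the likelihood that each candidate's
    # inter-object distance matches the noisy encoded value.
    likelihoods = [gaussian_likelihood(d, encoded_distance, sigma)
                   for d in candidate_distances]
    total = sum(likelihoods)
    posterior = [lk / total for lk in likelihoods]
    # Attention shifts to the candidate with maximal posterior probability.
    best = max(range(len(posterior)), key=posterior.__getitem__)
    return best, posterior

# Hypothetical example: a distance of 10.0 was encoded in a previous step;
# three candidate locations offer distances of 6.0, 10.5, and 14.0.
best, posterior = guide_attention(encoded_distance=10.0, sigma=1.5,
                                  candidate_distances=[6.0, 10.5, 14.0])
```

In this sketch, attention is drawn to the second candidate (distance 10.5), the closest match to the encoded relation; a larger encoding variance would flatten the posterior and make the selection less reliable, which is one way latency differences could arise.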