Using multistandpoint panoramic browsers as displays, we have developed a control function that synchronizes revolution and rotation of a visual perspective around a designated point of regard in a virtual environment. The phase-locked orbit is uniquely determined by the focus and the start point; the user can parameterize direction, step size, and cycle speed, and invoke an animated or single-stepped gesture. The images can be monoscopic or stereoscopic, and the rendering supports the usual scaling functions (zoom/unzoom). Additionally, via sibling clients that can directionalize realtime audio streams, spatialize HDD-resident audio files, or render rotation via a personal rotary motion platform, spatial sound and proprioceptive sensations can be synchronized with such gestures, providing complementary multimodal displays.
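The phase-locked coupling of revolution and rotation described above can be sketched as a small update function. This is a minimal illustrative sketch, not the paper's implementation: it assumes a planar (2D) orbit, a standpoint represented by a position and heading, and a focus-facing heading as the locked phase; the function name `orbit_step` and its parameters are hypothetical.

```python
import math

def orbit_step(focus, radius, angle, step):
    """Advance a phase-locked orbit by one gesture step.

    The standpoint revolves on a circle of the given radius around
    the focus (point of regard), while its heading rotates in lockstep
    so that it always faces the focus. The signed step encodes
    direction and step size; invoking repeatedly at a fixed rate
    yields an animated gesture, once yields a single-stepped gesture.
    """
    angle = (angle + step) % (2.0 * math.pi)
    # Revolution: new standpoint position on the orbit circle.
    x = focus[0] + radius * math.cos(angle)
    y = focus[1] + radius * math.sin(angle)
    # Rotation, phase-locked to the revolution: heading points
    # back at the focus, so the point of regard stays centered.
    heading = math.atan2(focus[1] - y, focus[0] - x)
    return (x, y), heading
```

The orbit being "uniquely determined by the focus and the start point" corresponds here to deriving `radius` and the initial `angle` from the vector between the starting standpoint and the focus.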