The problem addressed in this thesis is that most large-scale networked virtual environments (VEs) lack an interface for producing dynamic, real-time, interactive simulated human motion. To attain a high level of realism in the virtual world, the user must be able to interact dynamically with the environment. For the lower body, scripted locomotive motion is adequate; the same is not true for upper body motion, because humans by their nature interact with their environment largely through their hands. The approach taken in this thesis is to develop an interactive interface that achieves dynamic, real-time upper body motion without encumbering the user. The interface is based on inexpensive, commercially available six-degree-of-freedom (DOF) magnetic sensor technology and fast kinematic algorithms. The result of this work is a human upper body interface that can be extended for use in any large-scale networked interactive VE, such as NPSNET. Three sensors are strapped onto each arm of the user; they report their positions and orientations to the software, which reproduces the same motion on the computer-generated human icon in real time. Such an interface enables participants in networked VEs to interact with the environment more naturally in real time.
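The sensor-to-icon mapping described above can be illustrated with a minimal sketch. The function name, the bare position tuples, and the choice of the elbow joint are hypothetical and not taken from the thesis; the sketch assumes each magnetic sensor reports an (x, y, z) position (orientation data is ignored here) and shows how one joint angle of the icon's arm could be recovered kinematically from three sensor readings:

```python
import math

def elbow_flexion(shoulder, elbow, wrist):
    """Angle at the elbow (radians) from three sensor positions.

    Hypothetical illustration: each argument is an (x, y, z) tuple
    reported by one 6-DOF magnetic sensor strapped to the arm.
    """
    # Vectors from the elbow toward the shoulder and toward the wrist.
    u = tuple(s - e for s, e in zip(shoulder, elbow))
    v = tuple(w - e for w, e in zip(wrist, elbow))
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(a * a for a in v))
    # Joint angle between the upper arm and the forearm.
    return math.acos(dot / (norm_u * norm_v))

# A right angle at the elbow:
angle = elbow_flexion((0.0, 1.0, 0.0), (0.0, 0.0, 0.0), (1.0, 0.0, 0.0))
```

In a real-time loop, angles like this would be recomputed every frame from fresh sensor readings and applied to the corresponding joints of the human icon, which is what allows the icon to mirror the user's upper body motion.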