With the recent introduction of realistic human entities into large-scale networked virtual environments, there is a need for dynamic, real-time human motion. Future distributed virtual environments will include upwards of 100,000 participants, able to take part from anywhere in the world and to interact with any entity that exists in the environment. To attain a high level of realism in the virtual world, the user must be able to interact dynamically with his environment. For the lower body, we find that scripted locomotive motion is adequate. The same is not true for upper body motion, however, because humans by their nature interact with their environment largely with their hands. The focus of this research is the development of an interactive interface that achieves dynamic upper body motion not currently possible with scripted systems. We present the basics of ongoing work on representing realistic, real-time upper body motion of a virtual human in a networked virtual environment using magnetic sensors attached to the user.