Conference: UbiComp 2002: Ubiquitous Computing, 4th International Conference, Sep 29 - Oct 1, 2002, Goeteborg, Sweden

Mobile Reality: A PDA-Based Multimodal Framework Synchronizing a Hybrid Tracking Solution with 3D Graphics and Location-Sensitive Speech Interaction

Abstract

A maintenance engineer who talks to pumps and pipes may not seem like the ideal person to entrust with keeping a factory running smoothly, but we hope that our Mobile Reality framework will make such behavior anything but suspicious in the future! This paper describes how the Mobile Reality framework, running entirely on a Pocket PC, synchronizes a hybrid tracking solution to offer the user a seamless, location-dependent, mobile multimodal interface. The user interface juxtaposes a three-dimensional graphical view with a context-sensitive speech dialog centered on objects located in the immediate vicinity of the mobile user. In addition, collaboration support enables shared VRML browsing with annotation and a full-duplex voice channel.
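
The core idea described above is that each fused tracking update drives two output modalities at once: the 3D viewpoint and the set of speech commands relevant to nearby objects. The Python sketch below is only a rough illustration of that synchronization pattern, not the authors' Pocket PC implementation; the `MobileRealityDispatcher` and `SceneObject` classes and the example pump and pipe objects are all hypothetical.

```python
# Minimal sketch (not the paper's code): one tracking callback keeps a 3D view
# and a speech-dialog context in sync with the user's estimated location.
from dataclasses import dataclass
from math import dist


@dataclass
class SceneObject:
    name: str         # e.g. a pump or pipe in the plant model
    position: tuple   # (x, y, z) in the shared 3D scene
    vocabulary: list  # phrases the speech dialog should accept when nearby


class MobileRealityDispatcher:
    """Pushes each fused tracking update to both output modalities."""

    def __init__(self, objects, proximity_m=2.0):
        self.objects = objects
        self.proximity_m = proximity_m

    def on_tracking_update(self, position, orientation):
        # 1. Move the 3D graphics viewpoint to follow the user.
        self.render_viewpoint(position, orientation)
        # 2. Re-scope the speech dialog to objects in the immediate vicinity.
        nearby = [o for o in self.objects
                  if dist(o.position, position) <= self.proximity_m]
        self.load_speech_context(nearby)

    def render_viewpoint(self, position, orientation):
        print(f"[3D] viewpoint -> pos={position} yaw={orientation:.0f} deg")

    def load_speech_context(self, nearby):
        phrases = [p for o in nearby for p in o.vocabulary]
        print(f"[speech] active phrases: {phrases or ['<global commands only>']}")


# Example: the user walks past a pump; both modalities switch together.
dispatcher = MobileRealityDispatcher([
    SceneObject("pump_7", (1.0, 0.0, 2.0), ["show pump status", "report leak"]),
    SceneObject("pipe_3", (8.0, 0.0, 1.0), ["trace pipe", "show valve"]),
])
dispatcher.on_tracking_update(position=(1.5, 0.0, 2.5), orientation=90.0)
dispatcher.on_tracking_update(position=(6.0, 0.0, 6.0), orientation=45.0)
```

In the actual framework, the analogous role would presumably fall to whatever component feeds the hybrid tracker's position and orientation estimates to the VRML renderer and the speech-dialog manager; the sketch only illustrates that a single update path drives both.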
