
Head-up Interaction: Can we break our addiction to the screen and keyboard?



Abstract

Mobile user interfaces are commonly based on techniques developed for desktop computers in the 1970s, often including buttons, sliders, windows and progress bars. These can be hard to use on the move, which limits the way we use our devices and the applications on them. This talk will look at the possibility of moving away from these kinds of interactions to ones better suited to mobile devices and their dynamic contexts of use, where users need to be able to look where they are going, carry shopping bags and hold on to children. Multimodal (gestural, audio and haptic) interactions provide new ways to use our devices that can be eyes- and hands-free, allowing users to interact in a 'head-up' way. These new interactions will enable new services, applications and devices that fit better into our daily lives and allow us to do a whole host of new things.

Brewster will discuss some of the work being done on input using gestures made with the fingers, wrist and head, along with work on output using non-speech audio, 3D sound and tactile displays, in mobile applications such as text entry, camera phone user interfaces and navigation. He will also discuss some of the issues around the social acceptability of these new interfaces.


