International Journal of Human-Computer Studies
Emotion detection from touch interactions during text entry on smartphones



Abstract

There are different modes of interaction with a software keyboard on a smartphone, such as typing and swyping. Patterns of such touch interactions on a keyboard may reflect a user's emotions. Since users may switch between touch modalities while using a keyboard, automatic emotion detection from touch patterns must consider both modalities in combination. In this paper, we focus on identifying features of touch interactions with a smartphone keyboard that lead to a personalized model for inferring user emotion. Since distinguishing typing from swyping activity is important for recording the correct features, we designed a technique to identify the modality reliably. Ground-truth labels for user emotion are collected directly from the user via periodic self-reports. We jointly model typing and swyping features and correlate them with the user-provided self-reports to build a personalized machine learning model that detects four emotion states (happy, sad, stressed, relaxed). We combine these design choices into an Android application, TouchSense, and evaluate it in a 3-week in-the-wild study involving 22 participants. Our key evaluation results and post-study participant assessment demonstrate that these emotion states can be predicted with an average accuracy (AUCROC) of 73% (std. dev. 6%, maximum 87%) from these two touch interactions alone.
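The pipeline the abstract describes — per-user touch features from typing and swyping sessions, labeled with self-reported emotions, fed to a personalized classifier over four emotion states — can be sketched with a toy nearest-centroid model. All feature names and values below are illustrative assumptions, not the paper's actual features or learning algorithm:

```python
import math

# Hypothetical per-session touch features:
# (typing speed in chars/s, mean touch pressure, mean swipe length in px).
EMOTIONS = ["happy", "sad", "stressed", "relaxed"]

def centroid(rows):
    """Mean feature vector over a list of session feature tuples."""
    n = len(rows)
    return tuple(sum(r[i] for r in rows) / n for i in range(len(rows[0])))

def train(sessions):
    """Build a per-user nearest-centroid model from labeled sessions.

    sessions: list of (features, emotion_label) pairs, where the labels
    stand in for the periodically collected self-reports."""
    by_label = {}
    for feats, label in sessions:
        by_label.setdefault(label, []).append(feats)
    return {label: centroid(rows) for label, rows in by_label.items()}

def predict(model, feats):
    """Assign the emotion whose centroid is closest (Euclidean distance)."""
    return min(model, key=lambda lbl: math.dist(feats, model[lbl]))

# Toy "personalized" training data for a single user.
sessions = [
    ((5.1, 0.42, 120.0), "happy"),
    ((2.0, 0.60, 40.0), "sad"),
    ((6.5, 0.80, 30.0), "stressed"),
    ((3.8, 0.35, 150.0), "relaxed"),
]
model = train(sessions)
print(predict(model, (5.0, 0.40, 118.0)))  # closest to the "happy" centroid
```

Because the model is trained only on that user's own sessions, it is personalized in the sense the abstract uses; the study's reported AUCROC would come from comparing such per-user predictions against held-out self-reports.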
