IEEE Conference on Computer Communications > DeepTrack: Grouping RFID Tags Based on Spatio-temporal Proximity in Retail Spaces

DeepTrack: Grouping RFID Tags Based on Spatio-temporal Proximity in Retail Spaces



Abstract

RFID applications for taking inventory and processing transactions in point-of-sale (POS) systems improve operational efficiency but are not designed to provide insights about customers' interactions with products. We bridge this gap by solving the proximity grouping problem to identify groups of RFID tags that stay in close proximity to each other over time. We design DeepTrack, a framework that uses deep learning to automatically track the group of items carried by a customer during her shopping journey. This unearths hidden purchase behaviors, helping retailers make better business decisions, and paves the way for innovative shopping experiences such as seamless checkout (à la Amazon Go). DeepTrack employs a recurrent neural network (RNN) with the attention mechanism to solve the proximity grouping problem in noisy settings without explicitly localizing tags. We tailor DeepTrack's design to track not only mobile groups (products carried by customers) but also to flexibly identify stationary tag groups (products on shelves). The key attribute of DeepTrack is that it only uses readily available tag data from commercial off-the-shelf RFID equipment. Our experiments demonstrate that, with only two hours of training data, DeepTrack achieves a grouping accuracy of 98.18% (99.79%) when tracking eight mobile (stationary) groups.
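The abstract frames proximity grouping as clustering tags that co-move over time from reader data alone. DeepTrack itself uses an attention-based RNN, but a much simpler correlation baseline illustrates the core intuition: tags carried together by the same customer exhibit co-varying signal traces, while a shelf tag's trace stays flat. The sketch below is a hypothetical baseline (not the paper's method) that groups tags whose per-tag RSSI time series are strongly correlated:

```python
# Illustrative baseline, NOT DeepTrack's RNN: group RFID tags whose RSSI
# time series co-vary over time, since tags moving together through a
# retail space tend to show correlated signal fluctuations.
from itertools import combinations
import math

def pearson(a, b):
    """Pearson correlation of two equal-length series (0.0 if flat)."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = math.sqrt(sum((x - ma) ** 2 for x in a))
    sb = math.sqrt(sum((y - mb) ** 2 for y in b))
    return cov / (sa * sb) if sa and sb else 0.0

def group_tags(rssi, threshold=0.9):
    """Union tags whose RSSI traces correlate above threshold.

    rssi: dict mapping tag ID -> list of RSSI readings (dBm).
    Returns a list of sets, one per proximity group.
    """
    parent = {t: t for t in rssi}          # union-find forest
    def find(t):
        while parent[t] != t:
            parent[t] = parent[parent[t]]  # path compression
            t = parent[t]
        return t
    for a, b in combinations(rssi, 2):
        if pearson(rssi[a], rssi[b]) >= threshold:
            parent[find(a)] = find(b)      # merge correlated tags
    groups = {}
    for t in rssi:
        groups.setdefault(find(t), set()).add(t)
    return list(groups.values())

# Two tags carried together share a trend; a stationary shelf tag does not.
rssi = {
    "tagA": [-60, -62, -65, -63, -58, -55],
    "tagB": [-61, -63, -66, -64, -59, -56],
    "tagC": [-70, -70, -70, -70, -70, -70],
}
print(group_tags(rssi))
```

This baseline would be fragile in the noisy multipath settings the paper targets, which is precisely why DeepTrack learns the grouping with an RNN instead of relying on hand-tuned signal statistics.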


