International Conference on Computer Engineering, Network and Intelligent Multimedia

Training Strategies for Remo Dance on Long Short-Term Memory Generative Model



Abstract

Long Short-Term Memory (LSTM) is a recurrent neural network that can be trained to remember long sequences of data and act as a generative model. As a generative model, an LSTM can reproduce the trained sequences at arbitrary length. We train an LSTM with sequential motion data of Remo Dance, a traditional dance from East Java. The motion data are acquired from a real dancer with a motion capture system as sequences of bone rotations. Training sequential data on an LSTM is time consuming even with current GPU technology. We found that applying feature scaling, and choosing how data are grouped to be trained together, are useful strategies for achieving optimal training. Our experiments show that the scale factor in feature scaling depends on how many sequences are trained together: single-sequence training needs a value range from -8 to +8, while multiple sequences need a correspondingly lower range. We also found that sequences with small variance train better when combined with large-variance sequences. The trained LSTM is able to reproduce the dance moves with some variation.
