IEEE International Conference on Robotics and Automation

Drifting Gaussian processes with varying neighborhood sizes for online model learning



Abstract

Computationally efficient online learning of non-stationary models remains a difficult challenge. A robust and reliable algorithm could have great impact on problems in learning control. Recent work on combining computationally efficient, locally adaptive learning algorithms with robust learning frameworks such as Gaussian process regression has taken a step towards learning systems that are both robust and real-time capable. However, online learning of model parameters on streaming data that is strongly correlated, such as data arriving along a trajectory, can still create serious issues for many learning systems. Here we investigate the idea of drifting Gaussian processes, which explicitly exploit the fact that data is generated along trajectories. A drifting Gaussian process keeps a history of a constant number of recently observed data points and updates its hyper-parameters at each time step. Instead of optimizing the neighborhood size on which the GP is trained, we propose to run several drifting GPs with different neighborhood sizes in parallel and combine their predictions at query points. We illustrate our approach on synthetically generated data and successfully evaluate it on inverse dynamics learning tasks.
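
The abstract outlines the mechanism: each drifting GP holds a constant-size window of recent observations, re-optimizes its hyper-parameters at every time step, and several such GPs with different neighborhood sizes run in parallel, with their predictions combined at query points. The following is a minimal illustrative sketch of that idea, using scikit-learn's GaussianProcessRegressor as a stand-in; the window sizes, kernel choice, and inverse-variance combination rule are assumptions made for illustration, not details taken from the paper.

```python
# Sketch of the drifting-GP idea, assuming an RBF + white-noise kernel and
# inverse-variance weighting of the parallel GPs (both are assumptions).
from collections import deque

import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel


class DriftingGP:
    """GP trained only on the most recent `window` points of a data stream."""

    def __init__(self, window):
        self.X = deque(maxlen=window)  # constant-size history of inputs
        self.y = deque(maxlen=window)  # constant-size history of targets
        self.gp = GaussianProcessRegressor(
            kernel=RBF(length_scale=1.0) + WhiteKernel(noise_level=0.1)
        )

    def update(self, x, y):
        # Append the newest observation (the oldest drops out automatically)
        # and refit; sklearn re-optimizes kernel hyper-parameters on each fit.
        self.X.append(np.atleast_1d(x))
        self.y.append(y)
        self.gp.fit(np.asarray(self.X), np.asarray(self.y))

    def predict(self, x_query):
        # Predictive mean and standard deviation at a single query point.
        mean, std = self.gp.predict(np.atleast_2d(x_query), return_std=True)
        return mean[0], std[0]


def combined_prediction(gps, x_query):
    # Combine the parallel drifting GPs; inverse-variance weighting is an
    # assumption made here, not necessarily the rule used in the paper.
    means, variances = [], []
    for gp in gps:
        m, s = gp.predict(x_query)
        means.append(m)
        variances.append(s ** 2 + 1e-9)
    weights = 1.0 / np.asarray(variances)
    return float(np.dot(weights, np.asarray(means)) / weights.sum())


# Several drifting GPs with different neighborhood sizes run in parallel on a
# stream of strongly correlated data (here, points along a 1-D trajectory).
gps = [DriftingGP(window=w) for w in (20, 50, 100)]
rng = np.random.default_rng(0)
for t in np.linspace(0.0, 10.0, 200):
    y_t = np.sin(t) + 0.05 * rng.standard_normal()
    for gp in gps:
        gp.update([t], y_t)

print(combined_prediction(gps, [10.0]))
```

Refitting each GP at every step costs cubic time in the window size, which is why the history is kept small and constant; this toy version simply illustrates the sliding-window and parallel-combination structure described above.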