Conference: European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases (ECML PKDD)

Efficient Sequence Regression by Learning Linear Models in All-Subsequence Space



Abstract

We present a new approach for learning a sequence regression function, i.e., a mapping from sequential observations to a numeric score. Our learning algorithm employs coordinate gradient descent with Gauss-Southwell optimization in the feature space of all subsequences. We give a tight upper bound for the coordinate-wise gradients of the squared error loss, which enables efficient Gauss-Southwell selection. The proposed bound is built by separating the positive and the negative gradients of the loss function and exploits the structure of the feature space. Extensive experiments on simulated as well as real-world sequence regression benchmarks show that the bound is effective and that our proposed learning algorithm is efficient and accurate. The resulting linear regression model provides the user with a list of the most predictive features selected during the learning stage, adding to the interpretability of the method. Code and data related to this chapter are available at: https://github.com/svgsponer/SqLoss.
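The core optimization described in the abstract can be illustrated with a minimal sketch: coordinate descent on the squared error loss where, at each step, the Gauss-Southwell rule selects the coordinate with the largest absolute gradient. This sketch operates on an explicit (dense) feature matrix; it does not implement the paper's gradient upper bound or the implicit all-subsequence feature space, which are the paper's actual contributions. All names here (`gauss_southwell_cd`, etc.) are illustrative, not from the authors' code.

```python
import numpy as np

def gauss_southwell_cd(X, y, n_iters=200):
    """Coordinate descent for 0.5 * ||X w - y||^2 with Gauss-Southwell selection.

    At each iteration, the coordinate with the largest absolute gradient is
    updated by exact coordinate-wise minimization.
    """
    n, d = X.shape
    w = np.zeros(d)
    col_sq = (X ** 2).sum(axis=0)        # per-coordinate curvature ||x_j||^2
    for _ in range(n_iters):
        residual = X @ w - y
        grad = X.T @ residual            # coordinate-wise gradients of the loss
        j = int(np.argmax(np.abs(grad))) # Gauss-Southwell rule: steepest coordinate
        if col_sq[j] == 0:
            break
        w[j] -= grad[j] / col_sq[j]      # exact minimization along coordinate j
    return w

# Usage: recover the weights of a small linear model.
X = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
y = X @ np.array([2.0, -1.0])
w = gauss_southwell_cd(X, y)
```

In the paper's setting the feature space (all subsequences) is far too large to materialize as a matrix, which is why a cheap upper bound on each coordinate's gradient is needed to perform this selection efficiently.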
