Journal of Language Modelling
How to keep the HG weights non-negative: the truncated Perceptron reweighing rule

Abstract

The literature on error-driven learning in Harmonic Grammar (HG) has adopted the Perceptron reweighing rule. Yet this rule is not suited to HG, as it fails to ensure that the weights remain non-negative. A variant is therefore considered which truncates the updates at zero, keeping the weights non-negative. Convergence guarantees and error bounds for the original Perceptron are shown to extend to this truncated variant.
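The truncation described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's exact formulation: the update direction (loser's violations minus winner's violations), the learning-rate parameter, and the function name are assumptions introduced here, following the standard error-driven Perceptron setup for HG in which harmony is the negative weighted sum of constraint violations.

```python
import numpy as np

def truncated_perceptron_update(weights, winner_viols, loser_viols, rate=1.0):
    """One error-driven update, truncated at zero (illustrative sketch).

    weights      -- current non-negative constraint weights
    winner_viols -- violation counts of the intended winner
    loser_viols  -- violation counts of the erroneously chosen loser
    """
    # Standard Perceptron step: promote constraints violated more by the
    # loser, demote constraints violated more by the winner.
    update = rate * (loser_viols - winner_viols)
    # Truncating at zero keeps every weight non-negative,
    # as HG requires.
    return np.maximum(0.0, weights + update)
```

For example, starting from weights (1.0, 0.5), a winner violating constraint 2 twice and a loser violating constraint 1 once yields the untruncated weights (2.0, -1.5), which truncation clips to (2.0, 0.0).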
