Annual Conference on Learning Theory (COLT 2006); June 22-25, 2006; Pittsburgh, PA (US)

Can Entropic Regularization Be Replaced by Squared Euclidean Distance Plus Additional Linear Constraints

Abstract

There are two main families of on-line learning algorithms, depending on whether a relative entropy or a squared Euclidean distance is used as the regularizer. The difference in performance between the two families can be dramatic. The question is whether one can always achieve comparable performance by replacing the relative-entropy regularization with a squared Euclidean distance plus additional linear constraints. We formulate a simple open problem along these lines for the case of learning disjunctions.
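As background for the contrast drawn in the abstract (not part of the paper itself), the following is a minimal sketch of the two update families on the disjunction-learning task: Winnow, a multiplicative-update algorithm of the kind motivated by entropic regularization, and the Perceptron, an additive-update algorithm motivated by the squared Euclidean distance. The data generator, learning rate, and problem sizes below are arbitrary illustrative choices.

```python
import numpy as np

def generate_disjunction_data(n_examples, n_features, k_relevant, rng):
    """Random boolean examples labeled by a k-literal monotone disjunction."""
    X = rng.integers(0, 2, size=(n_examples, n_features))
    relevant = rng.choice(n_features, size=k_relevant, replace=False)
    y = (X[:, relevant].sum(axis=1) > 0).astype(int)  # OR of the relevant bits
    return X, y

def winnow_mistakes(X, y, eta=2.0):
    """Multiplicative-update family (relative-entropy regularizer): Winnow."""
    n = X.shape[1]
    w = np.ones(n)
    threshold = n  # standard Winnow threshold
    mistakes = 0
    for x, label in zip(X, y):
        pred = int(w @ x >= threshold)
        if pred != label:
            mistakes += 1
            # promote active weights on a false negative, demote on a false positive
            w *= eta ** ((label - pred) * x)
    return mistakes

def perceptron_mistakes(X, y):
    """Additive-update family (squared-Euclidean regularizer): Perceptron."""
    n = X.shape[1]
    w = np.zeros(n)
    b = 0.0
    mistakes = 0
    for x, label in zip(X, y):
        pred = int(w @ x + b >= 0)
        if pred != label:
            mistakes += 1
            # additive correction in the direction of the misclassified example
            w += (label - pred) * x
            b += (label - pred)
    return mistakes

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X, y = generate_disjunction_data(n_examples=2000, n_features=500,
                                     k_relevant=3, rng=rng)
    print("Winnow mistakes:    ", winnow_mistakes(X, y))
    print("Perceptron mistakes:", perceptron_mistakes(X, y))
```

Winnow's mistake bound for a k-literal disjunction over n attributes grows only logarithmically in n, whereas additive-update algorithms can incur a number of mistakes linear in n; this is the kind of dramatic difference between the families that the abstract alludes to.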
