Annual Conference on Learning Theory (COLT 2006), June 22-25, 2006, Pittsburgh, PA (US)

Maximum Entropy Distribution Estimation with Generalized Regularization


Abstract

We present a unified and complete account of maximum entropy distribution estimation subject to constraints represented by convex potential functions or, alternatively, by convex regularization. We provide fully general performance guarantees and an algorithm with a complete convergence proof. As special cases, we can easily derive performance guarantees for many known regularization types, including l_1, l_2, l_2^2, and l_1 + l_2^2 style regularization. Furthermore, our general approach enables us to use information about the structure of the feature space or about sample selection bias to derive entirely new regularization functions with superior guarantees. We propose an algorithm solving a large and general subclass of generalized maxent problems, including all discussed in the paper, and prove its convergence. Our approach generalizes techniques based on information geometry and Bregman divergences as well as those based more directly on compactness.
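To make the setting concrete, here is a minimal sketch (not the paper's algorithm) of regularized maxent via gradient descent on the dual: with l_2^2 regularization of strength beta, one minimizes log Z(lambda) - lambda . emp + (beta/2)||lambda||^2, whose gradient is E_p[f] - emp + beta*lambda. The feature matrix, empirical means, step size, and iteration count below are illustrative assumptions.

```python
import numpy as np

# Illustrative setup (assumed, not from the paper): a finite domain of
# 50 points with 4 real-valued features, and empirical feature means
# computed from a synthetic sample.
rng = np.random.default_rng(0)
F = rng.normal(size=(50, 4))                     # f_j(x) for each point x
emp = F[rng.integers(0, 50, size=200)].mean(0)   # empirical feature means

beta = 0.1            # l_2^2 regularization strength (assumed value)
lam = np.zeros(4)
for _ in range(5000):
    logits = F @ lam
    logits -= logits.max()                       # numerical stability
    p = np.exp(logits)
    p /= p.sum()                                 # Gibbs distribution p_lambda
    grad = F.T @ p - emp + beta * lam            # gradient of the dual
    lam -= 0.05 * grad                           # plain gradient step

logits = F @ lam
p_hat = np.exp(logits - logits.max())
p_hat /= p_hat.sum()
# At the optimum, model feature means match the empirical means up to
# the regularization term beta * lambda.
residual = np.abs(F.T @ p_hat - emp + beta * lam).max()
```

The l_2^2 term makes the dual strongly convex, so this plain gradient iteration converges; the paper's guarantees cover this and the other regularizers (l_1, l_1 + l_2^2) in a unified way.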
