Uncertainty in Artificial Intelligence (UAI)

Lipschitz Parametrization of Probabilistic Graphical Models

Abstract

We show that the log-likelihood of several probabilistic graphical models is Lipschitz continuous with respect to the L_p-norm of the parameters. We discuss several implications of Lipschitz parametrization. We present an upper bound of the Kullback-Leibler divergence that allows understanding methods that penalize the L_p-norm of differences of parameters as the minimization of that upper bound. The expected log-likelihood is lower bounded by the negative L_p-norm, which allows understanding the generalization ability of probabilistic models. The exponential of the negative L_p-norm is involved in the lower bound of the Bayes error rate, which shows that it is reasonable to use parameters as features in algorithms that rely on metric spaces (e.g. classification, dimensionality reduction, clustering). Our results do not rely on specific algorithms for learning the structure or parameters. We show preliminary results for activity recognition and temporal segmentation.
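The central claim, that the log-likelihood is Lipschitz continuous in the parameters, can be checked numerically on a toy model. The sketch below (illustrative only, not from the paper) uses a single Bernoulli variable in its natural parameterization, a minimal one-node graphical model: the log-likelihood log p(x; η) = xη − log(1 + e^η) has derivative x − σ(η), which is bounded by 1 in absolute value, so the log-likelihood is 1-Lipschitz in η.

```python
import math
import random

def log_likelihood(x, eta):
    """Log-likelihood of a Bernoulli variable in its natural parameter:
    log p(x; eta) = x*eta - log(1 + exp(eta))."""
    return x * eta - math.log1p(math.exp(eta))

def max_lipschitz_ratio(trials=10000, seed=0):
    """Empirically estimate sup |l(x; a) - l(x; b)| / |a - b| over random
    draws of x and parameter pairs. Since the derivative x - sigmoid(eta)
    lies in (-1, 1), the ratio should never exceed the Lipschitz constant 1."""
    rng = random.Random(seed)
    worst = 0.0
    for _ in range(trials):
        x = rng.randint(0, 1)
        a, b = rng.uniform(-10, 10), rng.uniform(-10, 10)
        if a == b:
            continue
        ratio = abs(log_likelihood(x, a) - log_likelihood(x, b)) / abs(a - b)
        worst = max(worst, ratio)
    return worst

print(max_lipschitz_ratio())  # stays below the Lipschitz constant 1
```

The same empirical check applies to richer models: as long as the gradient of the log-likelihood with respect to the parameters is bounded, the observed ratio stays below that bound.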
