Foundations and Trends in Communications and Information Theory

Concentration of Measure Inequalities in Information Theory, Communications, and Coding


Abstract

During the last two decades, concentration inequalities have been the subject of exciting developments in various areas, including convex geometry, functional analysis, statistical physics, high-dimensional statistics, pure and applied probability theory (e.g., concentration of measure phenomena in random graphs, random matrices, and percolation), information theory, theoretical computer science, and learning theory. This monograph focuses on some of the key modern mathematical tools that are used for the derivation of concentration inequalities, on their links to information theory, and on their various applications to communications and coding. In addition to being a survey, this monograph also includes various new recent results derived by the authors. The first part of the monograph introduces classical concentration inequalities for martingales, as well as some recent refinements and extensions. The power and versatility of the martingale approach is exemplified in the context of codes defined on graphs and iterative decoding algorithms, as well as codes for wireless communication. The second part of the monograph introduces the entropy method, an information-theoretic technique for deriving concentration inequalities. The basic ingredients of the entropy method are discussed first in the context of logarithmic Sobolev inequalities, which underlie the so-called functional approach to concentration of measure, and then from a complementary information-theoretic viewpoint based on transportation-cost inequalities and probability in metric spaces. Some representative results on concentration for dependent random variables are briefly summarized, with emphasis on their connections to the entropy method. Finally, we discuss several applications of the entropy method to problems in communications and coding, including strong converses, empirical distributions of good channel codes, and an information-theoretic converse for concentration of measure.
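The martingale-based bounds surveyed in the first part of the monograph generalize classical results such as the Hoeffding and Azuma–Hoeffding inequalities. As a minimal illustration of the underlying concentration phenomenon (a sketch, not taken from the monograph; function names are illustrative), the following Python snippet compares the empirical deviation probability of a Bernoulli sample mean against the two-sided Hoeffding bound 2·exp(−2nt²):

```python
import math
import random

def hoeffding_bound(n, t):
    """Two-sided Hoeffding bound on P(|mean - E[mean]| >= t)
    for the mean of n i.i.d. [0, 1]-valued random variables."""
    return 2.0 * math.exp(-2.0 * n * t * t)

def empirical_tail(n, t, p=0.5, trials=20000, seed=0):
    """Monte Carlo estimate of P(|sample mean - p| >= t)
    for n i.i.d. Bernoulli(p) variables."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        mean = sum(rng.random() < p for _ in range(n)) / n
        if abs(mean - p) >= t:
            hits += 1
    return hits / trials

if __name__ == "__main__":
    n, t = 200, 0.1
    # The empirical tail probability should sit below the Hoeffding bound.
    print("empirical:", empirical_tail(n, t))
    print("Hoeffding bound:", hoeffding_bound(n, t))
```

For n = 200 and t = 0.1 the bound evaluates to 2·exp(−4) ≈ 0.037, while the simulated tail probability is noticeably smaller, reflecting the looseness of such generic bounds that sharper tools (e.g., the entropy method) aim to reduce.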

Bibliographic Information

  • Source
  • Authors

    Maxim Raginsky; Igal Sason;

  • Author Affiliations

    Department of Electrical and Computer Engineering, Coordinated Science Laboratory, University of Illinois at Urbana-Champaign, Urbana, IL 61801, USA;

    Department of Electrical Engineering, Technion - Israel Institute of Technology, Haifa 32000, Israel;

  • Indexing Information
  • Format: PDF
  • Language: English
  • Classification
  • Keywords
