Published in: Foundations of Software Science and Computational Structures

On the Foundations of Quantitative Information Flow



Abstract

There is growing interest in quantitative theories of information flow in a variety of contexts, such as secure information flow, anonymity protocols, and side-channel analysis. Such theories offer an attractive way to relax the standard noninterference properties, letting us tolerate "small" leaks that are necessary in practice. The emerging consensus is that quantitative information flow should be founded on the concepts of Shannon entropy and mutual information. But a useful theory of quantitative information flow must provide appropriate security guarantees: if the theory says that an attack leaks x bits of secret information, then x should be useful in calculating bounds on the resulting threat. In this paper, we focus on the threat that an attack will allow the secret to be guessed correctly in one try. With respect to this threat model, we argue that the consensus definitions actually fail to give good security guarantees: the problem is that a random variable can have arbitrarily large Shannon entropy even if it is highly vulnerable to being guessed. We then explore an alternative foundation based on a concept of vulnerability (closely related to Bayes risk), which measures uncertainty using Rényi's min-entropy rather than Shannon entropy.

