IEEE International Conference of Safe Production and Informatization

A Theory on AI Uncertainty Based on Rademacher Complexity and Shannon Entropy



Abstract

In this paper, we present a theoretical discussion of uncertainty in AI deep learning neural networks based on the classical Rademacher complexity and Shannon entropy. First, it is shown that the classical Rademacher complexity and Shannon entropy are closely related quantitatively through their definitions. Secondly, based on Shannon's mathematical theory of communication [3], we derive a criterion to ensure AI correctness and accuracy in classification problems. Last but not least, based on Peter Bartlett's work in [1], we show both a relaxed condition and a stricter condition that guarantee correctness and accuracy in AI classification. By stating the criterion in terms of Shannon entropy, grounded in Shannon's theory, it becomes easier to explore analogous criteria in terms of other complexity measures, such as the Vapnik-Chervonenkis dimension and Gaussian complexity, by exploiting the known relations among these quantities studied in other references such as [2]. A close-to-1/2 criterion on Shannon entropy is derived in this paper for the theoretical investigation of AI accuracy and correctness in classification problems.
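For background on the two quantities named in the abstract (standard definitions only; the paper's own formulation is not reproduced on this page), the empirical Rademacher complexity of a function class $\mathcal{F}$ on a sample $S = (x_1, \ldots, x_n)$ is

$$\hat{\mathfrak{R}}_S(\mathcal{F}) \;=\; \mathbb{E}_{\sigma}\Big[\sup_{f \in \mathcal{F}} \frac{1}{n} \sum_{i=1}^{n} \sigma_i f(x_i)\Big], \qquad \sigma_1, \ldots, \sigma_n \ \text{i.i.d. uniform on } \{-1, +1\},$$

and the Shannon entropy of a discrete distribution $p$ is

$$H(p) \;=\; -\sum_{x} p(x) \log p(x), \qquad \text{with binary case } H(q) = -q \log q - (1-q)\log(1-q).$$

The binary entropy attains its maximum at $q = 1/2$, the parameter of the Rademacher variables above; the "close to 1/2" criterion mentioned in the abstract should be read against this standard fact, with the precise statement left to the paper itself.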
