
On the weight sparsity of multilayer perceptrons


Abstract

Approximating and representing a process, a function, or a system with an adaptive parametric model constitutes a major part of current machine learning research. An important characteristic of these models is parameter sparsity, an indicator of how succinctly a model can codify fundamental properties of the approximated function. This paper investigates the sparsity patterns of a multilayer perceptron network trained to mount a man-in-the-middle attack on the DES symmetric cryptosystem. The notions of absolute and effective synaptic weight sparsity are introduced, and their importance to the network learning procedure is explained. Finally, the results from training the actual multilayer perceptron are outlined and discussed. To promote reproducible research, the MATLAB network implementation has been posted on GitHub.
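The abstract does not spell out the two sparsity measures, so the following is only a minimal sketch of one plausible reading: absolute sparsity as the fraction of synaptic weights that are exactly zero, and effective sparsity as the fraction whose magnitude is negligible relative to the largest weight in the same matrix. The function names and the relative threshold are illustrative assumptions, and the snippet is in NumPy rather than the authors' MATLAB implementation.

import numpy as np

def absolute_sparsity(W):
    # Fraction of synaptic weights that are exactly zero (assumed definition).
    W = np.asarray(W)
    return float(np.mean(W == 0.0))

def effective_sparsity(W, rel_tol=1e-3):
    # Fraction of weights whose magnitude is below a small fraction of the
    # largest magnitude in the same weight matrix (illustrative definition).
    W = np.asarray(W)
    threshold = rel_tol * np.max(np.abs(W))
    return float(np.mean(np.abs(W) <= threshold))

# Example: a small hidden-layer weight matrix with some weights pruned to zero.
rng = np.random.default_rng(0)
W = rng.normal(size=(8, 16))
W[np.abs(W) < 0.05] = 0.0
print(absolute_sparsity(W), effective_sparsity(W))

Under this reading, effective sparsity is always at least as large as absolute sparsity, since every exact zero also falls below the relative threshold.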
