This article discusses a number of reasons why the use of non-monotonic functions as activation functions can lead to a marked improvement in the performance of a neural network. Using a wide range of benchmarks, we show that a multilayer feed-forward network using sine activation functions (and an appropriate choice of initial parameters) learns much faster than one incorporating sigmoid functions - as much as 150-500 times faster - when both types are trained with backpropagation. Learning speed also compares favorably with speeds reported using modified versions of the backpropagation algorithm. In addition, computational and generalization capacity increases.
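The comparison the abstract describes can be illustrated with a minimal sketch (not the authors' code): a one-hidden-layer network trained with plain backpropagation, where the hidden units use either sine or sigmoid activations. The toy 1-D regression task, learning rate, epoch count, and initialization scheme below are illustrative assumptions, not the paper's benchmarks.

```python
# Minimal sketch: one-hidden-layer network, plain backpropagation,
# comparing sine vs. sigmoid hidden activations on a toy task.
# All hyperparameters here are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

def act(z, kind):
    return np.sin(z) if kind == "sine" else 1.0 / (1.0 + np.exp(-z))

def act_grad(z, kind):
    if kind == "sine":
        return np.cos(z)
    s = 1.0 / (1.0 + np.exp(-z))
    return s * (1.0 - s)

def train(kind, hidden=10, epochs=2000, lr=0.05):
    # Toy 1-D regression target; the paper uses a range of benchmarks.
    X = np.linspace(-1.0, 1.0, 64).reshape(-1, 1)
    y = (X ** 2).ravel()

    # The abstract notes that an appropriate choice of initial
    # parameters matters for sine units; this Gaussian init is a guess.
    W1 = rng.normal(0.0, 1.0, (1, hidden))
    b1 = np.zeros(hidden)
    W2 = rng.normal(0.0, 0.1, (hidden, 1))
    b2 = np.zeros(1)

    for _ in range(epochs):
        # Forward pass
        z1 = X @ W1 + b1
        h = act(z1, kind)
        pred = (h @ W2 + b2).ravel()
        err = pred - y  # dLoss/dpred for 0.5 * MSE

        # Backpropagation (gradients of the mean loss)
        gW2 = h.T @ err[:, None] / len(X)
        gb2 = err.mean(keepdims=True)
        dh = (err[:, None] @ W2.T) * act_grad(z1, kind)
        gW1 = X.T @ dh / len(X)
        gb1 = dh.mean(axis=0)

        # Plain gradient-descent update
        W2 -= lr * gW2; b2 -= lr * gb2
        W1 -= lr * gW1; b1 -= lr * gb1

    return 0.5 * np.mean(err ** 2)

for kind in ("sine", "sigmoid"):
    print(kind, train(kind))
```

Tracking the loss per epoch for each activation, rather than only the final value, would give the learning-speed comparison the abstract reports.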