WSEAS International Conferences

Non-Linear Data for Neural Networks Training and Testing



Abstract

Highly nonlinear data sets are important in the field of artificial neural networks. It is not feasible to design a neural network and immediately classify arbitrary real-world data with it. N-bit parity is one of the oldest data sets used to train and test neural networks; the simplest case, 2-bit parity, is also known as the XOR classification problem. Some researchers argue that the N-bit parity set, though highly nonlinear, is a simple task for neural networks to learn; others have drifted toward tailoring special-purpose neural networks that solve only the N-bit parity problem, without explaining why such networks are needed. Is it fair to judge N-bit parity as simple data because it can be modeled by a deterministic finite accepter? And should patterns that are context-free (requiring a pushdown automaton), or context-sensitive and recursively enumerable (requiring a Turing machine), be harder for neural networks to learn? The aim of this paper is to propose some complex nonlinear data for training and testing neural networks. The key property of these parity data is that the developer can tune the degree of nonlinearity: the user can select the number of categories, a very large number of pattern samples, and many hybrid symbols. Testing various neural networks for generalization, i.e. the ability to classify unseen patterns, thus becomes more effective. Experimental results on the classification of prime numbers showed that neural networks can learn this classification.
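The paper's actual data sets are not reproduced in this abstract. As a minimal sketch of the two kinds of training data it mentions, the following generates N-bit parity patterns (2-bit parity being XOR) and prime-labeled binary encodings of integers; the function names are illustrative, not from the paper:

```python
from itertools import product


def parity_dataset(n):
    """All 2**n binary patterns of length n, each labeled with its parity.

    The label is 1 when the pattern contains an odd number of 1s, else 0.
    With n=2 this is exactly the XOR classification problem.
    """
    return [(bits, sum(bits) % 2) for bits in product((0, 1), repeat=n)]


def is_prime(k):
    """Trial-division primality test, sufficient for small training sets."""
    if k < 2:
        return False
    return all(k % d for d in range(2, int(k ** 0.5) + 1))


def prime_dataset(n):
    """Integers 0 .. 2**n - 1 as n-bit patterns, labeled 1 if prime."""
    return [(tuple(int(b) for b in format(k, f"0{n}b")), int(is_prime(k)))
            for k in range(2 ** n)]


# 2-bit parity reproduces the XOR truth table:
for pattern, label in parity_dataset(2):
    print(pattern, label)
```

Either list of `(pattern, label)` pairs can be fed directly to a standard classifier; note that the number of parity patterns grows as 2**n, which is what makes large N-bit parity a demanding benchmark.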


