Conference: Artificial Neural Nets and Genetic Algorithms

Exploring the Relationship between Neural Network Topology and Optimal Training Set by Means of Genetic Algorithms


Abstract

In a previous paper I presented the results of optimizing Neural Network (NN) topology for the task of Natural Language Processing (NLP). In that research, all NNs were trained with a fixed 20 percent of the total language. In this paper I present results of optimizing a set of configuration values that have been shown to affect NN performance. For example, Elman has reported improved performance when NNs were trained with simple sentences first and complex sentences later. On the other hand, Lawrence, Giles, and Fong have reported better results when the training data was presented as a single set. Lawrence, Giles, and Fong have also studied the effect of different learning algorithms on natural language tasks. Because of the ability of GAs to search a problem space for minima without using knowledge about the problem itself, they are well suited for problems that may contain more than one possible solution. Finding different minima becomes important for real-life applications, since variables such as the number of hidden nodes, number of hidden layers, number of connections, and size of the training set can all affect training and response time for NNs.
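To illustrate the kind of search the abstract describes, below is a minimal GA sketch over the four configuration variables it names (hidden layers, hidden nodes, connections, training-set size). The gene ranges, truncation selection, and toy fitness function are assumptions made here for illustration; the paper's actual encoding and fitness measure (NN performance on the NLP task) are not reproduced.

```python
import random

# Hypothetical search ranges for the configuration values named in the
# abstract; the paper's actual ranges are not given here.
GENE_RANGES = {
    "hidden_layers":   (1, 3),
    "hidden_nodes":    (2, 64),
    "connections_pct": (10, 100),  # percentage of possible connections kept
    "train_set_pct":   (5, 50),    # percentage of the language used for training
}

def random_individual():
    """Sample one configuration uniformly from the ranges above."""
    return {k: random.randint(lo, hi) for k, (lo, hi) in GENE_RANGES.items()}

def fitness(ind):
    """Stand-in for training an NN with this configuration and scoring it.
    In the paper this would be the network's NLP performance; here a toy
    cost penalty keeps the sketch runnable end to end."""
    cost = ind["hidden_layers"] * ind["hidden_nodes"] + ind["train_set_pct"]
    return 1.0 / (1.0 + cost)

def crossover(a, b):
    """Uniform crossover: each gene is taken from either parent."""
    return {k: random.choice((a[k], b[k])) for k in GENE_RANGES}

def mutate(ind, rate=0.1):
    """Resample each gene within its range with probability `rate`."""
    for k, (lo, hi) in GENE_RANGES.items():
        if random.random() < rate:
            ind[k] = random.randint(lo, hi)
    return ind

def evolve(pop_size=20, generations=30):
    """Evolve a population of configurations and return the fittest one."""
    pop = [random_individual() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]  # truncation selection
        children = [mutate(crossover(random.choice(parents),
                                     random.choice(parents)))
                    for _ in range(pop_size - len(parents))]
        pop = parents + children
    return max(pop, key=fitness)

if __name__ == "__main__":
    print(evolve())
```

Because the GA only needs a fitness value per candidate, the same loop can explore several distinct minima across runs, which is the property the abstract emphasizes for real-life deployments with differing training- and response-time constraints.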
