Source: US Government Technical Reports

Recursively Generated Networks and Dynamical Learning

Abstract

Much of the research has been based on the premise that the mathematical methods and notation associated with constrained optimization should be used to specify a neural net, which can then be compiled to diverse implementations. But where does such a compiler come from, and what are the details of this mathematical notation? Substantial progress has been made on these research questions: (1) They have developed mathematical methods that can transform one algebraic neural-net description into another, more implementable one. These developments were attained through serious work in the applied mathematics of neural nets; they can form the basis of a neural compiler because they address most of the major neural-net compilation and implementation issues, but they do not yet suffice. (2) They have been accumulating these results in a neural simulator, which can be expanded into a semi-automatic compiler: a neural-net design and implementation environment based on mathematical methods. (3) They have developed a mathematical notation (not yet a formal language) for describing complex problem domains in terms of constrained optimization problems, which can then be solved by neural nets.
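The pipeline the abstract describes, posing a problem as constrained optimization and then realizing the solver as neural-net dynamics, can be illustrated with a minimal penalty-method sketch. The toy problem and all names here are illustrative assumptions, not taken from the report:

```python
import numpy as np

# Hedged sketch: "compile" a constrained optimization problem into
# continuous dynamics by folding the constraint into an energy function.
# Toy problem (my own, for illustration): minimize ||x||^2 subject to
# sum(x) = 1. Penalty-method energy:
#   E(x) = ||x||^2 + c * (sum(x) - 1)^2
# Gradient descent on E plays the role of the network's dynamics; its
# fixed point approximates the constrained minimum x_i = 1/n.

def energy_grad(x, c=50.0):
    # dE/dx_i = 2*x_i + 2*c*(sum(x) - 1), same for every unit
    return 2.0 * x + 2.0 * c * (x.sum() - 1.0)

def run_dynamics(n=4, steps=2000, lr=0.001, c=50.0):
    x = np.zeros(n)          # initial network state
    for _ in range(steps):
        x -= lr * energy_grad(x, c)  # discretized gradient flow
    return x

x = run_dynamics()
```

With a large penalty weight `c` the fixed point `x_i = c / (1 + c*n)` approaches the exact constrained minimum `1/n`; a real compiler in the spirit of the abstract would derive such dynamics symbolically from the algebraic problem description rather than by hand.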
