
Minsky, Chomsky and Deep Nets

Abstract

When Minsky and Chomsky were at Harvard in the 1950s, they started out their careers questioning a number of machine learning methods that have since regained popularity. Minsky's Perceptrons was a reaction to neural nets and Chomsky's Syntactic Structures was a reaction to ngram language models. Many of their objections are being ignored and forgotten (perhaps for good reasons, and perhaps not). While their arguments may sound negative, I believe there is a more constructive way to think about their efforts; they were both attempting to organize computational tasks into larger frameworks such as what is now known as the Chomsky Hierarchy and algorithmic complexity. Section 5 will propose an organizing framework for deep nets. Deep nets are probably not the solution to all the world's problems. They don't do the impossible (solve the halting problem), and they probably aren't great at many tasks such as sorting large vectors and multiplying large matrices. In practice, deep nets have produced extremely exciting results in vision and speech, though other tasks may be more challenging for deep nets.
