Neural Processing Letters

ENAMeL: A Language for Binary Correlation Matrix Memories: Reducing the Memory Constraints of Matrix Memories



Abstract

Despite their relative simplicity, correlation matrix memories (CMMs) remain an active area of research, as they can be integrated into more complex architectures such as the Associative Rule Chaining Architecture (ARCA) (Austin et al., International Conference on Artificial Neural Networks, pp 49-56, 2012). In this architecture, CMMs are used to reduce the time complexity of a tree search from O(b^d) to O(d), where b is the branching factor and d is the depth of the tree. This paper introduces the Extended Neural Associative Memory Language (ENAMeL), a domain-specific language developed to ease the development of applications that use CMMs. We discuss the considerations that arose while designing the language, and the techniques used to reduce the memory requirements of CMM-based applications. Finally, we show that the memory requirements of ARCA when run through the ENAMeL interpreter compare favourably with our original results (Austin et al., International Conference on Artificial Neural Networks, pp 49-56, 2012) obtained in MATLAB.
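The abstract does not include source code, but a minimal sketch of a Willshaw-style binary CMM may help illustrate the storage and recall operations that an ENAMeL program orchestrates: training superimposes associations by OR-ing binary outer products into a single weight matrix, and recall multiplies an input vector by that matrix and thresholds the sums. The class name BinaryCMM, the vector sizes, and the example pair below are illustrative assumptions and do not reflect ENAMeL syntax or its interpreter.

```python
import numpy as np

class BinaryCMM:
    """Minimal sketch of a Willshaw-style binary correlation matrix memory."""

    def __init__(self, input_size: int, output_size: int):
        # One bit of storage per input/output pair.
        self.weights = np.zeros((input_size, output_size), dtype=bool)

    def train(self, x: np.ndarray, y: np.ndarray) -> None:
        # Superimpose the association x -> y by OR-ing in the binary
        # outer product; the matrix size never grows with the number
        # of stored associations.
        self.weights |= np.outer(x.astype(bool), y.astype(bool))

    def recall(self, x: np.ndarray, threshold: int) -> np.ndarray:
        # Sum the rows selected by the active input bits, then threshold.
        # With Willshaw thresholding the threshold is the input weight
        # (the number of set bits in x).
        sums = x.astype(int) @ self.weights.astype(int)
        return (sums >= threshold).astype(int)


# Hypothetical usage: store one association and recall it.
cmm = BinaryCMM(input_size=8, output_size=8)
x = np.array([1, 0, 1, 0, 0, 1, 0, 0])
y = np.array([0, 1, 0, 0, 1, 0, 0, 1])
cmm.train(x, y)
print(cmm.recall(x, threshold=int(x.sum())))  # recovers y
```

Because every weight is a single bit, the matrix size is fixed by the vector dimensions rather than by the number of superimposed associations, which is what makes compact, bit-packed storage of CMM weights attractive in practice.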

