Slim embedding layers for recurrent neural language models
Abstract
Described herein are systems and methods for compressing or otherwise reducing the memory requirements for storing and computing the model parameters in recurrent neural language models. Embodiments include space compression methodologies that share structured parameters at the input embedding layer, the output embedding layer, or both layers of a recurrent neural language model to significantly reduce the number of model parameters while still compactly representing the original input and output embedding layers. Embodiments of the methodology are easy to implement and tune. Experiments on several data sets show that embodiments achieved similar perplexity and BLEU score results while using only a fraction of the parameters.
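The structured parameter sharing described in the abstract can be illustrated with a small sketch: each word's embedding is assembled by concatenating a few sub-vectors drawn from a shared pool, so only the pool and an integer mapping need to be stored instead of a full vocabulary-sized matrix. The PyTorch-style code below is a minimal illustration under these assumptions, not the patented implementation; the class name SlimEmbedding, the parameters num_parts and num_subvectors, and the random word-to-sub-vector mapping are all hypothetical choices made for the example.

```python
import torch
import torch.nn as nn


class SlimEmbedding(nn.Module):
    """Sketch of a structured, parameter-sharing embedding layer.

    Each word's embed_dim-dimensional vector is the concatenation of
    num_parts sub-vectors of size embed_dim // num_parts, looked up in a
    small shared pool. The only full-vocabulary table that remains is an
    integer mapping from word id to pool indices (here chosen at random).
    """

    def __init__(self, vocab_size, embed_dim, num_parts, num_subvectors):
        super().__init__()
        assert embed_dim % num_parts == 0
        self.sub_dim = embed_dim // num_parts
        # Shared pool of sub-vectors: the compressed, trainable parameters.
        self.pool = nn.Embedding(num_subvectors, self.sub_dim)
        # Fixed (non-trainable) mapping: word id -> num_parts pool indices.
        mapping = torch.randint(num_subvectors, (vocab_size, num_parts))
        self.register_buffer("word_to_sub", mapping)

    def forward(self, token_ids):
        idx = self.word_to_sub[token_ids]          # (..., num_parts)
        sub = self.pool(idx)                        # (..., num_parts, sub_dim)
        # Concatenate the sub-vectors into one embedding per token.
        return sub.reshape(*token_ids.shape, -1)    # (..., embed_dim)


# Toy usage: a 10,000-word vocabulary with 256-dim embeddings built from a
# pool of only 1,000 shared 64-dim sub-vectors instead of 10,000 full rows.
emb = SlimEmbedding(vocab_size=10_000, embed_dim=256,
                    num_parts=4, num_subvectors=1_000)
vectors = emb(torch.tensor([[1, 5, 42]]))
print(vectors.shape)  # torch.Size([1, 3, 256])
```

In this sketch the same idea could be applied at the output (softmax) embedding layer by reusing the shared pool to reconstruct the output projection rows, which is how the abstract's "input embedding layer, the output embedding layer, or both" sharing would reduce parameter storage on both sides of the model.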