IEEE Transactions on Knowledge and Data Engineering

Network Representation Lightening From Hashing to Quantization


Abstract

Information network embedding is an important way to enable efficient graph analytics. However, it still faces computational challenges in problems such as link prediction and node recommendation, particularly with the increasing scale of networks. Both hashing and quantization are promising approaches for accelerating these problems by orders of magnitude. In our preliminary work, we proposed to learn binary codes for information networks, but graph analytics may then suffer from large accuracy degradation. To reduce information loss while achieving memory and search efficiency, we further propose to learn quantized codes for information networks. In particular, each node is represented by composing multiple latent vectors, each of which is optimally selected from a distinct set. Since (generalized) matrix factorization unifies several well-known embedding methods with high-order proximity preserved, we propose a Network Representation Lightening framework based on Matrix Factorization (NRL-MF) to learn binary and quantized codes. We also propose an alternating optimization algorithm for efficient parameter learning, even in the generalized matrix factorization case. We finally evaluate NRL-MF on four real-world information network datasets with respect to the tasks of node classification and node recommendation. The results show that NRL-MF significantly outperforms competing baselines in both tasks, and that quantized representations indeed incur much smaller information loss than binarized codes.
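To make the "composing multiple latent vectors, each selected from a distinct set" idea concrete, below is a minimal Python sketch of a product-quantization-style node representation, assuming additive composition: a node is stored as a few small integer codes, and its embedding is reconstructed by summing one latent vector picked from each codebook. This is not the authors' implementation; all names (codebooks, compose_embedding, quantize) are illustrative, and NRL-MF learns codes jointly with matrix factorization via alternating optimization rather than quantizing a pretrained embedding greedily as done here.

```python
import numpy as np

rng = np.random.default_rng(0)

num_codebooks = 4      # number of distinct latent-vector sets per node (assumption)
codebook_size = 256    # vectors per codebook; each selected index fits in one byte
dim = 128              # dimensionality of the composed node embedding

# One set of candidate latent vectors per codebook.
codebooks = rng.normal(size=(num_codebooks, codebook_size, dim))

def compose_embedding(codes: np.ndarray) -> np.ndarray:
    """Reconstruct a node embedding by summing the selected vector from each codebook."""
    return sum(codebooks[m, codes[m]] for m in range(num_codebooks))

def quantize(embedding: np.ndarray) -> np.ndarray:
    """Greedy residual quantization: per codebook, pick the vector nearest to what remains."""
    residual = embedding.copy()
    codes = np.empty(num_codebooks, dtype=np.int64)
    for m in range(num_codebooks):
        dists = np.linalg.norm(codebooks[m] - residual, axis=1)
        codes[m] = int(np.argmin(dists))
        residual = residual - codebooks[m, codes[m]]
    return codes

# A continuous node embedding (e.g., from matrix factorization) is then stored as
# num_codebooks small integers instead of dim floats.
node_embedding = rng.normal(size=dim)
codes = quantize(node_embedding)
approx = compose_embedding(codes)
print(codes, np.linalg.norm(node_embedding - approx))
```

In this sketch each node costs only four one-byte indices plus the shared codebooks, which illustrates why quantized codes retain more information than a single binary code of comparable size while still allowing fast, memory-efficient search.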
