IEEE International Conference on Multimedia and Expo

Spatial Attention-Based Non-Reference Perceptual Quality Prediction Network for Omnidirectional Images



Abstract

Due to the strong correlation between visual attention and perceptual quality, many methods attempt to use human saliency information for image quality assessment. Although this mechanism can achieve good performance, the networks require human saliency labels, which are not easily obtainable for omnidirectional images (ODIs). To alleviate this issue, we propose a spatial attention-based perceptual quality prediction network for non-reference quality assessment on ODIs (SAP-net). Without any human saliency labels, our network can adaptively estimate the human-perceived quality of impaired ODIs in a self-attention manner, which significantly improves the prediction of quality scores. Moreover, our method greatly reduces the computational complexity of the quality assessment task on ODIs. Extensive experiments validate that our network outperforms 9 state-of-the-art methods for quality assessment on ODIs. The dataset and code are available at https://github.com/yanglixiaoshen/SAP-Net.
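To illustrate the idea of replacing explicit saliency supervision with self-attention, the sketch below shows a minimal no-reference quality predictor: spatial feature vectors are re-weighted by a self-attention map (learned without saliency labels) and pooled into a single quality score. This is a generic toy construction in numpy, not the SAP-net architecture from the paper; all function and weight names (`spatial_attention_pool`, `w_q`, `w_k`, `w_out`) are hypothetical.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def spatial_attention_pool(feat, w_q, w_k):
    """Re-weight spatial locations by self-attention, then pool.

    feat: (N, C) array of N spatial feature vectors with C channels.
    w_q, w_k: (C, d) projection matrices for queries and keys.
    Returns a (C,) pooled descriptor.
    """
    q = feat @ w_q                                   # queries, (N, d)
    k = feat @ w_k                                   # keys,    (N, d)
    attn = softmax(q @ k.T / np.sqrt(q.shape[1]))    # (N, N), rows sum to 1
    attended = attn @ feat                           # attention-weighted features
    return attended.mean(axis=0)                     # global pooling

def predict_quality(feat, w_q, w_k, w_out, b_out):
    """Map the attention-pooled descriptor to a scalar quality score."""
    pooled = spatial_attention_pool(feat, w_q, w_k)
    return float(pooled @ w_out + b_out)

# Toy usage with random features standing in for a CNN feature map.
rng = np.random.default_rng(0)
feat = rng.standard_normal((16, 8))     # 16 locations, 8 channels
w_q = rng.standard_normal((8, 4))
w_k = rng.standard_normal((8, 4))
w_out = rng.standard_normal(8)
score = predict_quality(feat, w_q, w_k, w_out, b_out=0.5)
```

In a trained network the projections would be learned end-to-end from quality labels alone, so the attention map emerges without any human saliency annotation.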


