Loughborough University

Stereoscopic image quality assessment method based on binocular combination saliency model

journal contribution
posted on 2016-03-17, 10:08 authored by Yun Liu, Jiachen Yang, Qinggang Meng, Zhihan Lu, Zhanjie Song, Zhiqun Gao
The objective quality assessment of stereoscopic images plays an important role in three-dimensional (3D) technologies. In this paper, we propose an effective method to evaluate the quality of stereoscopic images that are afflicted by symmetric distortions. The major technical contribution of this paper is that the binocular combination behaviours and human 3D visual saliency characteristics are both considered. In particular, a new 3D saliency map is developed, which not only greatly reduces the computational complexity by avoiding calculation of the depth information, but also assigns appropriate weights to the image contents. Experimental results indicate that the proposed metric not only significantly outperforms conventional 2D quality metrics, but also achieves higher performance than the existing 3D quality assessment models.
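The abstract describes two ingredients: a binocular combination of the left and right views, and a 2D saliency map used to weight image content during quality pooling. The paper's actual model is not reproduced here; the sketch below is only a generic illustration of that pipeline, using gradient-magnitude energy as a stand-in for both the binocular gain-control weights and the saliency proxy (all function names and the similarity formula are assumptions, not the authors' method):

```python
import numpy as np

def local_energy(img, eps=1e-6):
    # Gradient-magnitude map as a crude local contrast-energy proxy.
    gy, gx = np.gradient(img.astype(float))
    return np.sqrt(gx**2 + gy**2) + eps

def binocular_combine(left, right):
    # Energy-weighted binocular combination (gain-control style):
    # each view contributes in proportion to its local contrast energy.
    el, er = local_energy(left), local_energy(right)
    return (el * left + er * right) / (el + er)

def saliency_weighted_quality(ref_l, ref_r, dis_l, dis_r):
    # Fuse each stereo pair into a single "cyclopean" image, build a
    # 2D saliency proxy from the reference (no depth map needed), and
    # pool a per-pixel similarity map with saliency as the weight.
    cyc_ref = binocular_combine(ref_l, ref_r)
    cyc_dis = binocular_combine(dis_l, dis_r)
    sal = local_energy(cyc_ref)          # saliency proxy, assumed form
    c = 1e-4                             # stabilising constant
    sim = (2 * cyc_ref * cyc_dis + c) / (cyc_ref**2 + cyc_dis**2 + c)
    return float((sal * sim).sum() / sal.sum())
```

An undistorted pair scores 1.0 by construction, and scores fall as distortion grows; the saliency weighting means degradation in visually prominent regions lowers the score more than the same degradation in flat regions.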

Funding

The authors would like to thank Prof. Alan C. Bovik for providing the LIVE 3D IQA Database. This research is partially supported by the National Natural Science Foundation of China (Nos. 61471260 and 61271324), and Program for New Century Excellent Talents in University (NCET-12-0400).

History

School

  • Science

Department

  • Computer Science

Published in

Signal Processing

Volume

125

Pages

237 - 248

Citation

LIU, Y. ... et al., 2016. Stereoscopic image quality assessment method based on binocular combination saliency model. Signal Processing, 125, pp. 237-248.

Publisher

© Elsevier

Version

  • AM (Accepted Manuscript)

Publisher statement

This work is made available according to the conditions of the Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International (CC BY-NC-ND 4.0) licence. Full details of this licence are available at: https://creativecommons.org/licenses/by-nc-nd/4.0/

Publication date

2016

Notes

This paper was accepted for publication in the journal Signal Processing and the definitive published version is available at http://dx.doi.org/10.1016/j.sigpro.2016.01.019

ISSN

0165-1684

Language

  • en