Display-dependent preprocessing of depth maps based on just-noticeable depth difference modeling
journal contribution
posted on 2016-10-10, 08:59 authored by Varuna De Silva, Erhan Ekmekcioglu, W.A.C. Fernando, Stewart T. Worrall

This paper addresses the sensitivity of human vision to spatial depth variations in a 3-D video scene, seen on a stereoscopic display, based on an experimental derivation of a just noticeable depth difference (JNDD) model. The main target is to exploit the depth perception sensitivity of humans by suppressing unnecessary spatial depth details, thereby reducing the transmission overhead allocated to depth maps. Based on the derived JNDD model, depth map sequences are preprocessed to suppress the depth details that are not perceivable by viewers and to minimize the rendering artefacts that arise from optical noise, which is triggered by inaccuracies in the depth estimation process. Theoretical and experimental evidence is provided to show that the proposed depth-adaptive preprocessing filter does not alter the 3-D visual quality or the view synthesis quality for free-viewpoint video applications. Experimental results suggest that the bit rate for depth map coding can be reduced by up to 78% for depth maps captured with depth-range cameras and by up to 24% for depth maps estimated with computer vision algorithms, without affecting the 3-D visual quality or the arbitrary view synthesis quality.
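Since the full text is closed access, the exact filter design is not reproduced here; the Python sketch below only illustrates the general idea described in the abstract, under stated assumptions: a hypothetical constant JNDD threshold (the paper's model is display-dependent and experimentally derived), and a simple sigma-filter-style neighbourhood average that smooths sub-threshold depth variation while leaving perceivable depth edges intact.

```python
# Minimal sketch of JNDD-style depth map preprocessing. This is NOT the
# authors' filter: the paper is closed access, so the threshold value and
# the averaging scheme below are illustrative assumptions only.
import numpy as np

JNDD = 5  # hypothetical just-noticeable depth difference, in 8-bit depth levels


def jndd_smooth(depth: np.ndarray, radius: int = 2) -> np.ndarray:
    """Average each pixel with the neighbours whose depth differs from it by
    less than the JNDD threshold, suppressing imperceptible depth detail
    while preserving depth edges larger than the threshold."""
    h, w = depth.shape
    out = depth.astype(np.float64).copy()
    for y in range(h):
        for x in range(w):
            y0, y1 = max(0, y - radius), min(h, y + radius + 1)
            x0, x1 = max(0, x - radius), min(w, x + radius + 1)
            window = depth[y0:y1, x0:x1].astype(np.float64)
            mask = np.abs(window - depth[y, x]) < JNDD  # sub-JNDD neighbours
            out[y, x] = window[mask].mean()
    return np.rint(out).astype(depth.dtype)


# Usage example: smooth a synthetic noisy depth ramp.
rng = np.random.default_rng(0)
depth = (np.tile(np.linspace(0, 255, 64), (64, 1))
         + rng.normal(0, 2, (64, 64))).clip(0, 255).astype(np.uint8)
smoothed = jndd_smooth(depth)
```

Smoothing only sub-threshold variation is what makes the lossy step perceptually transparent: the removed detail is, by construction of the JNDD model, below what a viewer can discriminate, while the flattened regions code at a lower bit rate.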
History
School
- Loughborough University London
Published in
IEEE Journal of Selected Topics in Signal Processing
Volume
5
Issue
2
Pages
335 - 351
Citation
DE SILVA, V. ... et al., 2011. Display-dependent preprocessing of depth maps based on just-noticeable depth difference modeling. IEEE Journal of Selected Topics in Signal Processing, 5 (2), pp. 335-351.
Publisher
© IEEE
Version
- VoR (Version of Record)
Publisher statement
This work is made available according to the conditions of the Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International (CC BY-NC-ND 4.0) licence. Full details of this licence are available at: https://creativecommons.org/licenses/by-nc-nd/4.0/
Publication date
2011
Notes
Closed access.
ISSN
1932-4553
eISSN
1941-0484
Publisher version
Language
- en