File(s) under permanent embargo

Reason: This item is currently closed access.

Efficient edge, motion and depth-range adaptive processing for enhancement of multi-view depth map sequences

conference contribution
posted on 11.10.2016, 14:57 authored by Erhan EkmekciogluErhan Ekmekcioglu, Vladan Velisavljevic, Stewart T. Worrall
We present a novel and efficient multi-view depth map enhancement method proposed as a post-processing of initially estimated depth maps. The proposed method is based on edge, motion and scene depth-range adaptive median filtering and allows for an improved quality of virtual view synthesis. To enforce the spatial, temporal and inter-view coherence in the multi-view depth maps, the median filtering is applied to 4-dimensional windows that consist of the spatially neighboring depth map values taken at different viewpoints and time instants. A fast iterative block segmentation approach is adopted to adaptively shrink these windows in the presence of edges and motion for preservation of sharpness and realistic rendering and for improvement of the compression efficiency. We show that our enhancement method leads to a reduction of the coding bit-rate required for representation of the depth maps and also leads to a gain in the quality of synthesized views at arbitrary virtual viewpoints.
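The core idea in the abstract — median filtering over a 4-D window (views × time × space) that shrinks near edges and motion — can be illustrated with a minimal sketch. This is not the paper's implementation: the function name, the fixed one-step view/time extent, and the use of local depth range as a stand-in for the paper's iterative block segmentation are all assumptions made for illustration.

```python
import numpy as np

def enhance_depth_maps(depth, spatial_radius=2, edge_threshold=10.0):
    """Illustrative 4-D median filtering of multi-view depth map sequences.

    depth: array of shape (views, frames, height, width).
    For each pixel, the median is taken over a 4-D window spanning the
    neighbouring views, frames and spatial positions. As a crude proxy for
    the paper's edge/motion-adaptive block segmentation, the spatial part
    of the window collapses to 1x1 wherever the local depth range exceeds
    edge_threshold, so depth discontinuities stay sharp.
    """
    V, T, H, W = depth.shape
    out = np.empty_like(depth)
    r = spatial_radius
    for v in range(V):
        for t in range(T):
            for y in range(H):
                for x in range(W):
                    y0, y1 = max(0, y - r), min(H, y + r + 1)
                    x0, x1 = max(0, x - r), min(W, x + r + 1)
                    patch = depth[v, t, y0:y1, x0:x1]
                    if patch.max() - patch.min() > edge_threshold:
                        # likely edge or motion: shrink the spatial window
                        y0, y1 = y, y + 1
                        x0, x1 = x, x + 1
                    # extend the window across neighbouring views and frames
                    v0, v1 = max(0, v - 1), min(V, v + 2)
                    t0, t1 = max(0, t - 1), min(T, t + 2)
                    out[v, t, y, x] = np.median(depth[v0:v1, t0:t1, y0:y1, x0:x1])
    return out
```

Smoothing across views and time in one window is what enforces the inter-view and temporal coherence the abstract refers to; the adaptive shrinking is what preserves sharp depth edges for realistic rendering.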


This work was developed in part within VISNET II, a European Network of Excellence funded under the European Commission IST FP6 programme.



  • Loughborough University London

Published in

2009 16th IEEE International Conference on Image Processing (ICIP)


pp. 3537-3540


EKMEKCIOGLU, E., VELISAVLJEVIC, V. and WORRALL, S., 2009. Efficient edge, motion and depth-range adaptive processing for enhancement of multi-view depth map sequences. IN: Proceedings of 2009 16th IEEE International Conference on Image Processing (ICIP 2009), Cairo, Egypt, 7-12 November 2009, pp.3537-3540.




VoR (Version of Record)

Publisher statement

This work is made available according to the conditions of the Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International (CC BY-NC-ND 4.0) licence. Full details of this licence are available at: https://creativecommons.org/licenses/by-nc-nd/4.0/

Publication date

2009


