
Contourlet based multi-exposure image fusion with compensation for multi-dimensional camera shake

journal contribution
posted on 28.06.2016, 14:56 by Sara Saravi, Eran Edirisinghe
Multi-exposure image fusion algorithms enhance the perceptual quality of an image captured by a sensor of limited dynamic range by fusing multiple images captured at different exposure settings. One practical problem overlooked by existing algorithms is the compensation required for image misregistration caused by multi-dimensional camera shake occurring within the time gap between capturing the multiple exposure images. In our approach, the RANdom SAmple Consensus (RANSAC) algorithm is used to identify inliers among the key-points detected by the Scale Invariant Feature Transform (SIFT), after which the Coherent Point Drift (CPD) algorithm registers the images based on the selected set of key-points. We provide experimental results on a set of images with multi-dimensional (translational and rotational) camera shake to demonstrate the proposed algorithm's capability to register and fuse multiple exposure images taken in the presence of camera shake, producing subjectively enhanced output images.
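The RANSAC inlier-selection stage described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes putative key-point correspondences are already available (standing in for real SIFT matches), and it estimates only a 2D translation rather than the full CPD registration, so the hypothetical function `ransac_translation` and all its parameters are illustrative choices.

```python
import numpy as np

def ransac_translation(src, dst, n_iters=200, thresh=2.0, seed=0):
    """Select inlier matches and estimate a 2D translation with RANSAC.

    src, dst: (N, 2) arrays of putative key-point matches (a hypothetical
    stand-in for SIFT matches). Returns (translation, boolean inlier mask).
    """
    rng = np.random.default_rng(seed)
    best_inliers = np.zeros(len(src), dtype=bool)
    for _ in range(n_iters):
        i = rng.integers(len(src))           # minimal sample: one match
        t = dst[i] - src[i]                  # candidate translation
        resid = np.linalg.norm(dst - (src + t), axis=1)
        inliers = resid < thresh             # consensus set for this model
        if inliers.sum() > best_inliers.sum():
            best_inliers = inliers
    # refit the translation on the largest consensus set
    t = (dst[best_inliers] - src[best_inliers]).mean(axis=0)
    return t, best_inliers

# synthetic matches: 40 correct correspondences shifted by (5, -3),
# plus 10 gross outliers simulating mismatched key-points
rng = np.random.default_rng(1)
src = rng.uniform(0, 100, (50, 2))
dst = src + np.array([5.0, -3.0])
dst[40:] += rng.uniform(30.0, 60.0, (10, 2))
t, mask = ransac_translation(src, dst)       # t ≈ (5, -3); mask flags the 40 inliers
```

In the paper's pipeline the surviving inliers would then be passed to CPD for non-minimal registration; here the refit step plays that role in miniature.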

History

School

  • Mechanical, Electrical and Manufacturing Engineering

Published in

VISAPP 2012 - Proceedings of the International Conference on Computer Vision Theory and Applications

Volume

1

Pages

182 - 185

Citation

SARAVI, S. and EDIRISINGHE, E.A., 2012. Contourlet based multi-exposure image fusion with compensation for multi-dimensional camera shake. IN: Csurka, G. and Braz, J. (EDS.) Proceedings of the International Conference on Computer Vision Theory and Applications (VISAPP 2012), 1, pp. 182-185.

Publisher

© SCITEPRESS

Version

AM (Accepted Manuscript)

Publisher statement

This work is made available according to the conditions of the Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International (CC BY-NC-ND 4.0) licence. Full details of this licence are available at: https://creativecommons.org/licenses/by-nc-nd/4.0/

Publication date

2012

Notes

This is a conference paper.

ISBN

9789898565037

Language

en