Loughborough University

Establishing the performance of low-cost Lytro cameras for 3D coordinate geometry measurements

journal contribution
posted on 2019-04-05, 11:34 authored by Shreedhar Rangappa, Ranveer S. Matharu, Jon Petzing, Peter Kinnell
Lytro cameras can capture 3D information in a single exposure without the need for structured illumination, allowing greyscale depth maps of the captured scene to be created using the Lytro desktop software. These consumer-grade light-field cameras provide a cost-effective method of measuring the depth of multiple objects, which is suitable for many applications. However, the greyscale depth maps generated by Lytro cameras are expressed on a relative depth scale and are therefore not suitable for engineering applications where absolute depth is essential. In this research, camera control variables, environmental sensitivity, depth distortion characteristics, and the effective working range of first- and second-generation Lytro cameras were evaluated. In addition, a depth measurement technique that delivers 3D output depth maps expressed in SI units (metres) is discussed in detail, demonstrating the suitability of consumer-grade Lytro cameras for metrological applications without significant modification.
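
As background for what converting a relative-scale depth map into metric units can look like, the sketch below fits a simple polynomial calibration against reference targets at known stand-off distances and applies it to a greyscale depth map. This is only an illustrative assumption using NumPy; it is not the depth measurement technique described in the paper, and the function names, calibration values, and polynomial degree are hypothetical.

    # Illustrative sketch only (not the authors' method): map relative grey
    # levels from a Lytro depth map to approximate metric depth via a
    # calibration curve fitted against targets at known distances.
    import numpy as np

    def fit_depth_calibration(grey_values, known_depths_m, degree=2):
        """Fit a polynomial mapping from relative grey level to depth in metres.

        grey_values    -- depth-map grey levels sampled at reference targets
        known_depths_m -- independently measured depths of those targets (metres)
        """
        return np.polynomial.Polynomial.fit(grey_values, known_depths_m, degree)

    def apply_calibration(depth_map_grey, calibration):
        """Convert a full relative depth map to metric depth (metres)."""
        return calibration(depth_map_grey.astype(float))

    if __name__ == "__main__":
        # Hypothetical calibration data: grey levels observed for flat targets
        # placed at known distances from the camera.
        grey_samples = np.array([40.0, 95.0, 150.0, 210.0])
        depths_m = np.array([0.20, 0.35, 0.50, 0.65])

        cal = fit_depth_calibration(grey_samples, depths_m)

        # Example relative depth map (e.g. as exported from the Lytro desktop software).
        relative_map = np.random.randint(40, 210, size=(4, 4))
        metric_map = apply_calibration(relative_map, cal)
        print(metric_map)

In practice, the mapping from grey level to depth would need to be characterised per camera and per capture configuration; the paper evaluates exactly these dependencies (control variables, environment, distortion, working range) for first- and second-generation Lytro cameras.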

Funding

This research work was supported under Grant Reference Number EP/IO33467/1 and by Rolls-Royce plc.

History

School

  • Mechanical, Electrical and Manufacturing Engineering

Published in

Machine Vision and Applications

Volume

30

Issue

4

Pages

615–627

Citation

RANGAPPA, S. ... et al., 2019. Establishing the performance of low-cost Lytro cameras for 3D coordinate geometry measurements. Machine Vision and Applications, 30(4), pp. 615–627.

Publisher

Springer © The Authors

Version

  • VoR (Version of Record)

Publisher statement

This work is made available according to the conditions of the Creative Commons Attribution 4.0 International (CC BY 4.0) licence. Full details of this licence are available at: http://creativecommons.org/licenses/by/4.0/

Acceptance date

2019-02-10

Publication date

2019-03-04

Notes

This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.

ISSN

0932-8092

eISSN

1432-1769

Language

  • en