To aid an automatic taxiing system for unmanned aircraft, this paper presents a colour-based method for semantic segmentation and image classification in an aerodrome environment, with the intention of using the classification output to aid navigation and collision avoidance. Building on previous work, this machine vision system uses semantic segmentation to interpret the scene. Following an initial superpixel-based segmentation procedure, a colour-based Bayesian Network classifier is trained and used to semantically classify each segmented cluster. The HSV colourspace is adopted because it closely matches human visual perception and each of its channels shows significant differentiation between classes. Luminance is used to identify surface lines on the taxiway, and this is fused with the colour classification to improve the results. The classification performance of the proposed colour-based classifier is tested in a real aerodrome, demonstrating that it outperforms a previously developed texture-only method.
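The abstract describes classifying each segmented region from its HSV colour channels. As an illustrative sketch only (not the authors' Bayesian Network implementation), the per-pixel colour decision can be written as a naive Bayes rule over per-class channel statistics; the class names and (mean, std) values below are made-up assumptions, and the RGB-to-HSV conversion uses Python's standard-library `colorsys` module:

```python
# Illustrative sketch of colour-based classification in HSV space.
# The classes and per-channel Gaussian statistics are hypothetical,
# standing in for parameters that would be learned from training data.
import colorsys
import math

# Hypothetical per-class (mean, std) for the H, S, V channels.
CLASS_STATS = {
    "grass":   [(0.30, 0.05), (0.60, 0.10), (0.50, 0.15)],
    "taxiway": [(0.10, 0.08), (0.10, 0.05), (0.40, 0.10)],
}

def gaussian_loglik(x, mean, std):
    """Log-likelihood of x under a 1-D Gaussian N(mean, std^2)."""
    return -0.5 * ((x - mean) / std) ** 2 - math.log(std * math.sqrt(2 * math.pi))

def classify_hsv(r, g, b):
    """Convert an RGB pixel (floats in [0, 1]) to HSV and return the class
    with the highest summed log-likelihood, treating the three channels
    as conditionally independent (a naive Bayes simplification)."""
    hsv = colorsys.rgb_to_hsv(r, g, b)
    scores = {
        label: sum(gaussian_loglik(x, m, s) for x, (m, s) in zip(hsv, stats))
        for label, stats in CLASS_STATS.items()
    }
    return max(scores, key=scores.get)

print(classify_hsv(0.2, 0.6, 0.2))  # a green-ish pixel -> grass
```

In practice a decision like this would be applied to the mean colour of each superpixel rather than to individual pixels, and the paper additionally fuses a luminance-based line detection with the colour result.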
Funding
This work was supported by the U.K. Engineering and Physical Sciences Research Council (EPSRC) Autonomous and Intelligent Systems programme under grant number EP/J011525/1, with BAE Systems as the lead industrial partner.
History
School
Aeronautical, Automotive, Chemical and Materials Engineering
Department
Aeronautical and Automotive Engineering
Published in
International Conference on Unmanned Aircraft Systems (ICUAS'16)
Pages
858 - 867
Citation
COOMBES, M., EATON, W.H. and CHEN, W.-H., 2016. Colour based semantic image segmentation and classification for unmanned ground operations. International Conference on Unmanned Aircraft Systems (ICUAS 2016), Arlington, VA, USA, 7th-10th June 2016, pp. 858-867.