The conventional ‘keyboard and workstation’ approach allows complex medical image presentation and manipulation during mammographic interpretation. Nevertheless, providing rich interaction and feedback in real time for navigational training or computer-assisted detection of disease remains a challenge. Using computer vision and state-of-the-art augmented reality (AR) techniques, this study proposes an ‘AR mammographic workstation’ approach that can support workstation-independent rich interaction and real-time feedback. The study explores the feasibility of facilitating various mammographic training scenarios via this flexible AR approach, as well as its limitations.
Published in: Communications in Computer and Information Science
Pages: 377–385
Citation: TANG, Q., CHEN, Y. and GALE, A.G., 2017. Rich interaction and feedback supported mammographic training: A trial of an augmented reality approach. IN: Valdes Hernandez, M. and Gonzalez-Castro, V. (eds). Medical Image Understanding and Analysis, 21st Annual Conference, MIUA 2017, Edinburgh, UK, 11-13 July 2017, pp. 377-385.
Version: AM (Accepted Manuscript)
Publisher statement: This work is made available according to the conditions of the Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International (CC BY-NC-ND 4.0) licence. Full details of this licence are available at: https://creativecommons.org/licenses/by-nc-nd/4.0/
Notes: This is a pre-copyedited version of a contribution published in Valdes Hernandez, M. and Gonzalez-Castro, V. (eds). Medical Image Understanding and Analysis, 21st Annual Conference, MIUA 2017, published by Springer. The definitive authenticated version is available online via https://doi.org/10.1007/978-3-319-60964-5_33
Book series: Communications in Computer and Information Science; 723