Posted on 2019-10-09, 08:52. Authored by Haibin Cai, Yinfeng Fang, Zhaojie Ju, Cristina Costescu, Daniel David, Erik Billing, Tom Ziemke, Serge Thill, Tony Belpaeme, Bram Vanderborght, David Vernon, Kathleen Richardson, Honghai Liu
Recently reported robot-assisted therapy systems for the assessment of children with autism spectrum disorder (ASD) lack autonomous interaction abilities and require significant human resources. This paper proposes a sensing system that automatically extracts and fuses sensory features, such as body motion features, facial expressions, and gaze features, and then assesses the children's behaviours by mapping them to therapist-specified behavioural classes. Experimental results show that the developed system is capable of interpreting characteristic data of children with ASD, and thus has the potential to increase the autonomy of robots under the supervision of a therapist and to enhance the quality of the digital description of children with ASD. These research outcomes pave the way towards a feasible machine-assisted system for behaviour assessment.
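The feature-fusion-and-mapping idea described in the abstract can be illustrated with a minimal sketch. This is not the paper's actual method: the fusion scheme (early concatenation), the classifier (nearest centroid), and all feature values and class labels below are hypothetical, chosen only to make the pipeline shape concrete.

```python
# Hypothetical sketch: fuse per-modality feature vectors (body motion,
# facial expression, gaze) and map the result to a therapist-specified
# behavioural class. All names and numbers are illustrative.

def fuse(body, face, gaze):
    """Early fusion: concatenate the per-modality feature vectors."""
    return body + face + gaze

def classify(features, centroids):
    """Assign the class whose centroid is closest in squared Euclidean distance."""
    def sq_dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda label: sq_dist(features, centroids[label]))

# Illustrative centroids for two hypothetical behavioural classes.
centroids = {
    "engaged":    [0.8, 0.7, 0.9, 0.6],
    "disengaged": [0.1, 0.2, 0.1, 0.3],
}

sample = fuse([0.7, 0.8], [0.9], [0.5])   # body, face, gaze features
print(classify(sample, centroids))        # -> engaged
```

In a real system each modality would come from its own perception module (e.g. skeleton tracking, expression recognition, gaze estimation) and the mapping would be learned from annotated therapy sessions rather than hand-set centroids.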
Funding
EU Seventh Framework Programme, DREAM project, under Grant 611391