Virtual friend: tracking and generating natural interactive behaviours in real video
Conference contribution, posted on 08.12.2009 by Yue Zheng, Yulia Hicks, Dave Marshall, Juan C. Mostaza, Jonathon Chambers
The aim of our research is to create a “virtual friend”, i.e., a virtual character capable of responding, in a realistic and sensible manner, to the actions of a real person observed in video. In this paper, we present a novel approach for generating a variety of complex behavioural responses for a fully articulated “virtual friend” in three-dimensional (3D) space. Our approach is model-based. First, we train a collection of dual Hidden Markov Models (HMMs) on 3D motion capture (MoCap) data representing a number of interactions between two people. Second, we track the 3D articulated motion of a single person in ordinary 2D video. Finally, using the dual HMMs, we generate a moving “virtual friend” reacting to the motion of the tracked person and place it in the original video footage. We describe our approach in depth and present experimental results showing that the generated behaviours are very close to those of real people.
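The abstract's pipeline (decode the observed person's motion with an HMM, then emit a coupled response for the virtual character) can be illustrated with a deliberately simplified sketch. This is not the authors' implementation: all parameters below are toy values, and the "dual" coupling is reduced to a single hypothetical response pose stored per hidden state, whereas the paper learns full paired-interaction models from MoCap data.

```python
import numpy as np

# Toy stand-in for a dual HMM: one HMM models the observed person's
# motion features, and each hidden state carries a mean response pose
# for the virtual friend. All numbers are illustrative, not learned.
rng = np.random.default_rng(0)

n_states, feat_dim, pose_dim = 3, 2, 4
start = np.full(n_states, 1.0 / n_states)          # uniform initial distribution
trans = np.array([[0.8, 0.1, 0.1],                 # sticky transition matrix
                  [0.1, 0.8, 0.1],
                  [0.1, 0.1, 0.8]])
means = np.array([[0.0, 0.0], [2.0, 2.0], [4.0, 4.0]])  # emission means
response_pose = rng.normal(size=(n_states, pose_dim))   # friend pose per state

def log_gauss(x, mu, var=0.5):
    # Isotropic Gaussian log-density (up to an additive constant).
    return -np.sum((x - mu) ** 2, axis=-1) / (2 * var)

def viterbi(obs):
    # Standard Viterbi decoding of the most likely hidden state path.
    T = len(obs)
    logp = np.log(start) + log_gauss(obs[0], means)
    back = np.zeros((T, n_states), dtype=int)
    for t in range(1, T):
        scores = logp[:, None] + np.log(trans)      # scores[i, j]: i -> j
        back[t] = np.argmax(scores, axis=0)
        logp = scores[back[t], np.arange(n_states)] + log_gauss(obs[t], means)
    path = [int(np.argmax(logp))]
    for t in range(T - 1, 0, -1):                   # backtrace
        path.append(int(back[t, path[-1]]))
    return path[::-1]

# Toy tracked motion features of the real person (near the state means).
obs = np.array([[0.1, -0.1], [0.2, 0.1], [2.1, 1.9], [2.0, 2.2], [3.9, 4.1]])
states = viterbi(obs)
friend_motion = response_pose[states]   # one response pose per video frame
print(states)                # -> [0, 0, 1, 1, 2]
print(friend_motion.shape)   # -> (5, 4)
```

A real system would replace the toy parameters with models trained on paired MoCap sequences and would generate smooth articulated motion rather than a per-frame pose lookup.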
- Mechanical, Electrical and Manufacturing Engineering