posted on 2009-12-08, 10:10, authored by Yue Zheng, Yulia Hicks, Dave Marshall, Juan C. Mostaza, Jonathon Chambers
The aim of our research is to create a "virtual friend", i.e. a virtual character capable of responding in a realistic and sensible manner to the actions of a real person observed in video. In this paper, we present a novel approach for generating a variety of complex behavioural responses for a fully articulated "virtual friend" in three-dimensional (3D) space. Our approach is model-based. First, we train a collection of dual Hidden Markov Models (HMMs) on 3D motion capture (MoCap) data representing a number of interactions between two people. Secondly, we track the 3D articulated motion of a single person in ordinary 2D video. Finally, using the dual HMMs, we generate a moving "virtual friend" reacting to the motion of the tracked person and place it in the original video footage. In this paper, we describe our approach in depth and present experimental results, which show that the produced behaviours are very close to those of real people.
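The core idea of the dual-HMM step can be sketched in code: the tracked person's poses are decoded into hidden behaviour states, a coupling distribution maps each of those states to the most likely state of the interaction partner, and the partner's emission model then produces the responding motion. The sketch below is a toy illustration under assumed structure, not the paper's implementation: the matrices, the hard nearest-mean state assignment (a crude stand-in for Viterbi decoding), and the names `coupling` and `react` are all illustrative inventions.

```python
import numpy as np

# Toy sketch of a dual-HMM response generator. All model parameters
# here are invented placeholders, not values from the paper.
rng = np.random.default_rng(0)

n_states = 3   # hidden behaviour states shared by the interacting pair
pose_dim = 4   # toy stand-in for a 3D joint-angle vector

# Emission means for the observed person's HMM (one mean pose per state).
A_means = rng.normal(size=(n_states, pose_dim))

# Coupling distribution: P(partner's state | observed person's state).
coupling = np.array([[0.90, 0.05, 0.05],
                     [0.05, 0.90, 0.05],
                     [0.05, 0.05, 0.90]])

# Emission means for the "virtual friend" HMM.
B_means = rng.normal(size=(n_states, pose_dim))

def nearest_state(pose, means):
    """Hard-assign a pose to the closest emission mean -- a crude
    per-frame stand-in for proper Viterbi decoding."""
    return int(np.argmin(np.linalg.norm(means - pose, axis=1)))

def react(observed_poses):
    """Infer the observed person's state for each frame, pick the most
    likely partner state via the coupling, and emit its mean pose."""
    response = []
    for pose in observed_poses:
        s_a = nearest_state(pose, A_means)
        s_b = int(np.argmax(coupling[s_a]))
        response.append(B_means[s_b])
    return np.array(response)

# Toy tracked sequence: 5 noisy frames near state 1's mean pose.
observed = A_means[1] + 0.01 * rng.normal(size=(5, pose_dim))
friend_motion = react(observed)
print(friend_motion.shape)  # (5, 4)
```

In the actual system the generated state sequence would be decoded against learned transition dynamics and smoothed into continuous 3D joint trajectories before compositing the character into the original footage.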
History
School
Mechanical, Electrical and Manufacturing Engineering
Citation
ZHENG, Y. ... et al., 2006. Virtual friend: tracking and generating natural interactive behaviours in real video. IN: 2006 8th International Conference on Signal Processing (ICSP 2006), Guilin, China, Vol.2., 16-20 November.