Multimodal database of emotional speech, video and gestures
conference contribution
posted on 10.06.2019, 09:40 by Tomasz Sapinski, Dorota Kaminska, Adam Pelikant, Cagri Ozcinar, Egils Avots, Gholamreza Anbarjafari

People express emotions through different modalities. Integrating verbal and non-verbal communication channels creates a system in which the message is easier to understand. Expanding the focus to several forms of expression can facilitate research on emotion recognition as well as human-machine interaction. In this article, the authors present a Polish emotional database composed of three modalities: facial expressions, body movement and gestures, and speech. The corpus contains recordings registered in studio conditions, acted out by 16 professional actors (8 male and 8 female). The data is labeled with the six basic emotion categories according to Ekman. To check the quality of the performances, all recordings were evaluated by experts and volunteers. The database is available to the academic community and may be useful in studies on audio-visual emotion recognition.
Funding
This work is supported by the Estonian Research Council Grant (PUT638), the Scientific and Technological Research Council of Turkey (TÜBİTAK) (Project 1001 - 116E097), an Estonian-Polish Joint Research Project, and the Estonian Centre of Excellence in IT (EXCITE), funded by the European Regional Development Fund.
History
School
- Loughborough University London