Loughborough University

Cross-subject multimodal emotion recognition based on hybrid fusion

Journal contribution, posted on 2020-12-04, authored by Yucel Cimtay, Erhan Ekmekcioglu and Seyma Caglar Ozhan.
Multimodal emotion recognition has gained traction in the affective computing research community as a way to overcome the limitations of processing a single form of data and to increase recognition robustness. In this study, a novel emotion recognition system is introduced, based on multiple modalities: facial expressions, galvanic skin response (GSR) and electroencephalogram (EEG). The method follows a hybrid fusion strategy and yields a maximum one-subject-out accuracy of 81.2% and a mean accuracy of 74.2% on our bespoke multimodal emotion dataset (LUMED-2) for three emotion classes: sad, neutral and happy. Similarly, our approach yields a maximum one-subject-out accuracy of 91.5% and a mean accuracy of 53.8% on the Database for Emotion Analysis using Physiological Signals (DEAP) for varying numbers of emotion classes (four on average), including angry, disgusted, afraid, happy, neutral, sad and surprised. The presented model is particularly useful for determining the correct emotional state in the presence of naturally deceptive facial expressions. In terms of emotion recognition accuracy, this study is superior to, or on par with, the reference subject-independent multimodal emotion recognition studies in the literature.
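To make the hybrid fusion idea concrete, the sketch below combines feature-level fusion of the physiological signals (EEG and GSR concatenated into one feature vector) with decision-level fusion against a separate facial-expression classifier. This is a minimal illustration only, not the authors' implementation: the linear stand-in classifiers, the feature dimensions and the fusion weight `alpha` are hypothetical placeholders for trained models.

```python
# Minimal sketch of hybrid (feature-level + decision-level) fusion for
# 3-class emotion recognition. NOT the paper's code: the "models" below
# are random linear placeholders standing in for trained classifiers.
import numpy as np

rng = np.random.default_rng(0)
CLASSES = ["sad", "neutral", "happy"]

def softmax(z):
    # Numerically stable softmax over the last axis.
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def predict_proba(features, weights):
    # Stand-in classifier: linear projection followed by softmax.
    return softmax(features @ weights)

# Hypothetical per-trial feature vectors (dimensions are illustrative).
eeg = rng.standard_normal(128)   # EEG features
gsr = rng.standard_normal(16)    # GSR features
face = rng.standard_normal(64)   # facial-expression features

# Feature-level fusion: concatenate the physiological modalities and
# classify them jointly with one model.
phys = np.concatenate([eeg, gsr])
W_phys = rng.standard_normal((phys.size, len(CLASSES)))
p_phys = predict_proba(phys, W_phys)

# Independent classifier on the facial-expression features.
W_face = rng.standard_normal((face.size, len(CLASSES)))
p_face = predict_proba(face, W_face)

# Decision-level fusion: weighted average of the class probabilities.
# alpha is an illustrative weight, not a value taken from the paper.
alpha = 0.6
p_fused = alpha * p_phys + (1 - alpha) * p_face
print("fused prediction:", CLASSES[int(p_fused.argmax())])
```

In the paper's setting, the per-modality models would be trained networks and the fusion weighting tuned on held-out data; only the two-stage structure of the pipeline is what this sketch illustrates.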

Funding

This work was supported by an Institutional Links grant, ID 352175665, under the Newton - Katip Celebi partnership between the UK and Turkey. The grant is funded by the UK Department for Business, Energy and Industrial Strategy (BEIS) and The Scientific and Technological Research Council of Turkey (TUBITAK) and delivered by the British Council. For further information, please visit www.newtonfund.ac.uk.

History

School

  • Loughborough University London

Published in

IEEE Access

Volume

8

Pages

168865 - 168878

Publisher

Institute of Electrical and Electronics Engineers (IEEE)

Version

  • VoR (Version of Record)

Rights holder

© The Authors

Publisher statement

This is an Open Access Article. It is published by IEEE under the Creative Commons Attribution 4.0 International Licence (CC BY). Full details of this licence are available at: https://creativecommons.org/licenses/by/4.0/

Acceptance date

2020-09-07

Publication date

2020-09-14

Copyright date

2020

ISSN

2169-3536

eISSN

2169-3536

Language

  • en

Depositor

Dr Erhan Ekmekcioglu. Deposit date: 15 September 2020
