H. Chen, X. Liu, X. Li, H. Shi and G. Zhao, "Analyze Spontaneous Gestures for Emotional Stress State Recognition: A Micro-gesture Dataset and Analysis with Deep Learning," 2019 14th IEEE International Conference on Automatic Face & Gesture Recognition (FG 2019), Lille, France, 2019, pp. 1-8. doi: 10.1109/FG.2019.8756513
Analyze spontaneous gestures for emotional stress state recognition: a micro-gesture dataset and analysis with deep learning
|Author:||Chen, Haoyu1; Liu, Xin1; Li, Xiaobai1;|
1Center for Machine Vision and Signal Analysis, University of Oulu, Finland
|Online Access:||PDF Full Text (PDF, 9.2 MB)|
|Persistent link:||http://urn.fi/urn:nbn:fi-fe202003248953|
|Publisher:||Institute of Electrical and Electronics Engineers|
|Publish Date:||2020-03-24|
Emotions are central to human intelligence and should play a similar role in AI. In emotion recognition, however, the cues analyzed by machines have mostly been limited to human facial expressions and speech. Body gesture, another important non-verbal communicative channel, has been shown to convey emotional information and deserves more attention. Inspired by recent research on micro-expressions, in this paper we explore a specific group of gestures that are elicited spontaneously and unconsciously by inner feelings. These gestures differ from common gestures made to facilitate communication or to express feelings deliberately, and they are usually overlooked in daily life. Such subtle body movements are known as 'micro-gestures' (MGs). Work on interpreting hidden human emotions via these gestural behaviors in unconstrained situations, however, is limited. This is because the correspondence between body movements and emotional states remains unclear, and clarifying it requires multidisciplinary efforts from researchers in computer science, psychology, and statistics. To fill the gap, we built a novel Spontaneous Micro-Gesture (SMG) dataset containing 3,692 manually labeled gesture clips. The data were collected from 40 participants through a story-telling game with two emotional state settings. In this paper, we explored the emotional gestures with a sign-based measurement. To verify the latent relationship between emotional states and MGs, we proposed a framework that encodes the objective gestures into a Bayesian network to infer the subjective emotional states. Our experimental results revealed that most participants performed micro-gestures spontaneously to relieve mental strain. We also carried out a human test on ordinary and trained people for comparison. The performance of both our framework and the human participants was evaluated on 142 test instances (71 per emotional state) in a subject-independent setting. To the authors' best knowledge, this is the first MG dataset to be presented. Results showed that the proposed MG recognition method achieved promising performance, and that MGs can serve as helpful cues for recognizing hidden emotional states.
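The abstract describes encoding observed gestures into a Bayesian network to infer a hidden emotional state. As a rough illustration of that idea only — this is not the authors' model, and the gesture names, prior, and likelihood values below are all invented — a two-state inference with conditionally independent binary gesture observations can be sketched with Bayes' rule:

```python
# Hypothetical sketch of Bayes-rule inference of an emotional state from
# micro-gesture observations. Gesture names and all probabilities are
# invented for illustration; the paper's actual Bayesian network is richer.

priors = {"stress": 0.5, "relaxed": 0.5}

# P(gesture observed | state), assumed values
likelihoods = {
    "stress":  {"touch_face": 0.6, "rub_hands": 0.5, "fold_arms": 0.4},
    "relaxed": {"touch_face": 0.2, "rub_hands": 0.1, "fold_arms": 0.3},
}

def infer_state(observed):
    """Posterior P(state | gestures), treating gestures as independent given the state."""
    unnorm = {}
    for state, prior in priors.items():
        p = prior
        for gesture, present in observed.items():
            lk = likelihoods[state][gesture]
            p *= lk if present else (1.0 - lk)
        unnorm[state] = p
    total = sum(unnorm.values())
    return {s: p / total for s, p in unnorm.items()}

post = infer_state({"touch_face": True, "rub_hands": True, "fold_arms": False})
print(post)  # posterior mass shifts toward "stress" under these assumed numbers
```

Under this naive-Bayes-style simplification, each observed gesture multiplies in its class-conditional likelihood before normalization; a full Bayesian network would instead model dependencies among gestures.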
|Pages:||1 - 8|
14th IEEE International Conference on Automatic Face and Gesture Recognition, FG 2019, 14-18 May 2019, Lille, France
|Type of Publication:||A4 Article in conference proceedings|
|Field of Science:||113 Computer and information sciences|
This work was supported by the Academy of Finland, MiGA project (Grant No. 316765), the Tekes Fidipro Program (Grant No. 1849/31/2015), and a Business Finland project (Grant No. 3116/31/2017). Haoyu Chen is supported by the China Scholarship Council. The authors also wish to acknowledge CSC - IT Center for Science, Finland, for computational resources.
|Academy of Finland Grant Number:||316765 (Academy of Finland Funding decision)|
© 2019 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works.