SMG: a micro-gesture dataset towards spontaneous body gestures for emotional stress state analysis |
Author: | Chen, Haoyu¹; Shi, Henglin¹; Liu, Xin²; |
Organizations: |
¹ Center for Machine Vision and Signal Analysis (CMVS), University of Oulu, Oulu, Finland; ² Computer Vision and Pattern Recognition Laboratory, School of Engineering Science, Lappeenranta-Lahti University of Technology LUT, Lappeenranta, Finland |
Format: | article |
Version: | published version |
Access: | open |
Online Access: | PDF Full Text (PDF, 2.4 MB) |
Persistent link: | http://urn.fi/urn:nbn:fi-fe2023040334624 |
Language: | English |
Published: | Springer Nature, 2023 |
Publish Date: | 2023-04-03 |
Description: |
Abstract: We explore the use of body gestures for hidden emotional state analysis. As an important form of non-verbal communication, human body gestures can convey emotional information during social interaction. Previous work has focused mainly on facial expressions, speech, or expressive body gestures to interpret classical, overtly expressed emotions. In contrast, we focus on a specific group of body gestures, called micro-gestures (MGs), used in psychology research to interpret inner human feelings. MGs are subtle, spontaneous body movements that, together with micro-expressions, have been shown to be more reliable than ordinary facial expressions for conveying hidden emotional information. In this work, we present a comprehensive study of MGs from the computer vision perspective, including a novel spontaneous micro-gesture (SMG) dataset covering two emotional stress states and a comprehensive statistical analysis of the correlations between MGs and emotional states. We further present novel frameworks, together with various state-of-the-art methods, as benchmarks for automatic MG classification, online MG recognition, and emotional stress state recognition. The dataset and methods could inspire a new way of utilizing body gestures for human emotion understanding and bring a new direction to the emotion AI community. The source code and dataset are available at https://github.com/mikecheninoulu/SMG. |
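For readers who want to experiment with the released data, a minimal loading sketch follows. The file name, the (frames, joints, 3) array layout, and the reshaping step are assumptions made for illustration only, not the dataset's documented interface; consult the linked GitHub repository for the actual data format and official loading code.

    # Minimal sketch (illustration only, not from the paper or repository):
    # how a skeleton clip from the SMG dataset might be loaded and inspected.
    # The file name "sample_skeleton.npy" and the T x J x 3 layout are assumptions.
    import numpy as np

    clip = np.load("sample_skeleton.npy")      # hypothetical file name
    num_frames, num_joints, _ = clip.shape     # assumed layout: frames x joints x (x, y, z)
    print(f"{num_frames} frames, {num_joints} joints per frame")

    # A clip like this could then be passed to a skeleton-based classifier
    # (e.g. a graph-convolutional baseline) after reshaping to the
    # (channels, frames, joints) layout such models typically expect.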
Series: | International Journal of Computer Vision |
ISSN: | 0920-5691 |
ISSN-E: | 1573-1405 |
ISSN-L: | 0920-5691 |
Volume: | 131 |
Pages: | 1346–1366 |
DOI: | 10.1007/s11263-023-01761-6 |
OADOI: | https://oadoi.org/10.1007/s11263-023-01761-6 |
Type of Publication: | A1 Journal article – refereed |
Field of Science: | 113 Computer and information sciences |
Funding: |
This work was supported by the Academy of Finland through the Academy Professor project EmotionAI (grants 336116, 345122), the project MiGA (grant 316765), the University of Oulu & Academy of Finland Profi 7 (grant 352788), the postdoc project 6+E (grant 323287), and the ICT 2023 project (grant 328115), and by the Ministry of Education and Culture of Finland through the AI Forum project. The authors also wish to acknowledge CSC – IT Center for Science, Finland, for computational resources. |
Academy of Finland Grant Number: | 336116, 345122, 316765, 323287, 328115 |
Detailed Information: | 336116 (Academy of Finland funding decision); 345122 (Academy of Finland funding decision); 316765 (Academy of Finland funding decision); 323287 (Academy of Finland funding decision); 328115 (Academy of Finland funding decision) |
Copyright information: |
© The Author(s) 2023. This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. |