Expression, Affect, Action Unit Recognition: Aff-Wild2, Multi-Task Learning and ArcFace

Kollias, Dimitrios (ORCID: https://orcid.org/0000-0002-8188-3751) and Zafeiriou, Stefanos (2019) Expression, Affect, Action Unit Recognition: Aff-Wild2, Multi-Task Learning and ArcFace. In: 30th British Machine Vision Conference 2019, BMVC 2019, September 9-12, 2019, Cardiff, UK.

PDF (Publisher's PDF - Open Access): 29427 KOLLIAS_Expression_Affect_Action_Unit_Recognition_2019.pdf - Published Version, 3MB

Abstract

Affective computing has been largely limited in terms of available data resources. The need to collect and annotate diverse in-the-wild datasets has become apparent with the rise of deep learning models as the default approach to addressing any computer vision task. Some in-the-wild databases have recently been proposed. However: i) their size is small, ii) they are not audiovisual, iii) only a small part is manually annotated, iv) they contain a small number of subjects, or v) they are not annotated for all main behavior tasks (valence-arousal estimation, action unit detection and basic expression classification). To address these limitations, we substantially extend the largest available in-the-wild database (Aff-Wild) to study continuous emotions, such as valence and arousal. Furthermore, we annotate parts of the database with basic expressions and action units. As a consequence, this allows, for the first time, the joint study of all three types of behavior states. We call this database Aff-Wild2. We conduct extensive experiments with CNN and CNN-RNN architectures that use visual and audio modalities; these networks are trained on Aff-Wild2 and their performance is then evaluated on 10 publicly available emotion databases. We show that the networks achieve state-of-the-art performance for the emotion recognition tasks. Additionally, we adapt the ArcFace loss function to the emotion recognition context and use it to train two new networks on Aff-Wild2, which we then re-train on a variety of diverse expression recognition databases. These networks are shown to improve on the existing state of the art. The database, emotion recognition models and source code are available at http://ibug.doc.ic.ac.uk/resources/aff-wild2.
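As a rough illustration of the ArcFace adaptation mentioned in the abstract, below is a minimal sketch of an additive angular margin (ArcFace-style) softmax loss applied to expression classification, written in PyTorch. The class name, hyperparameter values (scale, margin) and initialisation are illustrative assumptions, not the authors' exact implementation.

import torch
import torch.nn as nn
import torch.nn.functional as F

class ArcFaceLoss(nn.Module):
    """Additive angular margin loss (ArcFace-style), sketched for expression classes."""

    def __init__(self, embedding_dim, num_classes, scale=30.0, margin=0.5):
        super().__init__()
        # One weight vector per expression class; values here are illustrative.
        self.weight = nn.Parameter(torch.empty(num_classes, embedding_dim))
        nn.init.xavier_uniform_(self.weight)
        self.scale = scale
        self.margin = margin

    def forward(self, embeddings, labels):
        # Cosine similarity between L2-normalised embeddings and class weights.
        cosine = F.linear(F.normalize(embeddings), F.normalize(self.weight))
        theta = torch.acos(cosine.clamp(-1.0 + 1e-7, 1.0 - 1e-7))
        # Add the angular margin only to the target-class angle.
        one_hot = F.one_hot(labels, num_classes=cosine.size(1)).float()
        logits = torch.cos(theta + one_hot * self.margin)
        # Scaled logits go through a standard cross-entropy.
        return F.cross_entropy(self.scale * logits, labels)

In use, the loss would replace the plain softmax cross-entropy on top of a CNN or CNN-RNN feature extractor, with the margin encouraging tighter, better-separated per-expression clusters in the embedding space.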

Item Type: Conference or Conference Paper (Paper)
Additional Information: © 2019. The copyright of this document resides with its authors. It may be distributed unchanged freely in print or electronic forms.
Faculty / School / Research Centre / Research Group: Faculty of Engineering & Science > School of Computing & Mathematical Sciences (CMS); Faculty of Engineering & Science
Last Modified: 04 Mar 2022 13:06
URI: http://gala.gre.ac.uk/id/eprint/29427
