EEG-based emotion classification using deep capsule networks for subject-independent and dependent scenarios
Aadam, Aadam, Tu, Shanshan, Halim, Zahid, Waqas, Muhammad (ORCID: https://orcid.org/0000-0003-0814-7544), Alfuhaid, Hisham and Fatima, Ghulam (2026) EEG-based emotion classification using deep capsule networks for subject-independent and dependent scenarios. IEEE Transactions on Affective Computing. pp. 1-18. ISSN 1949-3045 (Online) (doi:10.1109/TAFFC.2026.3674037)

PDF (Author's Accepted Manuscript): 53299 WAQAS_EEG-Based_Emotion_Classification_Using_Deep_Capsule_Networks_(AAM)_2026.pdf - Accepted Version

Abstract

Emotion recognition from electroencephalography (EEG) signals is an important component of emotionally intelligent human–computer interaction systems. However, existing approaches often rely on handcrafted features or conventional deep learning architectures that struggle to generalize across subjects due to the high variability of EEG signals. Most prior studies focus primarily on binary emotion classification and provide limited investigation of more complex multi-class scenarios. This work presents EmoCaps, an end-to-end deep learning framework based on a capsule network with a self-attention–guided routing mechanism to learn discriminative representations directly from raw EEG signals. The proposed model is evaluated on the DEAP dataset under both subject-dependent and subject-independent settings and across binary and multi-class emotion recognition tasks involving valence, arousal, and dominance dimensions. Experimental results demonstrate that EmoCaps consistently outperforms several representative deep learning models. In subject-dependent binary classification, the proposed approach achieves accuracies of up to 97%, while in the more challenging subject-independent setting it exceeds 80% accuracy across emotional dimensions. The framework also achieves strong performance in four-class and eight-class emotion recognition tasks and provides, to the best of our knowledge, the first reported results for subject-independent multi-class emotion recognition on this dataset. Although the capsule-based architecture introduces higher computational cost, the proposed model significantly improves robustness and generalization across subjects. These results highlight the potential of EmoCaps for real-world emotion-aware applications in healthcare, adaptive learning, and affective computing systems.
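To give a flavour of the capsule-network ideas the abstract refers to, the NumPy sketch below shows the standard capsule "squash" nonlinearity and a single attention-style routing pass from lower-level capsules to class capsules. This is an illustrative approximation only, not the authors' EmoCaps implementation: the shapes, the attention score (dot product of each prediction with the mean prediction), and the names `squash` and `attention_routing` are assumptions for the example, not details taken from the paper.

```python
import numpy as np

def squash(v, axis=-1, eps=1e-8):
    # Standard capsule squash: scales each vector so its length lies in [0, 1),
    # preserving direction. Length then acts as an existence probability.
    norm2 = np.sum(v ** 2, axis=axis, keepdims=True)
    return (norm2 / (1.0 + norm2)) * v / np.sqrt(norm2 + eps)

def attention_routing(u_hat):
    # u_hat: prediction vectors from lower capsules, shape (n_in, n_out, dim).
    # Instead of Sabour-style iterative dynamic routing, use one attention
    # pass: score each prediction by its agreement (dot product) with the
    # mean prediction for that output capsule, then softmax over outputs.
    mean = u_hat.mean(axis=0, keepdims=True)                        # (1, n_out, dim)
    scores = np.sum(u_hat * mean, axis=-1)                          # (n_in, n_out)
    c = np.exp(scores) / np.exp(scores).sum(axis=1, keepdims=True)  # coupling coeffs
    s = np.sum(c[..., None] * u_hat, axis=0)                        # (n_out, dim)
    return squash(s)

# Toy example: 8 primary capsules routed to 3 emotion-class capsules.
rng = np.random.default_rng(0)
u_hat = rng.normal(size=(8, 3, 16))
v = attention_routing(u_hat)
print(v.shape)                           # (3, 16)
print(np.linalg.norm(v, axis=-1) < 1)   # squash keeps all capsule lengths below 1
```

The single-pass attention score here is a common lightweight substitute for iterative routing; the paper's self-attention–guided mechanism may differ in both scoring and normalization.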

Item Type: Article
Additional Information: This is the author’s accepted manuscript of an article accepted for publication in IEEE Transactions on Affective Computing. The final version is available via IEEE Xplore at https://doi.org/10.1109/TAFFC.2026.3674037. © 20XX IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses.
Uncontrolled Keywords: EEG-based emotion recognition, capsule networks, subject-independent classification, deep learning, affective computing, DEAP dataset
Subjects: Q Science > Q Science (General)
Q Science > QA Mathematics
Q Science > QA Mathematics > QA75 Electronic computers. Computer science
Faculty / School / Research Centre / Research Group: Faculty of Engineering & Science
Faculty of Engineering & Science > School of Computing & Mathematical Sciences (CMS)
Last Modified: 30 Apr 2026 15:44
URI: https://gala.gre.ac.uk/id/eprint/53299
