Contextual emotional transformer-based model for comment analysis in mental health case prediction

Ibitoye, Ayodeji O.J. (ORCID: https://orcid.org/0000-0002-5631-8507), Oladosu, Oladimeji O. and Onifade, Olufade F.W. (2024) Contextual emotional transformer-based model for comment analysis in mental health case prediction. Vietnam Journal of Computer Science. pp. 1-23. ISSN 2196-8888 (Print), 2196-8896 (Online) (doi:10.1142/S2196888824500192)

PDF (Open Access Article): 49329 IBITOYE_Contextual_Emotional_Transformer-Based_Model_For_Comment_Analysis_In_Mental_Health_Case_Prediction_(OA)_2024.pdf - Published Version
Available under License Creative Commons Attribution.

Abstract

Mental health (MH) assessment and prediction have become critical areas of focus in healthcare, leveraging developments in natural language processing (NLP). Recent advances in machine learning have enabled predictive models for MH based on user-generated comments, but these models have largely overlooked the integration of emotional attention mechanisms. As a result, they often struggle with contextual nuances and emotional subtleties, leading to suboptimal predictions. The prevailing challenge lies in accurately understanding the emotional context embedded within textual comments, which is crucial for effective prediction and intervention. In this research, we introduce a novel approach employing a contextual emotional transformer-based model (CETM) for comment analysis in MH case prediction. CETM leverages state-of-the-art transformer architectures enhanced with contextual embedding layers and emotional attention mechanisms. By incorporating contextual information and emotional cues, CETM captures the underlying emotional states and MH indicators expressed in user comments. Through extensive experimentation and evaluation, both RoBERTa and bidirectional encoder representations from transformers (BERT) models exhibited higher accuracy, precision, recall, and F1 scores than their counterparts lacking emotional attention. Notably, with emotional attention employed, the RoBERTa model attained an accuracy of 94.5%, compared to BERT's 87.6%. Hence, by incorporating emotional context into the predictive model, we achieved significant improvements, offering promising avenues for more precise and personalized MH interventions.
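The abstract describes the CETM design at a high level: a pretrained transformer encoder (RoBERTa or BERT) supplies contextual embeddings, an emotional attention mechanism weights and pools those embeddings, and a classification head predicts the MH case. The sketch below shows one way such a design could be wired together in PyTorch with Hugging Face Transformers; the class names (EmotionalAttention, CETMClassifier), the attention formulation, and the label count are illustrative assumptions, not the authors' published implementation.

```python
# Minimal sketch (assumed structure, not the paper's code): a RoBERTa encoder
# providing contextual embeddings, an added "emotional attention" pooling
# layer, and a mental-health classification head.
import torch
import torch.nn as nn
from transformers import AutoModel, AutoTokenizer

class EmotionalAttention(nn.Module):
    """Learned attention that weights tokens by their inferred emotional salience."""
    def __init__(self, hidden_size):
        super().__init__()
        self.score = nn.Linear(hidden_size, 1)

    def forward(self, hidden_states, attention_mask):
        # hidden_states: (batch, seq_len, hidden); attention_mask: (batch, seq_len)
        scores = self.score(hidden_states).squeeze(-1)           # (batch, seq_len)
        scores = scores.masked_fill(attention_mask == 0, -1e9)   # ignore padding tokens
        weights = torch.softmax(scores, dim=-1).unsqueeze(-1)    # (batch, seq_len, 1)
        return (weights * hidden_states).sum(dim=1)              # emotion-weighted pooling

class CETMClassifier(nn.Module):
    """Transformer encoder + emotional attention pooling + classifier head."""
    def __init__(self, model_name="roberta-base", num_labels=2):
        super().__init__()
        self.encoder = AutoModel.from_pretrained(model_name)     # contextual embeddings
        self.attn = EmotionalAttention(self.encoder.config.hidden_size)
        self.classifier = nn.Linear(self.encoder.config.hidden_size, num_labels)

    def forward(self, input_ids, attention_mask):
        out = self.encoder(input_ids=input_ids, attention_mask=attention_mask)
        pooled = self.attn(out.last_hidden_state, attention_mask)
        return self.classifier(pooled)

# Usage example with a hypothetical comment:
tokenizer = AutoTokenizer.from_pretrained("roberta-base")
model = CETMClassifier()
batch = tokenizer(["I have been feeling hopeless lately"], return_tensors="pt", padding=True)
logits = model(batch["input_ids"], batch["attention_mask"])
```

Swapping "roberta-base" for "bert-base-uncased" would give the BERT variant compared in the abstract; the emotional attention layer is the component whose presence or absence the reported accuracy comparison isolates.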

Item Type: Article
Uncontrolled Keywords: mental health, emotional analysis, RoBERTa, BERT, natural language processing, decision support
Subjects: B Philosophy. Psychology. Religion > BF Psychology
Q Science > Q Science (General)
Q Science > QA Mathematics > QA75 Electronic computers. Computer science
Faculty / School / Research Centre / Research Group: Faculty of Engineering & Science
Faculty of Engineering & Science > School of Computing & Mathematical Sciences (CMS)
Last Modified: 09 Jan 2025 15:47
URI: http://gala.gre.ac.uk/id/eprint/49329
