Multi-modal virtual scenario enhances neurofeedback learning
Cohen, Avihay, Keynan, Jackob N., Jackont, Gilan, Green, Nilli, Rashap, Iris, Shani, Ofir, Charles, Fred, Cavazza, Marc (ORCID: 0000-0001-6113-9696), Hendler, Talma and Raz, Gal (2016) Multi-modal virtual scenario enhances neurofeedback learning. Frontiers in Robotics and AI, 3:52, pp. 1-11. ISSN 2296-9144 (Online). doi: https://doi.org/10.3389/frobt.2016.00052

PDF (Publisher's PDF - Open Access)
19810 CAVAZZA_Multi-modal_Virtual_Scenario_2016.pdf - Published Version
Available under License Creative Commons Attribution.

Abstract

In the past decade, neurofeedback (NF) has become the focus of a growing body of research. With real-time functional magnetic resonance imaging (fMRI) enabling online monitoring of emotion-related areas, such as the amygdala, many have begun testing its therapeutic benefits. However, most existing NF procedures still use monotonous uni-modal interfaces, possibly limiting user engagement and weakening learning efficiency. The current study tested a novel multi-sensory NF animated scenario (AS) aimed at enhancing user experience and improving learning. We examined whether, relative to a simple uni-modal 2D interface, learning via a complex multi-modal 3D scenario would result in improved NF learning. As a neural probe, we used the recently developed fMRI-inspired EEG model of amygdala activity ("amygdala-EEG fingerprint"; amygdala-EFP), enabling low-cost and mobile limbic NF training. Amygdala-EFP was reflected in the AS by the unrest level of a hospital waiting room, in which virtual characters become impatient, approach the admission desk, and complain loudly. Successful downregulation was reflected as an easing of the room's unrest. We tested whether, relative to a standard uni-modal 2D graphic thermometer (TM) interface, this AS could facilitate more effective learning and improve the training experience. Thirty participants underwent two separate NF sessions (1 week apart), practicing downregulation of the amygdala-EFP signal. In the first session, half trained via the AS and half via the TM interface. Learning efficiency was tested by three parameters: (a) the effect size of the change in amygdala-EFP following training, (b) the sustainability of the learned downregulation in the absence of online feedback, and (c) transferability to an unfamiliar context. Comparing amygdala-EFP signal amplitude between the last and first NF trials revealed that the AS produced a higher effect size. In addition, NF via the AS showed better sustainability, as indicated by a no-feedback trial conducted in session 2, and better transferability to a new, unfamiliar interface. Lastly, participants reported that the AS was more engaging and more motivating than the TM. Together, these results demonstrate the promising potential of integrating realistic virtual environments into NF to enhance learning and improve user experience.
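The article does not reproduce its analysis code here, but the first learning-efficiency parameter — the effect size of the change in amygdala-EFP amplitude between the first and last NF trials — is a standard paired-samples comparison. The sketch below shows one common way to compute such an effect size (Cohen's d for paired samples); the function name and the synthetic per-participant amplitudes are illustrative assumptions, not the authors' pipeline or data.

```python
import numpy as np

def paired_cohens_d(first_trial: np.ndarray, last_trial: np.ndarray) -> float:
    """Paired-samples Cohen's d: mean of the per-participant differences
    divided by the standard deviation of those differences."""
    diffs = first_trial - last_trial  # positive = amplitude dropped (downregulation)
    return float(np.mean(diffs) / np.std(diffs, ddof=1))

# Synthetic per-participant signal amplitudes (arbitrary units), for illustration:
rng = np.random.default_rng(0)
first = rng.normal(loc=1.0, scale=0.3, size=15)          # first NF trial
last = first - rng.normal(loc=0.4, scale=0.2, size=15)   # lower after training
print(f"paired d = {paired_cohens_d(first, last):.2f}")
```

A larger d for the AS group than for the TM group, computed this way per condition, would correspond to the "higher effect size" the abstract reports.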

Item Type: Article
Additional Information: Copyright: © 2016 Cohen, Keynan, Jackont, Green, Rashap, Shani, Charles, Cavazza, Hendler and Raz. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) or licensor are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
Uncontrolled Keywords: EEG–fMRI integration, EEG-neurofeedback, fMRI-neurofeedback, real-time fMRI, amygdala, emotion regulation, interface, virtual reality
Subjects: Q Science > QA Mathematics > QA75 Electronic computers. Computer science
Faculty / Department / Research Group: Faculty of Liberal Arts & Sciences
Faculty of Liberal Arts & Sciences > School of Computing & Mathematical Sciences (CAM)
Last Modified: 26 Nov 2020 22:34
URI: http://gala.gre.ac.uk/id/eprint/19810
