An interactive music playlist generator that responds to user emotion and context

Griffiths, Darryl, Cunningham, Stuart and Weinel, Jonathan (ORCID: https://orcid.org/0000-0001-5347-3897) (2016) An interactive music playlist generator that responds to user emotion and context. In: Electronic Visualisation and the Arts (EVA), London, UK, 12-14 July 2016. Electronic Workshops in Computing (eWiC), BCS, The Chartered Institute for IT, Swindon; London, pp. 275-276. ISBN 178017344X; 978-1780173443 (doi:10.14236/ewic/EVA2016.53)

PDF (Author's published manuscript): 34077_WEINEL_An_interactive_music_playlist.pdf (Published Version, 293kB). Available under a Creative Commons Attribution licence.

Abstract

This paper demonstrates the mechanisms of a music recommendation system, and its accompanying graphical user interface (GUI), that is capable of generating a playlist of songs based upon an individual’s emotion or context. This interactive music playlist generator has been designed as part of a broader system, intended for mobile devices, which aims to suggest music based upon ‘how the user is feeling’ and ‘what the user is doing’ by evaluating real-time physiological and contextual sensory data using machine learning technologies. For instance, heart rate and skin temperature, in conjunction with ambient light, temperature and Global Positioning System (GPS) data, could be used to infer, to a degree, one’s current situation and corresponding mood. At present, this interactive music playlist generator conceptually demonstrates how a playlist can be formed in accordance with such physiological and contextual parameters. In particular, the affective aspect of the interface is visually represented as a two-dimensional arousal-valence space based upon Russell’s circumplex model of affect (1980). Context refers to environmental, locomotion and activity concepts, which are visually represented in the interface as sliders. These affective and contextual components are discussed in more detail in Sections 2 and 3, respectively. Section 4 demonstrates how an affective and contextual music playlist can be formed by interacting with the GUI parameters. For a comprehensive discussion of the development of this research, refer to Griffiths et al. (2013a, 2013b, 2015). For related work in these broader research areas, see Teng et al. (2013) and Yang et al. (2008).
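To make the mapping from interface parameters to a playlist concrete, the following is a minimal, illustrative sketch in Python of how songs might be selected from a point in the arousal-valence space together with a context (activity) slider. The paper does not publish its implementation; the Song fields, the tempo_fit attribute, and the distance threshold below are assumptions made purely for illustration.

# Illustrative sketch only: not the authors' published implementation.
# Song attributes, field names, and thresholds are assumptions.
from dataclasses import dataclass
from math import hypot

@dataclass
class Song:
    title: str
    arousal: float    # -1.0 (calm) .. 1.0 (excited)
    valence: float    # -1.0 (negative) .. 1.0 (positive)
    tempo_fit: float  # hypothetical 0..1 suitability for high-activity contexts

def generate_playlist(library, arousal, valence, activity, max_distance=0.4):
    """Rank songs by closeness to the requested point in Russell's
    arousal-valence space, then filter by a context (activity) slider."""
    candidates = [
        (hypot(s.arousal - arousal, s.valence - valence), s)
        for s in library
    ]
    # Keep songs near the target affect that broadly match the activity level.
    return [
        s for d, s in sorted(candidates, key=lambda pair: pair[0])
        if d <= max_distance and abs(s.tempo_fit - activity) <= 0.5
    ]

library = [
    Song("Sunrise", arousal=0.6, valence=0.8, tempo_fit=0.7),
    Song("Nocturne", arousal=-0.7, valence=0.2, tempo_fit=0.1),
    Song("Overdrive", arousal=0.9, valence=0.4, tempo_fit=0.9),
]
# A user reporting high arousal, positive valence and vigorous activity:
print([s.title for s in generate_playlist(library, 0.7, 0.6, activity=0.8)])

The sketch ranks songs by Euclidean distance to the target point in arousal-valence space and filters by the activity slider; in the system described in the abstract, the target affect and context would instead be inferred from real-time physiological and contextual sensor data using machine learning.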

Item Type: Conference Proceedings
Title of Proceedings: Electronic Visualisation and the Arts (EVA) London, UK, 12 - 14 July 2016
Uncontrolled Keywords: music playlists, affective computing, context-aware computing
Subjects: M Music and Books on Music > M Music
Q Science > QA Mathematics > QA75 Electronic computers. Computer science
Faculty / School / Research Centre / Research Group: Faculty of Engineering & Science > School of Computing & Mathematical Sciences (CMS)
Faculty of Liberal Arts & Sciences > Sound-Image Research Group
Faculty of Engineering & Science
Last Modified: 04 Mar 2022 13:07
URI: http://gala.gre.ac.uk/id/eprint/34077
