Affective brain–computer music interfacing

Daly, I., Williams, D., Kirke, A., Weaver, J., Malik, A., Hwang, F., Miranda, E. and Nasuto, S. J. (2016) Affective brain–computer music interfacing. Journal of Neural Engineering, 13 (4). 046022. ISSN 1741-2560




DOI: 10.1088/1741-2560/13/4/046022


Objective. We aim to develop and evaluate an affective brain–computer music interface (aBCMI) for modulating the affective states of its users. Approach. An aBCMI is constructed to detect a user's current affective state and attempt to modulate it in order to achieve specific objectives (for example, making the user calmer or happier) by playing music which is generated according to a specific affective target by an algorithmic music composition system and a case-based reasoning system. The system is trained and tested in a longitudinal study on a population of eight healthy participants, with each participant returning for multiple sessions. Main results. The final online aBCMI is able to detect its users' current affective states with classification accuracies of up to 65% (3 class, p < 0.01) and modulate its users' affective states significantly above chance level (p < 0.05). Significance. Our system represents one of the first demonstrations of an online aBCMI that is able to accurately detect and respond to a user's affective states. Possible applications include use in music therapy and entertainment.
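The closed loop the abstract describes (detect the user's affective state from EEG, then steer music generation toward an affective target) can be sketched as follows. This is a hypothetical illustration only: the class names, the trivial threshold "classifier", and the arousal-to-tempo / valence-to-mode mapping are assumptions for clarity, not the authors' implementation.

```python
# Hypothetical sketch of an aBCMI loop: detect state, generate music toward a target.
from dataclasses import dataclass

@dataclass
class AffectiveState:
    valence: float  # -1 (negative) .. +1 (positive)
    arousal: float  # -1 (calm) .. +1 (excited)

def classify_state(eeg_features):
    """Stand-in for the trained EEG classifier: a trivial threshold rule."""
    v = 1.0 if sum(eeg_features) > 0 else -1.0
    return AffectiveState(valence=v, arousal=abs(eeg_features[0]))

def music_parameters(current, target):
    """Map the affective target to generator controls.
    A common heuristic (assumed here): arousal -> tempo, valence -> mode."""
    tempo = 60 + 60 * max(0.0, min(1.0, (target.arousal + 1) / 2))
    mode = "major" if target.valence >= 0 else "minor"
    return {"tempo_bpm": round(tempo), "mode": mode}

# One iteration of the loop: detect the state, then generate toward "calm, positive".
state = classify_state([0.2, -0.1])
params = music_parameters(state, AffectiveState(valence=1.0, arousal=-0.5))
```

In the actual system this loop runs online, with the case-based reasoning component selecting how to move the user from the detected state toward the target rather than a fixed mapping.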

Item Type: Article
Divisions: Interdisciplinary Research Centres (IDRCs) > Centre for Integrative Neuroscience and Neurodynamics (CINN); Life Sciences > School of Biological Sciences > Department of Bio-Engineering
ID Code: 67219

