Affective brain–computer music interfacing

Daly, I., Williams, D., Kirke, A., Weaver, J., Malik, A., Hwang, F. ORCID: https://orcid.org/0000-0002-3243-3869, Miranda, E. and Nasuto, S. J. (2016) Affective brain–computer music interfacing. Journal of Neural Engineering, 13 (4). 046022. ISSN 1741-2560
It is advisable to refer to the publisher's version if you intend to cite from this work.

DOI: 10.1088/1741-2560/13/4/046022

Abstract/Summary

Objective. We aim to develop and evaluate an affective brain–computer music interface (aBCMI) for modulating the affective states of its users. Approach. An aBCMI is constructed to detect a user's current affective state and attempt to modulate it in order to achieve specific objectives (for example, making the user calmer or happier) by playing music which is generated according to a specific affective target by an algorithmic music composition system and a case-based reasoning system. The system is trained and tested in a longitudinal study on a population of eight healthy participants, with each participant returning for multiple sessions. Main results. The final online aBCMI is able to detect its user's current affective state with classification accuracies of up to 65% (3 class, p < 0.01) and modulate its user's affective states significantly above chance level (p < 0.05). Significance. Our system represents one of the first demonstrations of an online aBCMI that is able to accurately detect and respond to its user's affective states. Possible applications include use in music therapy and entertainment.
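To make the closed-loop idea in the abstract concrete, the sketch below walks one cycle of an aBCMI-style loop: extract features from an EEG window, classify the current affective state, ask a case-based reasoner for music parameters aimed at a target state, and hand those parameters to a music generator. This is a minimal illustrative sketch only; every function name, the log band-power features, the toy linear classifier, and the case memory are assumptions for exposition and do not reproduce the system reported in the paper.

```python
# Illustrative sketch only: names, features, and the classifier are assumed,
# not the pipeline described in the paper.
import numpy as np

N_CLASSES = 3  # assumed 3-class affective state labelling

def extract_features(eeg_window: np.ndarray) -> np.ndarray:
    """Toy feature extraction: log power per channel (placeholder)."""
    power = np.mean(eeg_window ** 2, axis=1)
    return np.log(power + 1e-12)

def classify_affect(features: np.ndarray, weights: np.ndarray, bias: np.ndarray) -> int:
    """Toy linear 3-class classifier standing in for the trained affect decoder."""
    scores = weights @ features + bias
    return int(np.argmax(scores))

def choose_music_target(current_state: int, goal_state: int, case_base: dict) -> dict:
    """Case-based reasoning stand-in: look up music parameters that previously
    moved a user from current_state toward goal_state; fall back to a default."""
    return case_base.get((current_state, goal_state), {"tempo": 90, "mode": "major"})

def generate_music(target: dict) -> str:
    """Placeholder for the algorithmic composition system."""
    return f"play segment: tempo={target['tempo']}, mode={target['mode']}"

# One pass of the closed loop on simulated data.
rng = np.random.default_rng(0)
eeg_window = rng.standard_normal((32, 256))            # 32 channels x 1 s at 256 Hz (assumed)
weights, bias = rng.standard_normal((N_CLASSES, 32)), np.zeros(N_CLASSES)
case_base = {(0, 2): {"tempo": 120, "mode": "major"}}   # toy case memory

state = classify_affect(extract_features(eeg_window), weights, bias)
target = choose_music_target(state, goal_state=2, case_base=case_base)
print(generate_music(target))
```

In an online setting this cycle would repeat continuously, with the case base updated as the system learns which musical changes reliably shift a given user's affective state toward the target.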