Volume 1, Issue 1
  • ISSN: 3050-2942
  • E-ISSN: 3050-2950

Abstract

Brain–computer interfacing (BCI) offers novel methods to facilitate participation in music production, providing access for individuals who might otherwise be unable to take part (whether through lack of training or physical disability). This article describes the development of a BCI system for conscious, or ‘active’, control of parameters on an audio mixer by generation of synchronous MIDI Machine Control messages. The mapping between neurophysiological cues and audio parameters must be intuitive for a neophyte audience (i.e. one without prior training or the physical skills that professional audio engineers develop when working with tactile interfaces). The prototype pilot system, dubbed MINDMIX (a portmanteau of ‘mind’ and ‘mixer’), was evaluated by neophyte and experienced music producers on measures of utility and mapping congruency. Neophyte participants rated the system higher in utility than experienced participants, whilst both groups rated mapping congruency similarly. Assuming a degree of synonymy between utility and usefulness, and between congruency and intuitiveness, this suggests that whilst the system might be useful for a neophyte audience, experienced users are likely to prefer existing technology over the MINDMIX system. In future, specific evaluation of discrete mappings would be useful for iterative system design.
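The control pathway described above — a neurophysiological cue translated into a synchronous MIDI message addressed to a mixer — can be sketched in a few lines. This is a minimal illustration only: the abstract names MIDI Machine Control (MMC), whose standard SysEx frame is shown in `mmc_command()`, but it does not specify how continuous mixer parameters are addressed, so `fader_cc()` uses an ordinary Control Change message as an assumed stand-in; the function names, the CC 7 assignment, and the choice of alpha band power as the input feature are all illustrative, not taken from the article.

```python
# Hypothetical sketch of a MINDMIX-style mapping: a normalised EEG feature
# (e.g. alpha band power, 0.0-1.0) is scaled to a 7-bit value and wrapped
# in a MIDI message. Names and mappings here are assumptions for illustration.

MMC_STOP, MMC_PLAY = 0x01, 0x02  # standard MMC command numbers

def mmc_command(device_id: int, command: int) -> bytes:
    """Universal real-time SysEx frame carrying an MMC command:
    F0 7F <device> 06 <command> F7."""
    return bytes([0xF0, 0x7F, device_id & 0x7F, 0x06, command & 0x7F, 0xF7])

def band_power_to_midi(power: float) -> int:
    """Clamp a normalised band-power estimate to [0, 1] and scale to 0-127."""
    return round(min(max(power, 0.0), 1.0) * 127)

def fader_cc(channel: int, power: float, controller: int = 7) -> bytes:
    """3-byte Control Change message; CC 7 is the standard Channel Volume
    controller, used here as an assumed mixer-fader target."""
    return bytes([0xB0 | (channel & 0x0F), controller, band_power_to_midi(power)])

# e.g. start the transport, then drive channel 1's fader from an alpha estimate
transport = mmc_command(0x7F, MMC_PLAY)  # 0x7F addresses all devices
fader = fader_cc(0, 0.75)
```

In practice the band-power estimate would be recomputed over a sliding EEG window and a fresh message emitted each frame; the sketch shows only the mapping step.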

