Extended Senses: Embodying Technology
  • ISSN: 2397-9704
  • E-ISSN: 2397-9712

Abstract

This article considers gestural and motion-based interaction in the context of body-based audio-visual performance. More specifically, it describes general commonalities between different approaches to both large-area and small-area interaction. These are discussed in relation to my own work with motion-tracking technologies (the Gesture and Media System [GAMS]), as well as more recent experiments with the Leap Motion hand/gesture tracker. In addition, the article references other strategies for gestural interaction, such as a formal ‘method for mapping embodied gesture, acquired with electromyography and motion sensing’ (Zbyszynski et al. 2019) and a more human-centred ‘expansive and augmented performance environment that facilitates full-body musical interactions’ (Altosaar et al. 2019). The article posits that while precise gestural mapping approaches are not universally applicable, enough commonalities exist between gestural interaction strategies that some insights can be applied generally to gestural performance, regardless of the technology employed or the scale of the interaction space.

References

  1. Altosaar, R., Tindale, A. and Doyle, J. (2019), ‘Physically colliding with music: Full-body interactions with an audio-only virtual reality interface’, in Proceedings of the Thirteenth International Conference on Tangible, Embedded, and Embodied Interaction (TEI ’19), Tempe, AZ, USA, 17–20 March, New York: Association for Computing Machinery, pp. 553–57, https://doi.org/10.1145/3294109.3301256. Accessed 31 March 2021.
  2. Beverley Boy Productions (2021), ‘What is “Mickey Mousing” in film music?’, Beverley Boy Productions, https://beverlyboy.com/filmmaking/what-is-mickey-mousing-in-film-music/. Accessed 1 June 2022.
  3. Gibson, S. (2018), ‘Opto-Phono-Kinesia (OPK): Designing motion-based interaction for expert performers’, Twelfth International Conference on Tangible, Embedded and Embodied Interactions (TEI 2018), Stockholm, Sweden, March, New York: Association for Computing Machinery, pp. 487–92, https://dl.acm.org/authorize.cfm?key=N43393. Accessed 18 March 2018.
  4. Gibson, S. (2020a), ‘OPK03 endshort Northern Dance’, Vimeo, https://vimeo.com/428132620. Accessed 15 January 2020.
  5. Gibson, S. (2020b), ‘OPK03 midshort Northern Dance’, Vimeo, https://vimeo.com/428160139. Accessed 15 January 2020.
  6. Gibson, S. (2020c), ‘OPK06 Solomon Lennox Northern Dance’, Vimeo, https://vimeo.com/385116415. Accessed 15 January 2020.
  7. Gibson, S. (2020d), ‘OPK06 Northern Dance CAPTIONED’, Vimeo, https://vimeo.com/372366196. Accessed 15 January 2020.
  8. Gibson, S. (2021), ‘Being formal without being a formalist’, Leonardo, 54:6, pp. 625–30, https://doi.org/10.1162/leon_a_02056. Accessed 3 January 2022.
  9. Gibson, S. (2022), ‘Virtual AV 01 melody’, Vimeo, https://vimeo.com/723023823. Accessed 15 January 2022.
  10. Mulder, A. G. E. (1994), ‘Virtual musical instruments: Accessing the sound synthesis universe as a performer’, in Proceedings of the First Brazilian Symposium on Computer Music, Caxambu, Minas Gerais, Brazil, 2–4 August, Belo Horizonte: Universidade Federal de Minas Gerais, pp. 243–50.
  11. Vetere, F. (1997), ‘Redundancy in multimedia systems’, in S. Howard, J. Hammond and G. Lindgaard (eds), Human-Computer Interaction INTERACT ’97: IFIP – The International Federation for Information Processing, Boston, MA: Springer, pp. 648–50, https://doi.org/10.1007/978-0-387-35175-9_116. Accessed 15 June 2022.
  12. Zbyszynski, M., di Donato, B. and Tanaka, A. (2019), ‘Gesture-timbre space: Multidimensional feature mapping using machine learning & concatenative synthesis’, 14th International Symposium on Computer Music Multidisciplinary Research (CMMR), Marseille, France, 14–18 October.
  13. Gibson, Steve (2022), ‘Gestural interaction commonalities in body-based performance’, Virtual Creativity, Special Issue: ‘Extended Senses: Embodying Technology’, 12:1, pp. 75–87, https://doi.org/10.1386/vcr_00062_1.