Academic audience and musicologists
This research sits at the intersection of fields which seldom meet. It will invigorate historical musicology, performance studies, and especially Early Music (both performance and musicology) by opening new avenues for historical research.
It will also provide a concrete case study through which to explore theories of immersion and presence, leading to important research outputs for the fields of ludomusicology, acoustics, and immersive media.
Our key academic beneficiaries are academics working in and around
- Early Music,
- Sonic interaction design,
- and physical, virtual reality (VR), and augmented reality (AR) computing.
By documenting process and workflow, and generating new acoustic models and sound recordings, we will provide case-study material and data sets for all three groups.
In particular, the exploration of Early Music performance and its relationship to space will provide significant new insights into the understanding of performance practice and generate new research questions that will be the subject of a follow-on funding application.
Our academic partners: the University of Edinburgh and Abertay University
Our academic team will be led by Dr James Cook and Dr Kenny McAlpine (The University of Edinburgh and Abertay University).
Between them, they combine expertise in
- Early Music,
- historically-informed performance practice,
- and recording practices,

as well as experience of working with museums.
Find out how the project encourages active engagement with academic partners in the three disciplines of Early Music, sound/interaction design, and physical, virtual, and augmented reality computing.
Watch out for our participation at conferences such as the annual Medieval and Renaissance Music (Maynooth 2018) and Ludomusicology (venue TBC) conferences.
We will aim to publish a peer-reviewed article in a leading journal.
Our data sets and case study materials will also be made available.
Questions of immersion, presence, and flow have been central to discussions within videogame studies for a number of years. See, for instance,
- Laura Ermi and Frans Mäyrä, “Fundamental Components of the Gameplay Experience: Analysing Immersion,” in Changing Views: Worlds in Play. Selected Papers of the 2005 Digital Games Research Association’s Second International Conference, eds. Suzanne de Castell and Jennifer Jenson (Vancouver: DiGRA and Simon Fraser University, 2005), 15–27;
- Richard M. Ryan, C. Scott Rigby, and Andrew Przybylski, “The Motivational Pull of Video Games: A Self-Determination Theory Approach,” Motivation and Emotion 30:4 (2006): 344–360;
- Paul Skalski and Robert Whitbred, “Image versus Sound: A Comparison of Formal Feature Effects on Presence and Video Game Enjoyment,” PsychNology Journal 8:1 (2010): 67–84;
- Gordon Calleja, In-Game: From Immersion to Incorporation (Cambridge, MA: MIT Press, 2011);
- Lennart E. Nacke, Sophie Stellmach, and Craig A. Lindley, “Electroencephalographic Assessment of Player Experience: A Pilot Study in Affective Ludology,” Simulation & Gaming 42:5 (2011): 632–655.
The potential of new technologies such as virtual reality has only increased this interest.
At a time when many new games and other software are aiming to maximise immersion, detailed research into its drivers is particularly timely.
As Beth Carroll recently noted (“Raw Data: Interactive Musical Interplay and Sound Space,” paper given at the Sixth Easter Conference on Video Game Music and Sound, Bath Spa University, 2016), the emergence of virtual reality gaming has led to an increasingly ocular-centric approach to immersion and presence in game design.
Our research here will help to resituate music and sound at the centre of developers’ strategies for immersion.