As high-density loudspeaker arrays (HDLAs) and 3D audio technologies gain prominence in concert venues, the role of spatial audio in live performance continues to evolve. This thesis investigates the intersection of live performance and spatial audio, focusing on the design and development of a novel controller for intuitive sound spatialisation. Drawing on theories of sound reproduction and spatialisation, it examines both historical practices and recent innovations.
Significant emphasis is placed on the development process, with detailed attention given to the iterative steps and decisions that shaped the controller's design and functionality. The controller integrates Ambisonics, a widely used spatial audio method, and leverages the MIDI and OSC protocols to ensure flexibility and adaptability. It combines tactile interaction with extensive configurability, enabling dynamic spatial control and encouraging creative exploration. By blending artistic and technical perspectives, this thesis contributes to tools and methods that make spatialisation in 3D audio performances more intuitive and accessible to a wider range of users.
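To illustrate the kind of control data such a controller might emit, the following minimal sketch encodes a single-float OSC message per the OSC 1.0 specification (null-terminated, 4-byte-padded address and type-tag strings, followed by a big-endian float32). The address `/source/1/azimuth` is a hypothetical example, not an address defined by the thesis project.

```python
import struct

def osc_message(address: str, value: float) -> bytes:
    """Encode an OSC message carrying one float argument.

    Layout: padded address string + padded type-tag string ",f"
    + big-endian 32-bit float (OSC 1.0 wire format).
    """
    def pad(b: bytes) -> bytes:
        # OSC strings are null-terminated, then padded to a 4-byte boundary.
        return b + b"\x00" * (4 - len(b) % 4)

    return pad(address.encode("ascii")) + pad(b",f") + struct.pack(">f", value)

# Hypothetical example: pan source 1 to 90 degrees azimuth.
msg = osc_message("/source/1/azimuth", 90.0)
```

In practice a message like this would be sent as a UDP datagram to the spatialisation engine; the encoding above is only a sketch of the wire format, assuming a single-float parameter.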
Based on the insights gained, a dynamic web application was developed using modern web technologies such as React and Next.js, inviting users to participate actively and presenting the collected data as an interactive infographic. The work concludes with detailed project documentation and an evaluation of user experiences, offering insight into the platform's effectiveness and its potential for future development.