Simulating Synesthesia In Spatially-Based Real-Time Audio-Visual Performance

Gibson, Stephen (2013) Simulating Synesthesia In Spatially-Based Real-Time Audio-Visual Performance. Live Visuals, Leonardo Electronic Almanac, 19 (3). pp. 214-229. ISSN 1071-4391

In this paper I describe and present examples of my live audio-visual work for 3D spatial environments. These projects use motion-tracking technology to enable users to interact with sound, light and video through their body movements in 3D space. Video examples from one past project (Virtual DJ) and one current project (Virtual VJ) illustrate how flexible user interaction is enabled through a complex and precise mapping of 3D space to media control. In these projects, audience members interact with sound, light and video in real time simply by moving around the space with a tracker in hand. Changes in sound can be synchronized with changes in light and/or real-time visual effects (e.g. music volume = light brightness = video opacity). These mappings can be reassigned dynamically in real time, allowing the user to consolidate the roles of DJ, VJ and lighting designer in a single interface. This interaction model attempts to reproduce the effect of synesthesia, in which certain people experience light or color in response to music.
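The linked mapping described in the abstract (music volume = light brightness = video opacity, all driven by tracker position) can be sketched in a few lines. This is an illustrative sketch only, not the paper's implementation; all names and the choice of which axis drives which parameter are hypothetical assumptions.

```python
def linked_mapping(x, y, z):
    """Map a normalized 3D tracker position (each axis in [0, 1]) to
    synchronized media parameters, illustrating the coupling
    volume = brightness = opacity described in the abstract.
    Axis assignments are hypothetical, not taken from the paper."""
    # Clamp inputs to the unit cube in case the tracker drifts out of range.
    x, y, z = (max(0.0, min(1.0, v)) for v in (x, y, z))
    intensity = y  # assume tracker height drives one shared control value
    return {
        "volume": intensity,      # audio level
        "brightness": intensity,  # light output
        "opacity": intensity,     # video layer opacity
        "pan": x,                 # left-right position might steer stereo pan
        "effect_depth": z,        # depth might drive a real-time video effect
    }
```

Because all three synesthetic parameters derive from the same control value, a single gesture changes sound, light and video in lockstep; remapping which axis feeds `intensity` corresponds to the dynamic reassignment the abstract mentions.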

Item Type: Article
Subjects: W200 Design studies
Department: Faculties > Arts, Design and Social Sciences > Design
Depositing User: Becky Skoyles
Date Deposited: 18 Nov 2013 16:46
Last Modified: 17 Dec 2023 14:31
