Soundspotting
The idea for this project sprang from a personal interest in music and spatial sound, and from a chance discussion about it with Prof. Jayesh Pillai at the IDC, who works in Virtual and Augmented Reality experience design. At the time it was simply a question of what new advances in our ability to digitally recreate sound environments could be used for, and more specifically, whether AR for sound could hold its own value even when delinked from the visual aspects that dominate the medium today.
Passing this question through a course project on Interaction Media and Senses (which asked us to design sensory experiences that foster a sense of community among people) led to an initial direction that I then explored as ‘Soundspotting’. In the months that followed, I looked at ways this could be achieved by implementing Audio Augmented Reality. Two of my classmates, Sai Anjan and Prachi Tank, worked with me to get a fairly well-developed prototype up and running that allowed people to experience what we termed an ‘Interactive Social Soundscape’. We used an Arduino controller to wirelessly send head-orientation data to an application built with the Unity3D game engine, and implemented Resonance Audio, a spatial audio SDK, within Unity to create the soundscapes.
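To give a sense of the head-tracking side, here is a minimal sketch under stated assumptions: the write-up above doesn't name the IMU, radio module, or data format, so this assumes an MPU-6050 IMU on I2C, an HC-05 Bluetooth module on pins 10/11, and a simple CSV line per reading. It is an illustrative sketch of the idea, not the exact code from our prototype. Pitch and roll come from the accelerometer; yaw is integrated from the gyro, so it drifts and would need periodic re-centering in practice.

```cpp
// Hypothetical head-tracker sketch (MPU-6050 + HC-05 are assumptions,
// not the documented hardware). Streams yaw,pitch,roll over Bluetooth.
#include <Wire.h>
#include <SoftwareSerial.h>

const int MPU_ADDR = 0x68;          // default MPU-6050 I2C address
SoftwareSerial bt(10, 11);          // RX, TX pins wired to the HC-05

float yaw = 0.0f;                   // integrated from gyro Z; drifts over time
unsigned long lastMicros = 0;

void setup() {
  Wire.begin();
  bt.begin(9600);                   // HC-05 default baud rate
  Wire.beginTransmission(MPU_ADDR); // wake the IMU (it boots in sleep mode)
  Wire.write(0x6B);                 // PWR_MGMT_1 register
  Wire.write(0);
  Wire.endTransmission(true);
  lastMicros = micros();
}

void loop() {
  // Burst-read accel XYZ, temperature, and gyro XYZ (14 bytes from 0x3B)
  Wire.beginTransmission(MPU_ADDR);
  Wire.write(0x3B);
  Wire.endTransmission(false);
  Wire.requestFrom(MPU_ADDR, 14, true);
  int16_t ax = Wire.read() << 8 | Wire.read();
  int16_t ay = Wire.read() << 8 | Wire.read();
  int16_t az = Wire.read() << 8 | Wire.read();
  Wire.read(); Wire.read();         // skip temperature
  Wire.read(); Wire.read();         // skip gyro X
  Wire.read(); Wire.read();         // skip gyro Y
  int16_t gz = Wire.read() << 8 | Wire.read();

  // Tilt from gravity; yaw by integrating gyro Z (131 LSB/(deg/s) at +/-250 deg/s)
  float pitch = atan2(-(float)ax, sqrt((float)ay * ay + (float)az * az)) * 57.2958f;
  float roll  = atan2((float)ay, (float)az) * 57.2958f;
  unsigned long now = micros();
  yaw += (gz / 131.0f) * ((now - lastMicros) / 1e6f);
  lastMicros = now;

  // One CSV line per reading, easy to parse on the Unity side
  bt.print(yaw);   bt.print(',');
  bt.print(pitch); bt.print(',');
  bt.println(roll);
  delay(20);                        // ~50 Hz update rate
}
```

On the Unity side, a small script would parse each incoming line and apply it as the rotation of the scene's audio listener, so that Resonance Audio renders the soundscape relative to where the listener's head is pointing.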
The hardware prototype looked like this:
The visual setup within Unity (for the soundscape):
A short video describing the project is below: