Augmented Body
Role //
Researcher, Prototyper
Duration //
September 2023
Tools //
Motive, Unity (C#), Arduino
This is a space that challenges the current paradigms we use to locate objects. By augmenting sound through its volume and duration, I explore how auditory and visual feedback shape our perception and understanding of space.
Problem Space
When attempting to locate an object, many applications rely on sound to aid the process: users hold the form of the object in their mind and are guided by sounds as they navigate the space. In this project, by augmenting sound through volume and duration, I explore how auditory and visual feedback shape our perception and understanding of space.
Research
Current systems for sound and vision
To begin, I created a journey map of how current applications use sound and vision to help users locate objects, to better understand how the senses come into play. Some important points were that:
1. The user starts the journey with a clear image of the object they are looking for in their head.
2. The user relies on the direction and volume of the sound to locate the object.
3. The user knows they have found the object once they can both see it and hear the sound emanating from it.
Journey map of current systems that employ sound and vision to locate objects.
Spatial Audio
Furthermore, to better understand how people associate direction with sound, I looked into research done in the spatial audio space. Two of my main takeaways were that:
1. People can differentiate the direction of sounds very accurately, down to a difference of roughly 5 degrees.
2. With the addition of visuals, people are best able to locate the object a sound is emitting from when the object is directly in front of them.
Concept Ideation
Drawing on the current user journey for locating objects, as well as on how sound can be used to manipulate people's understanding of space, I began to ideate concepts. I wanted to create an experience that distorts a person's perception of space by changing the current paradigms around sound and distance:
1. People are in a room attempting to locate an object. Surrounding them, however, are four identical objects.
2. Sounds emit from the objects, but they are designed to alter people's perception of distance: as visitors get closer to an object, its sound becomes quieter and plays at longer intervals, while the sounds from behind them grow louder (see the sketch after this list).
3. It is only when they figure out the pattern behind the sounds that they can locate the correct object.
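As a rough illustration of that inverted rule, a single object's cue could be expressed as a Unity (C#) component like the sketch below. The field names and ranges are assumptions for illustration, not values from the actual installation.

```csharp
using UnityEngine;

// Inverted cue for one object: the closer the visitor gets, the quieter the
// sound and the longer the gap between pulses, so distant objects read as near.
public class InvertedCue : MonoBehaviour
{
    public Transform visitor;        // tracked visitor position (assumed)
    public float roomRadius = 5f;    // assumed scale of the space, in metres
    public float maxInterval = 2f;   // seconds between pulses when right next to the object
    public float minInterval = 0.2f; // seconds between pulses when far away

    // Returns a volume (0-1) and a pulse interval (seconds) for this object.
    public (float volume, float interval) Evaluate()
    {
        float d = Vector3.Distance(visitor.position, transform.position);
        float t = Mathf.Clamp01(d / roomRadius);             // 0 = touching, 1 = far away
        return (t, Mathf.Lerp(maxInterval, minInterval, t)); // quieter and sparser up close
    }
}
```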
Initial Sketches
When considering the form of the object to be found, it needed to have a distinguishable outline that visitors could hold clearly in their heads. As a result, I landed on an egg: a form that suggests a round shape with a wide base and a sharper top.
Object ideation and prototyping
Two Person Interaction
Tech Diagram
Tech diagrams to demonstrate the technology behind the two-person interaction.
Tech Diagram for Two Person Interaction
Scenarios for Two Person Interaction
Prototyping
Motive to Unity to Arduino
The need to track the locations of two people in a space made the spatial lab the best option. For an idea this abstract, I wanted to make the prototype as close to the real experience as possible.
The spatial lab afforded me the use of motion tracking. The idea (sketched in code after this list) was that:
1. I would take the locations of the two visitors using Motive.
2. Calculate their distances to the object in Unity.
3. From those distances, play a corresponding sound at a certain volume from the Arduino.
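A minimal sketch of what this pipeline could look like on the Unity side, assuming the two visitors' rigid bodies are streamed from Motive into two Transforms (for example via OptiTrack's Unity plugin) and that the Arduino parses comma-separated distance values from serial. The port name, baud rate, and component names are placeholders rather than the project's actual code.

```csharp
using System.IO.Ports;   // needs the .NET 4.x API compatibility level in Unity
using UnityEngine;

// Motive -> Unity -> Arduino bridge: read tracked positions, compute each
// visitor's distance to the object, and stream those distances to the Arduino.
public class SoundBridge : MonoBehaviour
{
    public Transform visitorOne;    // rigid body streamed in from Motive
    public Transform visitorTwo;    // rigid body streamed in from Motive
    public Transform targetObject;

    SerialPort arduino = new SerialPort("COM3", 9600);   // placeholder port and baud rate

    void Start() => arduino.Open();

    void Update()
    {
        // 1. Take the locations of the two visitors (from Motive).
        Vector3 p1 = visitorOne.position;
        Vector3 p2 = visitorTwo.position;

        // 2. Calculate their distances to the object in Unity.
        float d1 = Vector3.Distance(p1, targetObject.position);
        float d2 = Vector3.Distance(p2, targetObject.position);

        // 3. Send the distances to the Arduino, which maps them to a sound
        //    at a corresponding volume and pulse timing.
        arduino.WriteLine($"{d1:F2},{d2:F2}");
    }

    void OnDestroy()
    {
        if (arduino.IsOpen) arduino.Close();
    }
}
```

On the Arduino side, the received distances would then be mapped to the speaker's volume and pulse timing.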
First person controlling volume
Second person controlling duration
First and second person controlling volume and duration
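The mapping behind the three prototypes above could look something like the sketch below: the first visitor's distance drives the volume, and the second visitor's distance drives how long each pulse lasts. The ranges, and the direction of each mapping, are assumptions that follow the inverted cue described in the concept.

```csharp
using UnityEngine;

// Illustrative mapping for the two-person prototype: the first visitor's
// distance controls the volume, the second visitor's distance controls the
// pulse duration. Ranges and directions follow the inverted-cue assumption.
public static class TwoPersonMapping
{
    const float RoomRadius = 5f;    // assumed size of the tracked space, in metres

    // First person controlling volume: farther from the object reads as louder.
    public static float Volume(float distanceOne) =>
        Mathf.Clamp01(distanceOne / RoomRadius);

    // Second person controlling duration: farther from the object means shorter pulses.
    public static float PulseDurationSeconds(float distanceTwo) =>
        Mathf.Lerp(2.0f, 0.2f, Mathf.Clamp01(distanceTwo / RoomRadius));
}
```

In the third scenario, both mappings run at the same time.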
Final Prototype
Single Person Interaction
Tech Diagrams
Tech diagrams to demonstrate the technology behind the single-person interaction. From the diagrams, it is clear that the single-person interaction is much more complicated than the two-person interaction.
Tech Diagram for Single Person Interaction
Scenarios for Single Person Interaction
Prototyping
For the single-person interaction, I used an Arduino, a sonar sensor, and a speaker to recreate the experience, this time on a much smaller scale.
The single-person interaction turned out to be far more complex than the two-person one. Because spatial audio is hard to convey through video, I found it more effective to communicate the concept by breaking the problem into parts and prototyping each part with the Arduino.
The prototypes below explore how the sound changes as the visitor moves along the different axes between objects (incorrect/correct and incorrect/incorrect).
Moving away from a filler object
Moving in between filler objects
Moving away from correct object
Moving in between filler and correct object
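The logic these prototypes break apart can be summarised as evaluating the inverted cue for each of the two objects at the ends of an axis as the visitor moves between them. The sketch below expresses that logic in C# for consistency with the Unity-side examples; in the actual prototypes the equivalent mapping ran on the Arduino, driven by the sonar sensor's distance reading, and the specific numbers are assumptions.

```csharp
using UnityEngine;

// Cues for two objects (A and B) placed at the ends of one axis, as a single
// visitor moves between them. Each object follows the inverted rule: closer
// means quieter and longer gaps between pulses.
public static class AxisCues
{
    const float AxisLength = 2f;    // assumed distance between the two objects, in metres

    // distanceToA is the sonar-style distance from the visitor to object A.
    public static (float volumeA, float intervalA, float volumeB, float intervalB)
        Evaluate(float distanceToA)
    {
        float tA = Mathf.Clamp01(distanceToA / AxisLength);  // 0 = at A, 1 = at B
        float tB = 1f - tA;                                  // normalised distance to B

        float volumeA = tA;                                  // A fades out as the visitor approaches it
        float volumeB = tB;
        float intervalA = Mathf.Lerp(2.0f, 0.2f, tA);        // longer gaps the closer the visitor is to A
        float intervalB = Mathf.Lerp(2.0f, 0.2f, tB);

        return (volumeA, intervalA, volumeB, intervalB);
    }
}
```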
Project Reflection
Thinking about how to change someone's perception of space was a way of designing that I had never really considered before, but after this project, I came to understand how powerful designing for the senses can be.
Moving forward with the project, I would want to explore enjoyable sounds versus harsh sounds, this time seeing how sound can aid the process of finding rather than interfere with it.