AR Drawing Tool
Role //
Interaction Designer, Prototyper
Duration //
September — December 2023
Tools //
Unity (C#), OptiTrack, Motive, ESP32
A spatial computing application that explores how 2D tools built for the computer screen can be brought into physical space.
Project Ideation
When thinking about the possibilities of spatial computing, I was drawn to what combining physical and digital interactions could afford the user. Spatial computing opens up a whole new set of interactions: users can manipulate artifacts both physically and digitally while moving freely around the space.
With this in mind, I wanted to experiment with bringing a 2D interaction into spatial computing. The pen tool in many digital applications follows a well-defined paradigm, but what could moving it into physical space add to the interaction?
Background Research
Study of current applications
To start the project, I explored how current applications let users draw. It was interesting to see that even though the final product exists on a digital screen, the interactions used to create it are physical and take inspiration from the physical act of drawing.
Main Features Explored
I wanted to create a drawing application that fully exercises the capabilities of spatial computing. I explored:
1. How users can bring their drawings out of the digital screen into physical space, and then back to the screen.
2. Different ways the user can control color and brush width when drawing in space. I wanted these interactions to take advantage of the user's ability to move and act in physical space.
3. How haptics can support the act of drawing when users draw with an external device.
Prototyping
Drawing Interaction
To start this project, I created a simple drawing tool that draws a line on the wall of the studio. I wanted to make sure the drawing interaction itself was user-friendly before thinking about features such as changing color. I began by mocking up the interaction in Unity, substituting a cylinder shape for the controller. I used a raycast to find where the controller was pointing on the wall and created a line from the hit points.
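In outline, the mock-up worked something like the sketch below, assuming the stroke is rendered with a LineRenderer. The `penTip` reference, the mouse-button stand-in for the real trigger, and the spacing value are illustrative, not the exact setup.

```csharp
using System.Collections.Generic;
using UnityEngine;

// Minimal sketch: cast a ray from the stand-in "pen" (a cylinder) toward the
// wall and append each hit point to a LineRenderer while the stroke is active.
public class WallDrawer : MonoBehaviour
{
    public Transform penTip;              // the cylinder standing in for the controller
    public LineRenderer currentLine;      // renders the stroke on the wall
    public float minPointSpacing = 0.01f; // metres between recorded points

    private readonly List<Vector3> points = new List<Vector3>();

    void Update()
    {
        // Draw while the mouse button (placeholder for the real trigger) is held.
        if (!Input.GetMouseButton(0)) return;

        if (Physics.Raycast(penTip.position, penTip.forward, out RaycastHit hit))
        {
            // Only record a point once the pen has moved far enough; this keeps
            // the line from accumulating thousands of near-duplicate points.
            if (points.Count == 0 ||
                Vector3.Distance(points[points.Count - 1], hit.point) > minPointSpacing)
            {
                points.Add(hit.point);
                currentLine.positionCount = points.Count;
                currentLine.SetPosition(points.Count - 1, hit.point);
            }
        }
    }
}
```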
Drawing Application in Unity
After mocking up the interaction in Unity, I used OptiTrack and Motive to track the controller's position in the room.
This way, I could bring the interaction from Unity into the physical space.
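In outline, the tracked pen can be driven by data streamed from Motive, something like the sketch below, assuming the official OptiTrack Unity plugin is in the project. The rigid body ID is a placeholder that has to match the streaming ID assigned to the pen's marker set in Motive.

```csharp
using UnityEngine;

// Sketch of driving the virtual pen from Motive's streamed rigid-body data,
// assuming the official OptiTrack Unity plugin is in the project.
public class TrackedPen : MonoBehaviour
{
    public OptitrackStreamingClient streamingClient; // scene component from the plugin
    public int rigidBodyId = 1;                      // assumed ID; set to match Motive

    void Update()
    {
        OptitrackRigidBodyState state = streamingClient.GetLatestRigidBodyState(rigidBodyId);
        if (state != null)
        {
            // Apply the tracked pose so the in-Unity pen mirrors the physical one.
            transform.localPosition = state.Pose.Position;
            transform.localRotation = state.Pose.Orientation;
        }
    }
}
```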
Initial Drawing Application Test
Bringing the application into the room was good progress, but the interaction felt rough. Unity had a hard time processing the position of the raycast hit, which was very sensitive to how I rotated and moved the controller.
To improve the interaction, I wanted to tune the parameters that determine how sensitive the system is, to better aid the act of drawing.
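One way to do this, sketched below, is to low-pass filter the raw hit point and ignore sub-threshold movements. The smoothing factor and dead-zone values here are illustrative and would be tuned by feel.

```csharp
using UnityEngine;

// Sketch of one way to tame the jitter: exponentially smooth the raw raycast
// hit point and ignore movements below a distance threshold.
public class SmoothedHit
{
    public float smoothingFactor = 0.2f;  // 0 = frozen, 1 = raw input
    public float deadZone = 0.005f;       // metres; ignore micro-movements

    private Vector3 smoothed;
    private bool hasValue;

    public Vector3 Filter(Vector3 rawHit)
    {
        if (!hasValue)
        {
            smoothed = rawHit;
            hasValue = true;
            return smoothed;
        }

        // Skip updates that are likely sensor noise rather than intent.
        if (Vector3.Distance(rawHit, smoothed) < deadZone)
            return smoothed;

        // Low-pass filter: move a fraction of the way toward the new sample.
        smoothed = Vector3.Lerp(smoothed, rawHit, smoothingFactor);
        return smoothed;
    }
}
```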
Drawing Application after Recalibration
Final Drawing Interaction
Changing Color Interaction
Next, I prototyped an interaction for changing colors. I wanted users to feel as if a physical color palette sat in front of them, blending physical and digital interactions.
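In outline, each palette block can be a trigger volume that sets a shared brush color when the pen tip dips into it. The `PenTip` tag and the `BrushSettings` holder are assumed names for this sketch, and the pen tip needs a collider plus a Rigidbody for Unity trigger events to fire.

```csharp
using UnityEngine;

// Sketch of the palette interaction: each block carries a trigger collider
// and a swatch color. Dipping the pen tip into a block sets the brush color.
public class ColorSwatch : MonoBehaviour
{
    public Color swatchColor = Color.red;

    void OnTriggerEnter(Collider other)
    {
        if (other.CompareTag("PenTip"))
        {
            BrushSettings.ActiveColor = swatchColor;
        }
    }
}

// Hypothetical shared brush state, read by the drawing script
// whenever it starts a new stroke.
public static class BrushSettings
{
    public static Color ActiveColor = Color.black;
}
```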
Changing Color Interaction in Unity
Changing Color Interaction Iter01
Although this interaction used tracking, it did not feel like it took full advantage of the spatial medium. The three blocks restricted how users could move around the space and ended up simulating the buttons of a digital interface.
Thus, moving forward, I wanted to think more about how to mesh physical and digital interactions seamlessly in the space. To bring more physicality into the interaction, I came up with the idea that users would change colors by swapping an attachment on the pen.
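One way this could work, assuming each color cap is tracked as its own OptiTrack rigid body, is sketched below: whichever cap sits within a small docking radius of the pen tip is treated as attached, and its color becomes the brush color. The docking radius and the `BrushSettings` holder from the palette sketch are assumptions here, not the final implementation.

```csharp
using System.Collections.Generic;
using UnityEngine;

// Sketch of the attachment idea: each color cap is a tracked object, and the
// cap currently docked to the pen tip determines the active brush color.
public class AttachmentColorPicker : MonoBehaviour
{
    [System.Serializable]
    public struct Cap
    {
        public Transform trackedCap; // transform driven by the cap's rigid body
        public Color color;
    }

    public Transform penTip;
    public List<Cap> caps = new List<Cap>();
    public float dockRadius = 0.03f; // metres; tuned to the physical fit

    void Update()
    {
        foreach (Cap cap in caps)
        {
            if (Vector3.Distance(cap.trackedCap.position, penTip.position) < dockRadius)
            {
                BrushSettings.ActiveColor = cap.color; // shared state from the palette sketch
                break;
            }
        }
    }
}
```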
Drawing Interaction Iter02