I built musical augmented reality experiences.

Minibeats is a project dedicated to harnessing augmented reality to create musical experiences that anyone with a smartphone can enjoy.

Had you asked me five years ago what I envisioned for my career, the phrase "augmented reality" wouldn't have crossed my mind.

Yet when the opportunity arose to work on this unique extension project with a subset of my Artiphon teammates, in collaboration with Snapchat and Warner Music, I eagerly hopped on the AR train.

I quickly went from knowing nothing about augmented reality to designing and scripting fully functional AR experiences using Snap's Lens Studio and JavaScript. I discovered a knack for interaction design, which paired beautifully with my extensive musical experience in this unique context.
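To give a sense of what that scripting looks like, here's a minimal Lens Studio-style JavaScript sketch that plays a sound on a screen tap. It's an illustration of the pattern rather than code from our lenses, and the `audio` input name is just an example.

```javascript
// Minimal Lens Studio script: play a sound when the user taps the screen.
// The //@input line exposes a field in the Lens Studio Inspector,
// where an AudioComponent gets attached (name is illustrative).
//@input Component.AudioComponent audio

// TapEvent fires whenever the user taps anywhere on the screen.
script.createEvent("TapEvent").bind(function (eventData) {
    // Play the attached audio track once per tap.
    script.audio.play(1);
});
```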

I led the design of a lens featuring San Holo’s hit song “All The Highs,” which has amassed over 8 million plays to date. 🤯

Ideation

The Question: How can AR technology be employed in a musical context?
The Answer: With AR, anything can become a musical instrument: your body, the furniture, your dog… you name it.

My small team and I started off with a classic whiteboard brainstorm.

We asked ourselves:

  • What everyday gestures could be made musical?

  • How could a user interact with their surroundings in different ways to create music?

Design and Documentation

I took on the role of outlining design principles and leading documentation and specification.

The design principles, the conditions that each of our lenses needed to meet, included visual audio reactivity, playability through AR interaction or through screen taps, and the ability to work with either the front or rear camera.

I knew outlining these requirements ahead of time would allow for smooth lens QA in the future.
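To sketch what meeting these principles can look like in script, here's a hedged Lens Studio-style example covering two of them: playability through screen taps and support for both cameras. The input names (`frontCameraContent`, `rearCameraContent`) are illustrative, not taken from our actual projects.

```javascript
//@input Component.AudioComponent audio
//@input SceneObject frontCameraContent // e.g. a face-attached instrument
//@input SceneObject rearCameraContent  // e.g. a world-placed instrument

// Screen taps always trigger a note, so the lens stays playable
// even if the user never reaches into AR space.
script.createEvent("TapEvent").bind(function (eventData) {
    script.audio.play(1);
});

// Swap the visible content whenever the user flips the camera,
// so the lens works with both the front and rear cameras.
script.createEvent("CameraFrontEvent").bind(function (eventData) {
    script.frontCameraContent.enabled = true;
    script.rearCameraContent.enabled = false;
});

script.createEvent("CameraBackEvent").bind(function (eventData) {
    script.frontCameraContent.enabled = false;
    script.rearCameraContent.enabled = true;
});
```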

I created a detailed spec template, stressing to my team that every lens should be fully specced before development began. This step would let us maintain clear alignment, with a single point of reference.

Interaction and UX stood at the core of these spec documents: what the user could do, and what would happen if and when they did it.

Development

With a vast collection of ideas in place and a handful of spec documents written, my team began lens development.

We divided the lenses among the four of us who were familiar with scripting for AR in Lens Studio, keeping all projects and files in a Git repository for organized versioning and collaboration.


User Research

Throughout the development process, a teammate and I took to the streets of Manhattan, offering passers-by the chance to play our lenses in exchange for a $20 bill.

Our goal was to see what would happen if someone came across our experiences on their own when swiping through Snapchat lenses. Would they know what to do?

Through real-world user testing, we quickly found that users weren't inclined to interact with AR space. Because our project was so novel, users couldn't figure out what to do without being told; they defaulted to tapping the screen, missing the AR "wow factor" altogether.

Thanks to our user research, we now knew we had to implement simple visual and textual "hints" upon lens open to nudge users toward interacting in AR. Once hints were in place (see the sketch below), our next round of user testing was much more successful.
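As a rough sketch of the hint pattern (assuming a screen-space Text component; the names are illustrative, and this isn't our production code):

```javascript
// Show a hint when the lens opens, then hide it after the user's
// first interaction so it doesn't clutter the experience.
//@input Component.Text hintText

var hintDismissed = false;

// Display the prompt as soon as the lens starts.
script.createEvent("OnStartEvent").bind(function (eventData) {
    script.hintText.text = "Tap your surroundings to play!";
    script.hintText.getSceneObject().enabled = true;
});

// Dismiss the hint on the first tap.
script.createEvent("TapEvent").bind(function (eventData) {
    if (!hintDismissed) {
        hintDismissed = true;
        script.hintText.getSceneObject().enabled = false;
    }
});
```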

Before hints: Users didn't discover AR functionality on their own.

With hints: Users easily understood that they could interact with lenses in AR space.

Results

With our first lens surpassing 5 million plays in its first month live, it's safe to say that the Minibeats project was a success.

Minibeats eventually grew from a collection of Snapchat lenses into a full iOS app of interactive musical experiences.