

interaction design

an artistic music manipulation app that provides a collaborative environment to express yourself and listen to music together with friends when you’re apart.

Mockup of WAVE logo on an iPhone screen


Our team was pushed to imagine what capabilities of an everyday phone camera went unused and how we could integrate those technologies into augmented reality. The entire project took place remotely, so all of our team meetings were online. This was a major challenge that especially affected the early stages of the process, which relied heavily on collaboration and ideation. 


Monica Ionescu

Brayan Jimenez

Rebecca Rhee

Yuna Shin

Joyce Lin

made for

Music lovers

People isolating during COVID

my role

UI/UX Designer


3 weeks

Fall 2020

How can we take advantage of AR technology that is readily available in our cameras and turn it into an interactive experience?


Our collaboration began on Figma with a simple rule: any idea goes. During this stage we identified a few key themes we were interested in: surveillance, storytelling/sensory experience, information & safety, and AR guidance. We eventually landed on storytelling/sensory experience because we felt it allowed the most creative freedom and fun in the process. 


We were drawn to sound and the power it holds over memory and nostalgia, and we wanted to create an integrated sound experience. To develop the idea, we created a storyboard that illustrates a motivation for use. 

sketch split in half; a person is listening to music in bed (left) and an individual doing work at a desk (right)
Sketch of an individual picking up their phone and putting in headphones

1. Jo is doing work on their computer with their phone nearby. The phone lights up. 

2. Jo opens the app and puts their AirPods in. 

Sketch of an individual with headphones in, pointing their phone camera to scan their room

3. The app prompts Jo to point the phone camera at the environment while emitting music. 

Sketch split in half, both sides show users pointing their phone cameras to their environment. The phone has sound waves that seem to interact with the environment
Sketch split in half; the left side shows a hand interacting with sound waves on a phone screen. The right shows an individual seeing the interaction on a different phone

4. Billie, who invited Jo to use the app with him, also points his phone into the environment. The app begins visualizing the sounds of the song they are listening to together. 

5. Billie triggers the manipulation of a sound wave using the app. Jo's phone reflects this trigger. 

6. Both continue to manipulate audio accompanied by the AR visualizations. The synced sound visuals projected into their spaces interact with the objects in their respective environments. 

A sketch of two phone screens - one reads "save" while the other reads "recent"

7. After the session ends, a recording of their visuals can be saved for future playback. 



One of the most difficult considerations in this project was how the user would actually interact with the product. We had to think of gestures that were both possible while holding the phone and intuitive. We also had to consider which aspects of sound the user could manipulate and how to provide accurate feedback for those changes. With that in mind, we came up with the following gestures and how each would affect the sound. 

5 hands outlining motions for interacting with the Wave App. This includes position, volume, beat and duration, pitch, and timeline edits.


Our timeline for this project was incredibly short. Not only did we need to develop a design system, but we also needed to learn how to render in 3D and create a video presentation of the work. We split the tasks among the team: Joyce, Rebecca, and I worked on the design system for the app itself, while Yuna and Brayan completed the animation and video portion of the project. 


We moved quickly through the prototyping process. We came in with some sketches and general ideas of the user flow, but jumped right into prototyping on Figma. With so little time, we needed to get to work right away, generating ideas quickly and communicating them efficiently. We grounded the wireframes in the narrative we had created earlier, which helped us focus on the key features and interactions of the app. 

Rough mockup of a glowing orb on a phone camera which faces an office space
Rough mockup of three glowing orbs on a phone camera which faces an office space
Rough mockup of changing song time with three glowing orbs on a phone camera which faces an office space
"Save Session" pop up on a phone screen with three glowing orbs and an office space in the background

High-Fidelity Wireframes

Our final design proposal, like the wireframes, came together very quickly under minimal time. Below are a few slides representing the key visual designs of the app, and the video showcases how the application is intended to be used. The full video can be found here.

Phone screen that shows how to invite friends to use the Wave app
Phone screen which shows how music is visualized in the AR world of Wave
Phone screen which shows how to add sound effects to a jam session
phone screen that shows when someone is virtually interacting with one of the AR orbs on the screen
Phone screen with the interactive hand gestures outlined
Phone screen with the Wave feed showing collaborative sessions between friends
Phone screen with images of previous sessions labeled "your gallery"


Overall, this project was really fun and let us all be very creative in the process. Figuring out how to transfer teamwork to a fully remote setting was difficult; we faced technology and timing issues that normally wouldn't have been obstacles, and the process was very rushed. With more time, I would have liked to do user research on gestures and colors. None of us had experience with 3D rendering, so we also would have liked more time to explore our options for generating 3D objects. 
