
Perform music using Soundverse AR

Soundverse is an AR-based music performance environment with MIDI-enabled AR instruments and visual music lenses that respond to the music being played. It was built as a passion project over a summer.

“How might we enable on-the-go hobby musicians to create music in an immersive way, without carrying instruments?”

Overview

Concept Video

Process
 

In the summer of 2018, I travelled extensively across multiple countries. I constantly felt the urge to compose music on the go, but taking a studio with me was a big challenge. The products available at the time had limitations: they were large, heavy and not very mobile. So I took on a side passion project to solve this problem and built an extended-reality-based performance environment called Soundverse. The project was conceptualised, designed and developed by me, and later tested with musicians.

Limitations of current music-making products from the point of view of an on-the-go musician

top music creation problems

Persona of the "on-the-go part-time musician"

•   Has a day job and plays music for fun and passion


•   Often likes to collaborate with various artists to fill gaps in, or improve, existing tracks


•   Invests in stock gear, and often in new music gear, since they have a stable source of income


•   Enthusiast and visionary: someone who would use a new product that is not yet commercially available and form an opinion based on that experience. If they like it, they spend on the product progressively rather than all at once.

tech user persona

Setting the scope

 

•   Choosing the medium: I asked questions like, "How can I reach the most people?", "Which AR medium has the least audio latency?", "How do I make this vivid vision real?", "What are the most important problems to solve?" and "What are the practical limitations of building an augmented-reality-based product?" Asking these questions helped me zero in on a medium that could reach at least a couple of million users. ARKit on iOS boasts a user base of 62 million people, so I chose to go ahead with mobile AR as the medium.


•   Choosing the direction: I took the ultimate vision of building a mixed-reality digital audio workstation with hand tracking, and narrowed the scope to a direction where a part of it could be realized today.

Design
 

Principles

•   Pluggability into the existing music-making ecosystem
•   Familiarity
•   Easy to Experience
•   Shareable

principles of ar music creation

Mobile AR has two layers to it. Both need to follow the same design language, and one layer often affects the other.


•   AR World Design: World design covers the visual language, affordances and behaviour of the 3D components. These components have to be designed keeping in mind that the user gets roughly 6 inches of real estate on a mobile phone to view and interact with them. Another consideration is whether the experience will be one-handed or two-handed; I chose to focus on a one-handed experience, so the instruments were designed accordingly.


•   Mobile App - Experience Design: Mobile app design builds on already defined UX patterns: visual design, navigation design, information architecture, user onboarding, the main app experience and so on. In this layer, another crucial part was designing the music-performance workflow.
On top of these two layers sits the "music-making workflow" layer, which defines what value needs to be provided to the on-the-go hobby musician within the context of mobile AR, where screen real estate is a big design constraint. The sketch below illustrates how the two underlying layers can coexist in code.
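To make the two-layer split concrete, here is a minimal sketch, assuming a standard ARKit + SceneKit setup, of how the AR world layer (an ARSCNView hosting the 3D instruments) and the mobile app layer (conventional UIKit controls drawn on top) can share one screen. The class name and the record button are illustrative, not Soundverse's actual implementation.

```swift
import UIKit
import ARKit

// Illustrative sketch: one view controller hosting both layers of a mobile AR app.
final class PerformanceViewController: UIViewController {

    // Layer 1: the AR world, i.e. the SceneKit scene where the 3D instruments live.
    private let arView = ARSCNView()

    // Layer 2: the mobile app UI, conventional UIKit controls drawn on top.
    private let recordButton = UIButton(type: .system)

    override func viewDidLoad() {
        super.viewDidLoad()

        arView.frame = view.bounds
        arView.autoresizingMask = [.flexibleWidth, .flexibleHeight]
        view.addSubview(arView)

        recordButton.setTitle("Record", for: .normal)
        recordButton.frame = CGRect(x: 20, y: view.bounds.height - 80, width: 120, height: 44)
        view.addSubview(recordButton)   // app layer sits above the AR layer
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        // Track horizontal surfaces so instruments can be anchored in the world.
        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = [.horizontal]
        arView.session.run(configuration)
    }
}
```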

AR World Design and building a proof of concept

environment variable in AR

To design the AR component of the concept, I used the four pillars of AR UX to arrive at a well-rounded experience:

1.  Environments
2.  Interface
3.  Interactions
4.  Navigation

Learnings from working on the world design:


•   The platform directly influences the design of the 3D models: for mobile, I designed the instruments so they can be played with just one hand (the other hand is usually occupied holding the phone) and elevated the face of each instrument to 45 degrees to increase tap-ability (see the sketch after this list). The same instrument would look significantly different on an AR headset.
•   Making it look relatable and interesting: some mobile AR experiences tend to look too futuristic, and a big chunk of users find it difficult to relate to such visuals. I wanted a visual language that brings a touch of the human element to the experience. I tried various styles and used product semantics; the general theme I narrowed down to was more or less vintage, retro and human.
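As a concrete example of the one-handed, MIDI-enabled interaction described above, here is a minimal sketch assuming SceneKit instrument nodes and plain CoreMIDI output: a screen tap is hit-tested against a pad node tilted roughly 45 degrees toward the user, and a MIDI note-on is sent to the first available destination when the tap lands. The node name "padNode", the note numbers and the helper functions are hypothetical, not Soundverse's actual API.

```swift
import UIKit
import SceneKit
import ARKit
import CoreMIDI

// Hypothetical sketch: tap an AR pad, send a MIDI note-on.
var midiClient = MIDIClientRef()
var outPort = MIDIPortRef()

func setUpMIDI() {
    MIDIClientCreate("SoundverseSketch" as CFString, nil, nil, &midiClient)
    MIDIOutputPortCreate(midiClient, "SoundverseOut" as CFString, &outPort)
}

func sendNoteOn(_ note: UInt8, velocity: UInt8) {
    var packet = MIDIPacket()
    packet.timeStamp = 0
    packet.length = 3
    packet.data.0 = 0x90          // note-on, channel 1
    packet.data.1 = note
    packet.data.2 = velocity
    var packetList = MIDIPacketList(numPackets: 1, packet: packet)
    let destination = MIDIGetDestination(0)   // first connected MIDI destination
    MIDISend(outPort, destination, &packetList)
}

// Pad geometry tilted about 45 degrees toward the user for easier one-handed taps.
func makePadNode() -> SCNNode {
    let pad = SCNNode(geometry: SCNBox(width: 0.08, height: 0.01, length: 0.08, chamferRadius: 0.01))
    pad.name = "padNode"
    pad.eulerAngles.x = -Float.pi / 4
    return pad
}

// Handle a tap from a UITapGestureRecognizer attached to the ARSCNView.
func handleTap(_ gesture: UITapGestureRecognizer, in sceneView: ARSCNView) {
    let location = gesture.location(in: sceneView)
    if let hit = sceneView.hitTest(location, options: nil).first,
       hit.node.name == "padNode" {
        sendNoteOn(60, velocity: 100)   // middle C
    }
}
```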

Mobile App - Experience Design

Information Architecture

App Navigation Flow - Soundverse

UX Navigation Flow


Final Design

Try the UX Prototype

Testing

•   Tested the app with over 200 musicians and let them use it for the music-making process. People found the music lenses to be the most distinctive part of the experience.


•   The built-in AR synthesizer was used the most and was seen as a breakthrough in musical instruments; in particular, making new and unique sounds was seen as much easier and more desirable.


•   People found it cool, yet uncomfortable to use over long periods of time. Month-1 retention was below 75%.


•   Launching Ableton Live clips from the app was seen as a desirable feature (a sketch of how this could work over MIDI follows below).
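Ableton Live lets any clip slot be bound to a MIDI note through its MIDI Map Mode, so clip launching from the phone can be sketched as sending that mapped note over a CoreMIDI network session. This is a minimal sketch under those assumptions; the note number 36 and the names used here are illustrative, not the app's actual implementation.

```swift
import CoreMIDI

// Hypothetical sketch: launch a MIDI-mapped Ableton Live clip from the phone.
// Assumes the clip slot was bound to note 36 via Ableton's MIDI Map Mode.
var clipClient = MIDIClientRef()
var clipPort = MIDIPortRef()

func setUpClipLaunching() {
    // Expose the phone over a CoreMIDI network session so Ableton running on a
    // laptop can receive the notes over Wi-Fi.
    let session = MIDINetworkSession.default()
    session.isEnabled = true
    session.connectionPolicy = .anyone

    MIDIClientCreate("ClipLauncher" as CFString, nil, nil, &clipClient)
    MIDIOutputPortCreate(clipClient, "ClipOut" as CFString, &clipPort)
}

func launchMappedClip() {
    var packet = MIDIPacket()
    packet.timeStamp = 0
    packet.length = 3
    packet.data.0 = 0x90    // note-on, channel 1
    packet.data.1 = 36      // the note mapped to the clip slot in Ableton
    packet.data.2 = 127
    var packetList = MIDIPacketList(numPackets: 1, packet: packet)
    MIDISend(clipPort, MIDIGetDestination(0), &packetList)
}
```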

Hi, I'm Sourabh Pateriya.

Currently I'm building Soundverse Inc., which I founded in June 2023. I'm a product leader who has led teams at Spotify, Samsung and Tobii, with experience in generative AI, music tech, extended reality, eye tracking, voice assistants and mobile. I hold 10+ patents.
