Proof of concept video
I tend to go to as many music concerts as possible. Last year, while attending Rock am Ring, I felt that the visual engagement through lights and pyrotechnics is limited to the stage area only. Can this problem of limited visual engagement be solved by XR?
Focusing on one specific problem and making a problem statement
There were various ways to tackle this problem, for example elevating the experience from the audience's point of view, the performer's point of view, or both. I decided to design the experience in such a way that both the audience and the performer feel more visually engaged.
I chose a very specific context of live performance. Electronica artists tend to use a pre-planned workflow which consists of live music, instruments, audio effects, automated mixing and visuals. Let's call this artist "Apex".
Apex likes to use Ableton Push as a live music instrument, which connects to a laptop. She also uses a third-party plugin to generate some cool real-time visuals, which are usually projected on a big screen. Since the big screen is usually behind her, she misses all the visuals and some key moments that the audience would connect to and enjoy. Is there an easy way to bring the visualisation to Apex?
Music Lens: To make the audience's experience more immersive, introduce a visualisation that is in sync with the music. I chose a wave as the basic visualisation because it communicates the BPM of a song really well. (Also, it's pretty easy to make. ssshhhh...)
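The "in sync with the BPM" part boils down to very little maths: one wave cycle per beat. Here's a minimal sketch of that idea (the function name and the decay envelope are my own illustrative choices, not part of the actual build):

```python
import math

def wave_amplitude(t, bpm, decay=0.3):
    """Amplitude of a BPM-synced ripple at time t (in seconds).

    One full sine cycle per beat, so the pulse visibly matches the tempo.
    A mild exponential decay softens the tail of each pulse.
    """
    beat_period = 60.0 / bpm                  # seconds per beat
    phase = (t % beat_period) / beat_period   # 0..1 within the current beat
    return math.sin(2 * math.pi * phase) * math.exp(-decay * phase)

# At 120 BPM a beat lasts 0.5 s, so the wave pulses twice per second.
print(round(60.0 / 120, 2))  # 0.5
```

In the actual scene this value would drive the wave's displacement each frame, but the driving signal is the same beat-locked phase.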
Virtual Screens: To allow the performer to see the state of the DAW (Ableton Live) and the visualisation. For this part, I needed an object anchor that augmented information can hook onto. Since performers usually have unique-looking instruments, I decided to use the Ableton Push 2 (instrument) as the object anchor and augment a visual layer on top of it.
To sketch 3D XR ideas, I use a printout of an axonometric (isometric) projection grid and draw various ideas on it. Here I drew a rough arrangement of the virtual screens and how the Music Lens would look.
Of the two options, A and B, for the Virtual Screens, option A made more sense, as we don't want a virtual screen coming between the performer and the audience.
For the Music Lens, I felt that if it originated from the instrument itself and then rippled out across the live arena, it would communicate a deeper meaning: the music performed through that instrument propagates throughout the arena like a sound wave.
Making a point cloud of the target object (the Ableton Push 2 instrument):
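The actual point cloud comes from scanning the physical Push 2 with a tracking SDK's scanner. To illustrate the underlying idea, here's a minimal sketch of one common step: collapsing a dense raw scan into a sparser, evenly-spaced cloud via a voxel grid (pure Python, hypothetical data; a real pipeline would use the SDK's own tooling):

```python
from collections import defaultdict

def voxel_downsample(points, voxel=0.01):
    """Collapse a raw point cloud into one averaged point per voxel.

    points: list of (x, y, z) tuples in metres.
    voxel:  edge length of each grid cell (1 cm by default).
    """
    buckets = defaultdict(list)
    for p in points:
        # Voxel index of the point: which grid cell it falls into.
        key = tuple(int(c // voxel) for c in p)
        buckets[key].append(p)
    # Average the points that fell into the same voxel.
    return [
        tuple(sum(c) / len(pts) for c in zip(*pts))
        for pts in buckets.values()
    ]

# Two nearby points merge into one; a distant point stays separate.
raw = [(0.001, 0.0, 0.0), (0.002, 0.0, 0.0), (0.5, 0.0, 0.0)]
print(len(voxel_downsample(raw)))  # 2
```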
Building virtual screens as a prefab in Unity:
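In Unity, the screens live in a prefab parented to the object anchor, so they follow the tracked instrument automatically. As an engine-agnostic sketch of what that parenting does under the hood, here's the anchor-relative placement maths (yaw-only rotation for brevity; all names are my own, not Unity API):

```python
import math

def anchor_to_world(anchor_pos, anchor_yaw, local_offset):
    """Place a virtual screen relative to the tracked instrument.

    anchor_pos:   (x, y, z) world position of the object anchor, metres.
    anchor_yaw:   anchor's rotation about the vertical axis, radians.
    local_offset: screen position in the anchor's local frame.
    """
    x, y, z = local_offset
    c, s = math.cos(anchor_yaw), math.sin(anchor_yaw)
    # Rotate the local offset by the anchor's yaw, then translate.
    wx = anchor_pos[0] + c * x + s * z
    wz = anchor_pos[2] - s * x + c * z
    return (wx, anchor_pos[1] + y, wz)
```

With yaw 0, a screen offset 30 cm up and 20 cm behind the Push 2 simply lands at the anchor position plus that offset; as Apex turns the instrument, the rotation term keeps the screens attached to it.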
Creating waves using particle editor:
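The waves themselves were authored in Unity's particle editor. As a hedged, language-neutral sketch of the behaviour I was after (a new ring emitted on every beat, expanding outward from the instrument and dying after a fixed lifetime), assuming illustrative speed and lifetime values:

```python
def ripple_radii(t, bpm, speed=2.0, lifetime=2.0):
    """Radii of all ripples alive at time t (seconds).

    A new ripple is emitted on every beat and expands at `speed` m/s,
    dying after `lifetime` seconds -- analogous to the particle
    system's emission-per-beat and particle-lifetime settings.
    """
    beat_period = 60.0 / bpm
    radii = []
    beat = 0
    while beat * beat_period <= t:
        age = t - beat * beat_period  # seconds since this ripple spawned
        if age < lifetime:
            radii.append(age * speed)
        beat += 1
    return radii

# One second into a 120 BPM track: three rings are alive,
# emitted at t = 0.0, 0.5 and 1.0 s.
print(ripple_radii(1.0, 120))  # [2.0, 1.0, 0.0]
```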