Augmented Reality and Virtual Reality are becoming huge new platforms. How do we adapt to, and take advantage of, the potential of these new technologies to expand how we create, experience and interact with audio and music?
We envision several applications & product concepts that address this opportunity, summarized here and described in depth in this report:
- Augmented Reality: mixing
- Augmented Reality: control
- Virtual Reality: customized virtual controllers
- Virtual Reality: 3D mixing, literally
- Virtual Reality: next level sound design and performance
- Virtual Reality: music education
Augmented Reality vs. Virtual Reality
Efficiencies of AR/VR
Augmented Reality: mixing
Augmented Reality: control
Virtual Reality: customized virtual controllers
Virtual Reality: 3D mixing, literally
Virtual Reality: next level sound design and performance
Virtual Reality: music education
Conclusion
Appendix: Notes
Appendix: Backup Workgroup Names

Augmented Reality vs. Virtual Reality

With AR, you are experiencing the real world, with MSOT (more shit on top).
With VR, you are completely inside and immersed in a virtual world.

Efficiencies of AR/VR

- Equipment is too big, potentially too expensive, and fixed
- End user has access to virtualized gear
- Virtualized gear exists today in software, and for many it is an adequate experience; but if you want to touch that piece of gear today, you have to buy it, store it, and lug it around.
- Collaboration
- Offers a way to work remotely more effectively
- People are too big
- Puts all controls at arm’s reach. Everything is immediately available in the same room.
- A recording musician can control the DAW from where the instrument is being played.
- Life is too short
- Cost savings for manufacturing? Or are we just moving costs from hardware into software?

Augmented Reality: mixing

- Virtual masking tape on a mixing console
- Visually connect mixer channels with players & sound sources on stage

Augmented Reality: control

- Parameter information & ranges
- Virtual MIDI controllers
- Assemble and virtually map software control onto super-cheap (or not!) physical knobs & sliders
- Build custom controllers from household items or existing gear – your kitchen mixer can be an audio mixer!
- Can be way more complex – leaves of a tree? a steering wheel? clay?
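To make the "household knob as audio controller" idea concrete, here is a minimal sketch of the mapping step only: once an AR tracker reports the rotation angle of some tracked physical knob (the tracking itself is assumed and out of scope), that angle can be quantized into a standard 3-byte MIDI Control Change message. The function name, the 270-degree knob sweep, and the CC assignment are illustrative assumptions, not anything specified in this report.

```python
def knob_angle_to_cc(angle_deg: float, cc_number: int, channel: int = 0) -> bytes:
    """Map a tracked knob rotation (degrees) to a MIDI Control Change message.

    Assumes a typical 270-degree knob sweep; angles outside that range
    are clamped. Returns the raw 3-byte message: status, CC number, value.
    """
    angle = max(0.0, min(270.0, angle_deg))
    value = round(angle / 270.0 * 127)          # scale sweep to 0..127
    status = 0xB0 | (channel & 0x0F)            # Control Change on this channel
    return bytes([status, cc_number & 0x7F, value & 0x7F])
```

For example, a knob turned halfway (135 degrees) mapped to CC 7 (volume) yields the bytes `B0 07 40`; any real system would then hand these to a MIDI output port.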

Virtual Reality: customized virtual controllers

- With virtualized hardware, the end user can take a known piece of gear and change any aspect of it to accommodate a particular workflow
- Your virtual controller can represent anything – a tree? a steering wheel? Clay?
- Takes “alternate” controllers to a whole new level
- Haptics, physical and visual feedback cues, would be crucial to the utility of virtual controllers

Virtual Reality: 3D mixing, literally

- Moving sound objects around a 3D space to control gain and pan
- Changing the walls, ceilings, and elements of the space = reverb design
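As a sketch of how moving a sound object in 3D might drive conventional mix parameters, the snippet below maps an object's position relative to the listener to an inverse-distance gain and an equal-power stereo pan derived from azimuth. The coordinate convention (x = right, y = forward, listener at the origin) and the 1-meter reference distance are assumptions for illustration, not part of the report.

```python
import math

def position_to_gain_pan(x: float, y: float, z: float, ref_dist: float = 1.0):
    """Map an object's position (meters) to (gain, left, right) levels."""
    dist = math.sqrt(x * x + y * y + z * z)
    # Inverse-distance attenuation, clamped so very close objects don't blow up.
    gain = min(1.0, ref_dist / max(dist, 1e-6))
    # Azimuth: 0 = straight ahead, +90 degrees = hard right, clamped to +/-90.
    azimuth = math.atan2(x, y)
    pan = max(-1.0, min(1.0, azimuth / (math.pi / 2)))
    # Equal-power panning law keeps perceived loudness steady across the arc.
    left = math.cos((pan + 1.0) * math.pi / 4.0)
    right = math.sin((pan + 1.0) * math.pi / 4.0)
    return gain, left, right
```

An object straight ahead at the reference distance comes out centered at full gain; pushing it twice as far away halves the gain, and sliding it to the listener's right rotates energy into the right channel.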

Virtual Reality: next level sound design and performance

- Sound designers now have a Z-axis to play with (a Kaoss Cube rather than a Kaoss Pad), as well as 360-degree rotation
- And two hands even! And your head! And your body! And your feets!
- Add motion capture – movement & dance to make / influence / control sound
- The end of laptop musicians?
- Play to virtual audiences & virtual fans
- Create or even customize your virtual audience
- Or, with augmented reality, enhance thin crowds with virtual fans

Virtual Reality: music education

- Teacher and student share a virtual space
- Example:
- Violin teacher demonstrating bow technique
- Student has to (virtually) overlay her arm, hands, fingers on top of the teacher’s
- Haptic feedback can provide another sensor / input into the learning
- shock therapy when they’re wrong
- Popup tutorials for control surfaces
- Virtual ensemble: Play with your dream team
- Virtual dress rehearsal. Bad gig simulator. Disaster training
- Virtual conductor. Create virtual orchestra and rehearse your piece.
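The bow-technique overlay above implies something measurable: how closely the student's tracked pose matches the teacher's. A minimal sketch, assuming some tracking system supplies corresponding 3D joint positions for both people; the joint pairing, the 2 cm tolerance, and the linear ramp to full haptic feedback are all illustrative assumptions.

```python
import math

def alignment_error(teacher_joints, student_joints):
    """Mean Euclidean distance (meters) between corresponding tracked joints."""
    if len(teacher_joints) != len(student_joints):
        raise ValueError("joint lists must correspond one-to-one")
    total = sum(math.dist(t, s) for t, s in zip(teacher_joints, student_joints))
    return total / len(teacher_joints)

def haptic_intensity(error_m, tolerance_m=0.02, max_error_m=0.10):
    """0.0 while within tolerance, ramping linearly to 1.0 at max_error_m."""
    if error_m <= tolerance_m:
        return 0.0
    return min(1.0, (error_m - tolerance_m) / (max_error_m - tolerance_m))
```

A student perfectly overlaid on the teacher feels nothing; as the mean joint error grows past the tolerance, the buzz (or, per the joke above, the shock) ramps up until the error hits the maximum.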

Conclusion

The group enjoyed our time exploring the enormous possibilities of both AR and VR as they apply to music and sound. With AR, the ability to add contextually relevant information as well as instruction seems very powerful. With VR, the ability to work in spaces that would be impractical otherwise could radically improve productivity and creativity. Finally, the ability to work and play in completely new spaces is mind-blowing and will take time to discover.

Appendix: Notes

Biggest Ideas:
5 year time horizon
Merging past and future
MIDI with attitude - can MIDI provide haptic feedback, or do we need another way?
What are the problems we are trying to solve:
Fixed physical controllers - lousy virtual ones - AR/VR + haptic feedback could solve that
Other things than hands that can be used
Good for creation tools - grab tools and arrange them, connect them, limitless patch cords
Everyone has AR/VR for performance - but not necessarily so
It's already hard for musicians to get past looking like they're reading e-mail on stage
Gestures may be resting on legs for endurance - dome for recognition
Visual, aural fatigue
Solves space problem
Depeche Mode show underwhelmed
Skin changes when instrument changes
Holodeck was a shared experience, projection system instead of AR/VR
Initially experience may be superior on conventional display technologies
Audience can plug into an app remotely
May use a known object in AR
Tangible interfaces - phicons (physical icons)
Vdrums - works with AR
Looks like keyboard, feels like keyboard - keep that element - augment what’s not working
Theremin is hard to play, it’s gestural and doesn’t work well
Quantizing gestures - Rock Bandification
Effort vs. reward curve over time distinguishes instruments from toys - mad player
What does VR/AR give us that flat screens and physical controllers can’t offer
Touch is part of the creative mind - haptics very important
Two classes - entirely virtualized or partly virtualized
Mike Alger
Full body haptic suit, synthesized haptic sub freqs?
Other industries going to solve this for us? Medical, gaming, industrial & military
Group use cases - couch - more than one headset
Using space around us with tangible interfaces
What does 3D buy us - mimics the real world
Mixing a movie while viewing it immersively and doing sound design in 3D. Attach to objects in movie. Existing 2D movie techniques don’t work well in 3D - audio and video must match - no contradictions
Could help conference calls, and facetime, telepresence
Report problem statement: Break it down into various micro interactions (knob, button, slider) better, worse, challenges etc?
How can we use the advantages of 3D AR/VR?
Can you approximate a real world experience of equipment interaction in AR/VR?
Young people are buying MI hardware again
MMA streams in VR today
Immersive experience vs. utilitarian improvement of interfaces vs. how could it take us beyond current paradigms
Signal flow could be in 3D vs. planar today
Overlaying context on existing controls, or contextual information
3D lemur-style physical effect control
No consumption - group agreed limitation
i) Sound/music design music, limitless patch cord
ii) Uber controller, mixing deck, effects
Virtualized displays attached to physical object
Add virtual controls to physical objects that turn / switch / slide
3-axis controls [X Y Z], light harps, hand through control
Navigation of controls via “zoomable” access
3-D sound design in AR - is that tiring? - spherical authoring
Gaze detection
3D environments for control
waveform playdough
Gestural macros
Better visualization
Multiple engineers can share the same virtual space on a shared resource
If everyone shares the same experience that helps collaboration

Appendix: Backup Workgroup Names

Apportunities Knock
Wrestling is Real
Reality Bytes
Virtual Gig
Mr. Unsatisfied
It’s never good enough
Place title here
Player to be named later
Pigs & squirrels
Virtual Unicorn poop
We are all a simulation
The Matrix is here
The Matrix strikes back
Son of the matrix
Unicorns are real
Augmented Unicorns
Augmenticorns
Unicorns are augmented horses
#Unicorns!
Finally, #Unicorns!
Mirage
Virtual Unicorn Poop
#Augmenticorns!
Muse Matrix