The Eighteenth Annual Interactive Audio Conference
PROJECT BAR-B-Q 2013
BBQ Group Report: Enabling More Profound Human Expression with Modern Musical Instruments
   
Participants (a.k.a. "Electric Cicadas"):

Aaron Higgins, Native Instruments
Alan Kraemer, Taida
Alex Westner, iZotope
Andrew Rumelt, Cirrus Logic
Dan Bogard, IDT
Fabien Noel, Ubisoft
Jeremiha Douglas, Dolby
Jordan Rudess, Wizdom Music
Jory Prum, studio.jory.org
Randy Granovetter, Taida
Facilitator: David Battino, Batmosphere  
 

Problem statement

Technology has advanced considerably since the Renaissance, yet many instruments, such as the piano and the guitar, are still based on centuries-old designs. We can use new technologies to make musical instruments that are more expressive, more accessible, and better sounding. We can make musical instruments that provide a more efficient and effective conduit from brain to audience – more organic and connected with the human experience than ever before. And while traditional electronic instruments do not radiate in space the way acoustic instruments do, modern technologies can improve spatialization.

The Anatomy of a Musical Instrument

We defined the following components/elements that describe how instruments work – from the artistic concept in the musician’s brain to the audience’s experience:

1) Control

  1. The physical interface to the musician
  2. Expression: what musicians want to be able to convey through their instrument.
  3. A fat, fast pipe for transmitting the musician’s emotions through the instrument, so that what you hear in your head is what you play out (see the sketch after this list).
  4. Visuals & feedback: we need to provide relevant feedback to the musician.
    1. Colors are vital to get right – all black is hard to see on a dark stage.
    2. Rich displays, with haptic feedback?
    3. Holograms?
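To make the "fat, fast pipe" concrete, here is a minimal Python sketch that assembles a 14-bit high-resolution control value from a standard MIDI MSB/LSB controller pair and echoes it back for visual feedback. The default ports and the echo scheme are assumptions for illustration; the CC n / CC n+32 pairing is standard MIDI practice.

```python
# Minimal sketch: read a 14-bit MIDI control pair and echo feedback.
# Uses the mido library; default ports and the echo scheme are
# illustrative assumptions.
import mido

MSB, LSB = 1, 33   # mod wheel coarse/fine pair (CC 1 / CC 33)
msb_val = 0

with mido.open_input() as inport, mido.open_output() as outport:
    for msg in inport:
        if msg.type != 'control_change':
            continue
        if msg.control == MSB:
            msb_val = msg.value
        elif msg.control == LSB:
            value14 = (msb_val << 7) | msg.value   # 0..16383
            # ...drive the sound engine with the fine-grained value...
            # Echo the coarse value back so the controller can light
            # an LED ring or update a display (assumed capability).
            outport.send(mido.Message('control_change',
                                      control=MSB, value=value14 >> 7))
```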

2) Connectivity

  1. Left brain / right brain problem: musicians do not necessarily want to be technicians, and the current state of connectivity (generic MIDI controllers -> generic DAW -> myriad virtual instruments) kills creativity and limits or stifles expression.
  2. If an instrument must be modular (separate controller + sound generator), know that the goal for the musician is for the instrument to feel complete, where the sound generator feels directly coupled and perfectly matched with the controller.
  3. Connections between controllers and sound generators should be bidirectional and discoverable (see the handshake sketch after this list).
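The handshake below is a hypothetical sketch of such a discoverable, bidirectional connection: the controller announces its expressive controls, and the sound generator replies with a parameter mapping so the pair behaves like one integrated instrument. The message schema is invented for illustration; MIDI-CI (Capability Inquiry) in MIDI 2.0 standardizes a real mechanism in this spirit.

```python
# Minimal sketch of a discovery handshake. The JSON schema is invented
# for illustration; MIDI-CI (Capability Inquiry) standardizes a real
# mechanism along these lines.
import json

def controller_hello() -> str:
    """Controller announces who it is and what it can express."""
    return json.dumps({
        "type": "hello",
        "device": "ExampleController",       # hypothetical device
        "controls": [
            {"id": "pressure", "bits": 14, "per_note": True},
            {"id": "tilt",     "bits": 14, "per_note": True},
        ],
    })

def generator_reply(hello_json: str) -> str:
    """Sound generator maps each discovered control to a parameter."""
    hello = json.loads(hello_json)
    mapping = {c["id"]: f"synth.{c['id']}_target"   # hypothetical targets
               for c in hello["controls"]}
    return json.dumps({"type": "welcome", "mapping": mapping})

print(generator_reply(controller_hello()))
```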

3) Interpretation layer

  1. What to do with control and note inputs before we generate sound.
  2. Some “assistance” may be required or desired, such as pitch quantization, control quantization, and scale & pattern recommendations (see the quantization sketch after this list).
    1. Could even go so far as to guide the musician within a particular style/genre, or even more specifically, to directly emulate a well-known musician within a genre.
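As one example of such assistance, the minimal sketch below snaps incoming (possibly fractional) note numbers to the nearest tone of a chosen scale; the scale table and the five-semitone search window are illustrative assumptions.

```python
# Minimal sketch: snap incoming (possibly fractional) MIDI note numbers
# to the nearest tone of a chosen scale. Scale table and the search
# window are illustrative assumptions.
C_MAJOR = {0, 2, 4, 5, 7, 9, 11}   # pitch classes of C major

def quantize(note: float, scale=C_MAJOR) -> int:
    """Return the nearest scale tone to a raw note number."""
    candidates = [n for n in range(int(note) - 2, int(note) + 3)
                  if n % 12 in scale]
    return min(candidates, key=lambda n: abs(n - note))

assert quantize(61.4) == 62   # a sharp-ish C# gesture lands on D
assert quantize(60.2) == 60   # near-C input stays on C
```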

4) Sound generation

  1. When the sound generator is more tightly integrated with the control layer and the interpretation layer:
    1. There is zero, or near-zero, setup for the musician.
    2. It is easier to achieve the goals of the musician, by means of a more integrated design.
  2. A next-generation synthesizer should be able to create next-generation sounds – what are we doing with all these new expressive controls and interfaces?! (One possibility is sketched after this list.)
    1. We tend to buy and use the same sound palettes created at the dawn of synthesis.
      1. Cover bands and wedding bands still reach first for bread-and-butter sound libraries.
      2. Instrument manufacturers often stick to old standby synthesis technologies (analog modeling, subtractive, etc.) because that’s what sells.
    2. Examples of relevant modern milestones in creating new sounds:
      1. Fender didn’t invent the electric guitar, but it brought the instrument to the masses.
      2. The Hammond organ was a completely new sound in 1935.
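As a sketch of what a next-generation sound engine might do with a continuous expressive control, the example below morphs the brightness of an additive tone as a "pressure" gesture rises: higher pressure lets higher harmonics through. The harmonic count, mapping curve, and the stand-in gesture are all illustrative assumptions.

```python
# Minimal sketch: a continuous "pressure" gesture morphs the brightness
# of an additive tone. Harmonic count, mapping curve, and the stand-in
# linear gesture are illustrative assumptions.
import numpy as np

def render(freq=220.0, sr=48000, seconds=2.0):
    t = np.arange(int(sr * seconds)) / sr
    pressure = np.linspace(0.0, 1.0, t.size)   # stand-in for a real gesture
    out = np.zeros_like(t)
    for k in range(1, 9):                      # 8 harmonics
        # Rising pressure lets successively higher harmonics through,
        # so the tone brightens continuously with the gesture.
        level = (1.0 / k) * np.clip(pressure * 8 - (k - 1), 0.0, 1.0)
        out += level * np.sin(2 * np.pi * k * freq * t)
    return 0.2 * out / np.max(np.abs(out))     # normalized mono buffer

samples = render()   # write to a file or stream to an audio device
```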

5) Sound emission

  1. Traditional electronic instruments do not radiate in space the way acoustic instruments do.
  2. Modern technologies can, at a minimum, improve spatialization.
  3. Beyond a simple amp and loudspeaker, in what ways can a modern electronic instrument emit sound, given a deeper awareness of the musician’s expression and of the digitally generated sound? (A basic spatialization sketch follows this list.)
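One entry-level answer is amplitude panning over a speaker array, so a gesture can move the sound around the room. The sketch below computes equal-power gains for a source angle on an assumed four-speaker ring; real systems would reach for VBAP, Ambisonics, or wave field synthesis.

```python
# Minimal sketch: equal-power panning of a source to the two nearest
# speakers of an assumed four-speaker ring. Layout and sector math are
# illustrative; real systems would use VBAP, Ambisonics, or WFS.
import numpy as np

SPEAKER_ANGLES = np.array([45.0, 135.0, 225.0, 315.0])  # degrees

def gains(angle_deg: float) -> np.ndarray:
    """Per-speaker gains for a source at angle_deg on the ring."""
    n = len(SPEAKER_ANGLES)
    sector = 360.0 / n
    rel = (angle_deg - SPEAKER_ANGLES[0]) % 360.0
    i = int(rel // sector)            # nearest speaker "behind" the source
    frac = (rel % sector) / sector    # position between the pair
    g = np.zeros(n)
    g[i] = np.cos(frac * np.pi / 2)   # equal-power crossfade
    g[(i + 1) % n] = np.sin(frac * np.pi / 2)
    return g

print(gains(90.0))   # midway between speakers 0 and 1 -> ~0.707 each
```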

6) Multiple Personalities

  1. To realize an effectively expressive musical instrument, we must consider that expression varies significantly across individuals.
  2. How much should the instrument respond and adapt to each individual?
  3. Can an individual’s artistic profile be stored in the cloud, so that he/she can essentially “log in” to any instrument and have that instrument recognize and adapt its settings and sounds for that particular musician? (A sketch of such a profile follows this list.)
  4. The instrument must be versatile enough so that individuals can creatively differentiate themselves with it, where a variety of virtuosos may emerge, each with his/her own signature sound.
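The sketch below illustrates what that "log in" moment could look like. The profile schema, field names, and instrument interface are all hypothetical; the point is only that any instrument could fetch the same profile and adapt itself.

```python
# Minimal sketch of a portable "artistic profile". Every field name and
# the Instrument interface are hypothetical assumptions.
import json

PROFILE_JSON = json.dumps({
    "artist": "example_user",
    "velocity_curve": "soft",            # how hard they tend to play
    "pitch_bend_range": 12,              # semitones
    "favorite_patches": ["warm_keys", "glass_pad"],
})

class Instrument:
    """Stand-in for any cloud-aware instrument."""
    def apply_profile(self, profile_json: str) -> None:
        p = json.loads(profile_json)
        # A real instrument would remap its response curves and load
        # the artist's sounds; here we just record the settings.
        self.settings = {
            "velocity_curve": p["velocity_curve"],
            "bend_range": p["pitch_bend_range"],
            "patch": p["favorite_patches"][0],
        }

rig = Instrument()
rig.apply_profile(PROFILE_JSON)   # the "log in" moment
print(rig.settings)
```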

