The Ninth Annual Interactive Music Conference

Group Report: A Whole-system Testing Framework for PC Audio

A.K.A. "Marklar"
Gary Johnson; Thomson
Mike D'Amore; Kensei Consulting
Keith Weiner; DiamondWare
Paul Bundschuh; Waves Audio, LTD
Tony Dal Molin; Audio Precision Inc.
Jeremy Taylor; Outsource Media
Devon Worrell; Intel Corp.
Todd Hager; Dolby Labs Licensing
Dennis Staats; Dolby Labs Licensing
Andy Lambrecht; SigmaTel, Inc.
David Roach; SigmaTel, Inc.
Ethan Schwartz; Analog Devices
Paul Perrault; Analog Devices
Glenn Arentzoff; dts
Martin Puryear; Microsoft
  Facilitator: David Battino; Batmosphere

Problem Statement

Despite astonishingly good specifications for individual components, PC users’ audio experience lags behind traditional consumer audio products. Typical audio problems include fan noise that overpowers the speakers, bass response that is overbearing or negligible, clicks that contaminate the signal when the listener drags a window, and mysterious hums and buzzes that creep in without warning.

Today the PC industry does not have any tests that reflect the actual PC user experience. Current specifications cover only individual components, not the complete acoustic system. Audio quality is limited by the weakest link in the system, which is often a relatively untested acoustic element such as the loudspeakers or microphone. Although computers are increasingly used as entertainment devices for both audio and video, their acoustics are seldom rigorously measured and tested. Furthermore, many of the consumer audio industry’s standard acoustic tests exist for historical reasons and correlate poorly with the subjective audio quality perceived by today’s users.

Today, there is no industry body that establishes and implements even the most basic quality tests that reflect the user’s audio experience. The team desires to identify and reuse existing measurement methods that are logical for this industry, and to develop new standards where no adequate standards exist. In addition, the team wishes to identify who in the industry can own, implement, and enforce such tests to improve the quality of the systems shipping to consumers.

Furthermore, traditional analog interfaces to audio peripherals may in some applications be replaced by various types of digital interfaces. This transition will demand new testing methods.


The Marklar team decided to outline a testing framework to test the whole system. These tests will complement current tests that address only individual components of the complete experience. It is important that the “whole system” tests be end-to-end or “in air” acoustic tests. Ideally these tests should include both objective and subjective measurements. With this outline, PC original equipment manufacturers (OEMs) and original device manufacturers (ODMs) will have a tool that helps them develop better audio implementations that raise the bar for the overall user experience. The OEMs will have a vested interest in developing a complete high quality audio solution and in turn driving the ODMs to design and produce higher quality audio components.
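To make the idea of an objective end-to-end test concrete, here is a minimal sketch of one such measurement: THD+N (total harmonic distortion plus noise) computed from a captured test tone. The report does not specify any particular metric or code; the function name, notch width, and the simulated "capture" below are illustrative assumptions standing in for a real in-air recording through the system's speakers and a measurement microphone.

```python
import numpy as np

def thd_n(signal, fs, fundamental_hz, notch_width_hz=20.0):
    """Estimate THD+N as the ratio of energy outside the fundamental
    to total signal energy, using a simple FFT notch."""
    spectrum = np.fft.rfft(signal * np.hanning(len(signal)))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    power = np.abs(spectrum) ** 2
    # Notch out a narrow band around the fundamental; everything
    # left over is distortion products plus noise.
    notch = np.abs(freqs - fundamental_hz) < notch_width_hz
    return np.sqrt(power[~notch].sum() / power.sum())

# Simulate a 1 kHz test tone "captured" through a slightly
# non-linear, noisy acoustic path (stand-in for a real mic capture).
fs = 48000
t = np.arange(fs) / fs
captured = np.sin(2 * np.pi * 1000 * t)
captured += 0.01 * np.sin(2 * np.pi * 2000 * t)   # 1% second harmonic
captured += 0.001 * np.random.default_rng(0).standard_normal(fs)

print(f"THD+N: {20 * np.log10(thd_n(captured, fs, 1000)):.1f} dB")
```

A real whole-system test would replace the synthesized `captured` array with an actual recording, and standards bodies define THD+N measurement bandwidths and weighting more rigorously than this notch-filter sketch does.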

The team decided that defining good/better/best criteria could serve as a marketing tool. A categorization (such as Bronze, Silver, and Gold) could identify increasing degrees of end-to-end audio performance to the consumer. For configurable systems, the OEM could use the rating system to identify point-of-sale options such that choosing, say, microphone A results in a Silver system and choosing microphone B results in a Gold system. An important goal is to increase up-sell revenues for the PC OEM and average selling prices (ASPs) for the component manufacturer.
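The tiering idea above reduces to a simple rule: a configuration earns the highest tier whose limits every measured metric satisfies. A sketch with entirely hypothetical metrics and thresholds (the report defines no actual numbers):

```python
# Hypothetical tier limits (lower is better for every metric); the
# actual numbers would come from the industry group's defined tests.
TIERS = [
    ("Gold",   {"thd_n_pct": 0.5, "noise_floor_dba": 25, "freq_dev_db": 3}),
    ("Silver", {"thd_n_pct": 1.0, "noise_floor_dba": 32, "freq_dev_db": 6}),
    ("Bronze", {"thd_n_pct": 3.0, "noise_floor_dba": 40, "freq_dev_db": 10}),
]

def rate_system(measurements):
    """Return the best tier for which every measured metric is within limits."""
    for name, limits in TIERS:
        if all(measurements[k] <= v for k, v in limits.items()):
            return name
    return "Unrated"

# Example: swapping microphone A for microphone B changes the tier.
config_a = {"thd_n_pct": 0.8, "noise_floor_dba": 30, "freq_dev_db": 5}
config_b = {"thd_n_pct": 0.4, "noise_floor_dba": 24, "freq_dev_db": 2}
print(rate_system(config_a), rate_system(config_b))  # Silver Gold
```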

A variety of testing scenarios are needed to match the variety of ways people use audio on computers. The table below summarizes the applications and the general testing methodologies that are required.

Table footnotes:
1. Create guidelines for microphone placement (e.g., facing the user, not on the hard disk drive; an input-level wizard).
2. Provide for additional testing of shipped applications (e.g., dts, DD, WMA, MP3).
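The input-level wizard mentioned in footnote 1 could be as simple as checking the peak and RMS level of a short capture and advising the user to adjust gain. A minimal sketch, with illustrative thresholds not drawn from the report:

```python
import numpy as np

def level_advice(samples, clip_level=0.99, low_rms_db=-30.0):
    """Classify a normalized capture (-1..1) as clipping, too quiet, or OK."""
    peak = np.max(np.abs(samples))
    rms_db = 20 * np.log10(np.sqrt(np.mean(np.square(samples))) + 1e-12)
    if peak >= clip_level:
        return "Reduce input gain (clipping)"
    if rms_db < low_rms_db:
        return "Increase input gain (too quiet)"
    return "Level OK"

t = np.linspace(0, 1, 48000)
quiet = 0.01 * np.sin(2 * np.pi * 440 * t)   # about -43 dB RMS
print(level_advice(quiet))  # Increase input gain (too quiet)
```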

Action Items

  • Form an industry group (David Roach — Lead; Devon, Paul B., Ethan, Tony, Keith, Todd, and Martin or another Microsoft person) that will:
    • Identify existing industry tests that can be used to evaluate whole-system performance
    • Identify new tests where existing tests are either insufficient or non-existent
    • Quantify the test results that are required to meet the various marketing levels
  • Identify and work with industry organizations that have the power to promote and certify products based on the defined tests
