The Ninth Annual Interactive Music Conference
PROJECT BAR-B-Q 2004
Group Report: A Whole-system Testing Framework for PC Audio
Participants (A.K.A. "Marklar"):
Gary Johnson; Thomson
Mike D'Amore; Kensei Consulting
Keith Weiner; DiamondWare
Paul Bundschuh; Waves Audio, LTD
Tony Dal Molin; Audio Precision Inc.
Jeremy Taylor; Outsource Media
Devon Worrell; Intel Corp.
Todd Hager; Dolby Labs Licensing
Dennis Staats; Dolby Labs Licensing
Andy Lambrecht; SigmaTel, Inc.
David Roach; SigmaTel, Inc.
Ethan Schwartz; Analog Devices
Paul Perrault; Analog Devices
Glenn Arentzoff; dts
Martin Puryear; Microsoft

Facilitator: David Battino; Batmosphere
Problem Statement

Despite astonishingly good specifications for individual components, PC users’ audio experience lags behind traditional consumer audio products. Typical problems include fan noise that overpowers the speakers, bass response that is overbearing or negligible, clicks that contaminate the signal when the listener drags a window, and mysterious hums and buzzes that creep in without warning. Today the PC industry does not have any tests that reflect the actual PC
user experience. Current specifications are only for individual components,
not for the complete acoustic system. The audio quality is limited by
the weakest link of the system, which is often relatively untested acoustic
elements such as loudspeakers and microphones. Although computers are
increasingly used as entertainment devices for both audio and video, the
audio acoustics are seldom rigorously measured and tested. Furthermore,
many of the consumer audio industry’s standard acoustic tests are rooted
in historical practice and correlate poorly with the subjective audio
quality that today’s users actually perceive. In addition, traditional analog interfaces to audio peripherals
may in some applications be replaced by various types of digital
interfaces. This transition will demand new testing methods.

Solutions

The Marklar team decided to outline a framework for testing the whole system. These tests will complement current tests that address only individual components of the complete experience. It is important that the “whole system” tests be end-to-end, or “in air,” acoustic tests. Ideally these tests should include both objective and subjective measurements.

With this outline, PC original equipment manufacturers (OEMs) and original device manufacturers (ODMs) will have a tool that helps them develop better audio implementations and raise the bar for the overall user experience. The OEMs will have a vested interest in developing a complete, high-quality audio solution, in turn driving the ODMs to design and produce higher-quality audio components.

The team decided that good/better/best criteria could also serve as a marketing tool. A categorization (such as Bronze, Silver, and Gold) could identify increasing degrees of end-to-end audio performance to the consumer. For configurable systems, the OEM could use the rating system to identify point-of-sale options such that choosing, say, microphone A results in a Silver system and choosing microphone B results in a Gold system. An important goal is to increase up-sell revenue for the PC OEM and average selling prices (ASPs) for the component manufacturer.

A variety of testing scenarios is needed to match the variety of ways people use audio on computers. The table below summarizes the applications and the general testing methodologies that are required.
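As a rough sketch of how a Bronze/Silver/Gold categorization might be applied programmatically, the snippet below maps measured whole-system metrics to a tier. The metric names (SNR, THD+N, frequency-response deviation) and all threshold values are invented for illustration only; a real framework would derive both the metrics and the limits from the in-air test procedures this report proposes.

```python
# Hypothetical sketch: classify a system's end-to-end acoustic
# measurements into Bronze/Silver/Gold tiers. All thresholds are
# placeholders, not proposed specification values.

# Minimum requirements per tier, checked from best to worst:
# (minimum SNR in dB, maximum THD+N in %, maximum frequency-response
# deviation in +/- dB across the rated band).
TIERS = [
    ("Gold",   {"snr_db": 90, "thd_n_pct": 0.05, "fr_dev_db": 3}),
    ("Silver", {"snr_db": 80, "thd_n_pct": 0.5,  "fr_dev_db": 6}),
    ("Bronze", {"snr_db": 70, "thd_n_pct": 1.0,  "fr_dev_db": 10}),
]

def rate_system(snr_db, thd_n_pct, fr_dev_db):
    """Return the highest tier whose limits the measured system meets."""
    for name, req in TIERS:
        if (snr_db >= req["snr_db"]
                and thd_n_pct <= req["thd_n_pct"]
                and fr_dev_db <= req["fr_dev_db"]):
            return name
    return "Unrated"

print(rate_system(snr_db=85, thd_n_pct=0.2, fr_dev_db=4))  # Silver
```

In this scheme a system is rated by its weakest link: failing any one limit drops it to the next tier, which mirrors the report's observation that overall audio quality is limited by the weakest component in the chain.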
Action Items
Copyright 2000-2014, Fat Labs, Inc., ALL RIGHTS RESERVED