The Nineteenth Annual Interactive Audio Conference
PROJECT BAR-B-Q 2014
BBQ Group Report: What does an Open DSP environment look like?
   
Participants: A.K.A. "Talking Squids"

Doug Gabel, Intel
David Berol, Audience
Howard Brown, Independent
Gerard Andrews, Cadence
Moshe Sheier, CEVA
Matthew Cowan, Audience

Facilitator: Linda Law
 

Brief Problem Statement

This workgroup discussed and documented some of the key elements of an Open DSP (Digital Signal Processor) solution for audio. This document outlines the primary elements the workgroup felt were key to a successful Open Audio DSP architecture; it could be viewed as a preliminary Marketing Requirements Document for an Open DSP product. The problem we are solving is that it is difficult or impossible for parties external to specific products to develop and release IP that can take advantage of integrated or external audio DSPs.

Market Justification

What is the justification and rationale for an Open DSP environment?

  1. Audio IP developers can’t currently access digital signal processing capabilities on the various platforms.  Open DSP enables this access.
  2. Enables audio IP developers to improve latency and power by taking advantage of the audio DSP when available.
  3. Abstracts DSP vendors from developers; DSP suppliers/products don’t need to know what IP developers are doing and vice-versa.
  4. Spurs innovation and differentiation within the audio eco-system.
  5. Encourages investment in the development community.
  6. Improves time to market and improves the ability to scale.

Who are the open DSP developers?
  1. Larger OEMs
  2. Research departments in established companies
  3. Pro-audio community
  4. Analog codec manufacturers
  5. University Researchers
  6. Start-ups
  7. ISVs (Independent Software Vendors)
  8. Transducer vendors
  9. System integrators
Where do different algorithms belong?

Refer to last year’s BBQ report, “When is Hardware Offloading preferable now and in the future.”

How do developers handle more than one DSP architecture?

When developers code their algorithms, how do they comprehend differences between DSP power, latency, precision, data types, memory, MIPS, etc.?

Difficulties & Opportunities: We need a common development language that abstracts the platform-level implementation details. These abstractions would be used when DSP IP is integrated, addressing elements like memory/MIPS exhaustion, latencies, data types, and coexistence with other DSP modules. The team recommends considering these elements when the debug, simulation, and profiling tools are engineered, along with the integration tools for the final run-time environment.

Cross platform DSP Meta-language

The goal for a cross-platform meta-language is to write DSP-based SW and IP once and deploy it across many different DSP products/types. This would require a skilled team of computer scientists who know DSPs to either adapt an existing meta-language or create a new language suited to this task.

The team researched a couple of meta-language options:

  • OSP (Open Sensor Platform) aims to provide a generic, reusable framework to marshal sensor data between the sensor drivers, sensor processing on a sensor hub, sensor processing on the AP, and the interface to the Android framework. One could extrapolate that it could be extended to move audio data along these paths as well, but audio has much different data-flow characteristics. Also, since audio transport is one of the key facilities a cell phone is built around, it has been well understood and well abstracted for quite a while.
  • Google does something similar in its Portable Native Client (PNaCl) for Chrome.
  • An additional “meta-language” approach regarding SIMD was also noted.

This team recommends that a future work group or team solve this problem / define this language.

This team also thinks there are a few companies that would be motivated to help develop and deploy this new language, such as Google, Intel, Microsoft, DARPA, and defense contractors.

Needs more definition:

  • How do we handle non-direct audio functions that could be available in the audio DSP, such as sensor processing? Could we also abstract elements like presence detection, HRTF, movement, etc.?

SDK Requirements

What are the requirements for an Open DSP product?

    1. Hardware platforms for development
    2. Tech support system
    3. Implementation examples
    4. Meta-language
    5. Developer friendly environment
    6. Legal framework
    7. Robust execution framework (e.g. plug-ins)
    8. Protection and authentication
    9. SDK
      1. Compilers/assemblers/linkers
      2. Debuggers
      3. Profilers
      4. Documentation
      5. Resource management
      6. Integration methodology
      7. Library code
      8. Simulators
      9. Parameter management infrastructure – optional
Hardware Platform for Development

A set of HW development boards needs to be made available for each of the major development targets, along with appropriate support material for the HW. Other requirements include:

  • Cost effective: needs to be cheap enough for the developer community to easily afford, but expensive enough to weed out non-serious participants.
  • HW Interfaces:
    1. A debug interface should be supported, with sufficient bandwidth for streaming audio
    2. Digital and analog audio interfaces
    3. Should have expansion slots for the addition of other devices, using either a universal standard connector or connectors/interfaces standardized by the designated development team.
    4. BT capability is preferred, although might be difficult for some devices.
  • Sales Channel: Know where the HW platform is sold or distributed. This will be defined by the audio DSP implementation owners (e.g., if Intel releases development boards, you need to go to Intel for access).
Robust Execution Framework

How do you get two pieces of IP to play nice with each other?

    1. The framework’s purpose is to isolate signal processing elements from the underlying HW and to connect signal processing elements to each other in a standard way.
    2. In addition to supporting the standard signal processing modules or sockets, the environment also needs to supply:
      1. Memory management
      2. Process management
      3. Inter process communication
      4. I/O of audio streams
      5. Parameter interface

The framework should include robust methods to load/insert processing modules (under development) into the audio chain. Compile, link, load, and debug should be an intuitive process.

  • Target is an open source processing execution framework – Possibly adapt OSP.
  • Recommend a plug-in type environment to make the integration transparent to the application/IP developers.
Developer Friendly Environment

How do we create a friendly and easy way for developers to be productive with this SDK and development environment? We should have as many of the following as possible to facilitate a “developer friendly” environment.

  1. A GUI to utilize common signal processing elements
  2. A GUI for configuring the tools
  3. Modern code writing and support editor
  4. Clean SDK installer
  5. Reduced legal burdens in overall SDK product.
  6. Audio specific debugging tools
  7. Real-time tuning and debugging
  8. Robust, real-time graphical profiling
  9. Interface to automated testing and code management tools
  10. HW platform Aware – knows specific relevant and required information about the target implementation platform.
  11. Version independence between HW and SW. The SW development tools are backward compatible.
  12. Interface to MATLAB / Simulink
No time to cover

Other items that are important to mention, but which the team ran out of time to discuss. These can be follow-on discussions or part of a larger effort that continues after this event.

  • What differentiates Open DSP environments from each other?
  • Can there be multiple Open DSP environments coexisting?
