Audio Data API - MozillaWiki


7/31/2019 Audio Data API - MozillaWiki


    Audio Data API


    From MozillaWiki

    Contents

https://wiki.mozilla.org/Audio_Data_API (retrieved 10/4/2012)


    1 Defining an Enhanced API for Audio (Draft Recommendation)

    1.1 Abstract

    1.2 Authors

    1.3 Other Contributors

    2 Standardization Note

    3 API Tutorial

    3.1 Reading Audio

    3.2 Complete Example: Visualizing Audio Spectrum

    3.3 Writing Audio

    3.4 Complete Example: Creating a Web Based Tone Generator

    4 DOM Implementation

    4.1 nsIDOMNotifyAudioAvailableEvent

    4.2 nsIDOMHTMLMediaElement additions

    4.3 nsIDOMHTMLAudioElement additions

    4.4 Security

    4.5 Compatibility with Audio Backends

    5 Additional Resources

5.1 Bug

5.2 Obtaining Builds

    5.3 JavaScript Audio Libraries

    5.4 Working Audio Data Demos

    5.5 Demos Needing to be Updated to New API

    5.6 Third Party Discussions

    Defining an Enhanced API for Audio (Draft Recommendation)

    Abstract

The HTML5 specification introduces the audio and video media elements, and with them the opportunity to dramatically change the way we integrate media on the web. The current HTML5 media API provides ways to play and get limited information about audio and video, but gives no way to programmatically access or create such media. We present a new Mozilla extension to this API, which allows web developers to read and write raw audio data.

    Authors

    David Humphrey (@humphd)

Corban Brook (@corban)

Al MacDonald (@F1LT3R)

    Yury Delendik

    Ricard Marxer (@ricardmp)

    Charles Cliffe (@ccliffe)

    Other Contributors

    Thomas Saunders

    Ted Mielczarek


    Standardization Note

    Please note that this document describes a non-standard experimental API. This API is considered deprecated

    and may not be supported in future releases. The World Wide Web Consortium (W3C) has chartered the Audio

    Working Group to develop standardized audio API specifications, including Web Audio API. Please refer to the

    Audio Working Group website for further details.

    API Tutorial

This API extends the HTMLMediaElement and HTMLAudioElement (e.g., affecting audio and video elements), and implements the following basic API for reading and writing raw audio data:

    Reading Audio

Audio data is made available via an event-based API. As the audio is played, and therefore decoded, sample data is passed to content scripts in a framebuffer for processing after becoming available to the audio layer, hence the name, MozAudioAvailable. These samples may or may not have been played yet at the time of the event. The audio samples returned in the event are raw, and have not been adjusted for mute/volume settings on the media element. Playing, pausing, and seeking the audio also affect the streaming of this raw audio data.
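Because the samples arrive raw, a script that wants volume-adjusted data can apply the element's volume itself. A minimal sketch (the helper name is illustrative, not part of the API):

```javascript
// Scale a framebuffer of raw samples by a volume factor in [0, 1].
// This mirrors the adjustment the media element applies before output.
function applyVolume(samples, volume) {
  var out = new Float32Array(samples.length);
  for (var i = 0; i < samples.length; i++) {
    out[i] = samples[i] * volume;
  }
  return out;
}
```

Inside a MozAudioAvailable handler, applyVolume(event.frameBuffer, audio.volume) then yields samples matching what the user actually hears.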

Users of this API can register two callbacks on the audio or video element in order to consume this data:

    var audio = document.getElementById("audio");

    audio.addEventListener('MozAudioAvailable', audioAvailableFunction, false);

    audio.addEventListener('loadedmetadata', loadedMetadataFunction, false);

The loadedmetadata event is a standard part of HTML5. It now indicates that a media element (audio or video) has useful metadata loaded, which can be accessed using three new attributes:

    mozChannels

    mozSampleRate

    mozFrameBufferLength

    Prior to the loadedmetadata event, accessing these attributes will cause an exception to be thrown, indicating

    that they are not known, or there is no audio. These attributes indicate the number of channels, audio sample

    rate per second, and the default size of the framebuffer that will be used in MozAudioAvailable events. This

    event is fired once as the media resource is first loaded, and is useful for interpreting or writing the audio data.

    The MozAudioAvailable event provides two pieces of data. The first is a framebuffer (i.e., an array) containing

    decoded audio sample data (i.e., floats). The second is the time for these samples measured from the start in

    seconds. Web developers consume this event by registering an event listener in script like so:

    var audio = document.getElementById("audio");

    audio.addEventListener('MozAudioAvailable', someFunction, false);


    An audio or video element can also be created with script outside the DOM:

    var audio = new Audio();

    audio.src = "song.ogg";

    audio.addEventListener('MozAudioAvailable', someFunction, false);

    audio.play();

    The following is an example of how both events might be used:

var channels,
    rate,
    frameBufferLength,
    samples;

function audioInfo() {
  var audio = document.getElementById('audio');

  // After the loadedmetadata event, the following media element attributes are known:
  channels          = audio.mozChannels;
  rate              = audio.mozSampleRate;
  frameBufferLength = audio.mozFrameBufferLength;
}

function audioAvailable(event) {
  var samples = event.frameBuffer;
  var time    = event.time;

  for (var i = 0; i < frameBufferLength; i++) {
    // Do something with the audio data as it is played.
    processSample(samples[i], channels, rate);
  }
}

    Complete Example: Visualizing Audio Spectrum

    This example calculates and displays FFT spectrum data for the playing audio:

    JavaScript Spectrum Example


var canvas = document.getElementById('fft'),
    ctx = canvas.getContext('2d'),
    channels,
    rate,
    frameBufferLength,
    fft;

function loadedMetadata() {
  channels          = audio.mozChannels;
  rate              = audio.mozSampleRate;
  frameBufferLength = audio.mozFrameBufferLength;

  fft = new FFT(frameBufferLength / channels, rate);
}

function audioAvailable(event) {
  var fb = event.frameBuffer,
      t  = event.time, /* unused, but it's there */
      signal = new Float32Array(fb.length / channels),
      magnitude;

  for (var i = 0, fbl = frameBufferLength / 2; i < fbl; i++) {
    // Assuming interlaced stereo channels,
    // need to split and merge into a stereo-mix mono signal
    signal[i] = (fb[2*i] + fb[2*i+1]) / 2;
  }

  fft.forward(signal);

  // Clear the canvas before drawing spectrum
  ctx.clearRect(0, 0, canvas.width, canvas.height);

  for (var i = 0; i < fft.spectrum.length; i++) {
    // multiply spectrum by a zoom value
    magnitude = fft.spectrum[i] * 4000;

    // Draw rectangle bars for each frequency bin
    ctx.fillRect(i * 4, canvas.height, 3, -magnitude);
  }
}

    var audio = document.getElementById('audio-element');

    audio.addEventListener('MozAudioAvailable', audioAvailable, false);

    audio.addEventListener('loadedmetadata', loadedMetadata, false);

// FFT from dsp.js, see below
var FFT = function(bufferSize, sampleRate) {
  this.bufferSize   = bufferSize;
  this.sampleRate   = sampleRate;
  this.spectrum     = new Float32Array(bufferSize / 2);
  this.real         = new Float32Array(bufferSize);
  this.imag         = new Float32Array(bufferSize);
  this.reverseTable = new Uint32Array(bufferSize);
  this.sinTable     = new Float32Array(bufferSize);
  this.cosTable     = new Float32Array(bufferSize);

  var limit = 1,
      bit = bufferSize >> 1;

  while (limit < bufferSize) {
    for (var i = 0; i < limit; i++) {
      this.reverseTable[i + limit] = this.reverseTable[i] + bit;
    }

    limit = limit << 1;
    bit = bit >> 1;
  }

  for (var i = 0; i < bufferSize; i++) {
    this.sinTable[i] = Math.sin(-Math.PI / i);
    this.cosTable[i] = Math.cos(-Math.PI / i);
  }
};

    FFT.prototype.forward = function(buffer) {


    var bufferSize = this.bufferSize,

    cosTable = this.cosTable,

    sinTable = this.sinTable,

    reverseTable = this.reverseTable,

    real = this.real,

    imag = this.imag,

    spectrum = this.spectrum;

  if (bufferSize !== buffer.length) {
    throw "Supplied buffer is not the same size as defined FFT. FFT Size: " +
          bufferSize + " Buffer Size: " + buffer.length;
  }

  for (var i = 0; i < bufferSize; i++) {
    real[i] = buffer[reverseTable[i]];
    imag[i] = 0;
  }

  var halfSize = 1,
      phaseShiftStepReal,
      phaseShiftStepImag,
      currentPhaseShiftReal,
      currentPhaseShiftImag,
      off,
      tr,
      ti,
      tmpReal,
      i;

  while (halfSize < bufferSize) {
    phaseShiftStepReal = cosTable[halfSize];
    phaseShiftStepImag = sinTable[halfSize];
    currentPhaseShiftReal = 1.0;
    currentPhaseShiftImag = 0.0;

    for (var fftStep = 0; fftStep < halfSize; fftStep++) {
      i = fftStep;

      while (i < bufferSize) {
        off = i + halfSize;
        tr = (currentPhaseShiftReal * real[off]) - (currentPhaseShiftImag * imag[off]);
        ti = (currentPhaseShiftReal * imag[off]) + (currentPhaseShiftImag * real[off]);

        real[off] = real[i] - tr;
        imag[off] = imag[i] - ti;
        real[i] += tr;
        imag[i] += ti;

        i += halfSize << 1;
      }

      tmpReal = currentPhaseShiftReal;
      currentPhaseShiftReal = (tmpReal * phaseShiftStepReal) - (currentPhaseShiftImag * phaseShiftStepImag);
      currentPhaseShiftImag = (tmpReal * phaseShiftStepImag) + (currentPhaseShiftImag * phaseShiftStepReal);
    }

    halfSize = halfSize << 1;
  }

  for (var i = 0, N = bufferSize / 2; i < N; i++) {
    spectrum[i] = 2 * Math.sqrt(real[i] * real[i] + imag[i] * imag[i]) / bufferSize;
  }
};

Writing Audio

It is also possible to set up an audio element for raw writing from script (i.e., without a src attribute). Content scripts can specify the audio stream's characteristics, then write audio samples, using the following three methods:

mozSetup(channels, sampleRate)

    // Create a new audio element

    var audioOutput = new Audio();

    // Set up audio element with 2 channel, 44.1KHz audio stream.

    audioOutput.mozSetup(2, 44100);

    mozWriteAudio(buffer)

    // Write samples using a JS Array

var samples = [0.242, 0.127, 0.0, -0.058, -0.242, ...];
var numberSamplesWritten = audioOutput.mozWriteAudio(samples);

    // Write samples using a Typed Array

    var samples = new Float32Array([0.242, 0.127, 0.0, -0.058, -0.242, ...]);

    var numberSamplesWritten = audioOutput.mozWriteAudio(samples);

    mozCurrentSampleOffset()

    // Get current audible position of the underlying audio stream, measured in samples.

    var currentSampleOffset = audioOutput.mozCurrentSampleOffset();

Since the MozAudioAvailable event and the mozWriteAudio() method both use Float32Array, it is possible to take the output of one audio stream and pass it directly (or process first and then pass) to a second:

    var a1 = document.getElementById('a1'),

    a2 = new Audio(),

    buffers = [];

    function loadedMetadata() {

    // Mute a1 audio.

    a1.volume = 0;

// Setup a2 to be identical to a1, and play through there.
a2.mozSetup(a1.mozChannels, a1.mozSampleRate);

    }

    function audioAvailable(event) {

    // Write the current framebuffer

    var frameBuffer = event.frameBuffer; // frameBuffer is Float32Array

    writeAudio(frameBuffer);

    }

    a1.addEventListener('MozAudioAvailable', audioAvailable, false);

    a1.addEventListener('loadedmetadata', loadedMetadata, false);

    function writeAudio(audioBuffer) {

    // audioBuffer is Float32Array

    buffers.push({buffer: audioBuffer, position: 0});

// If there's buffered data, write that
while (buffers.length > 0) {

    var buffer = buffers[0].buffer;

    var position = buffers[0].position;

    var written = a2.mozWriteAudio(buffer.subarray(position));

// If all data wasn't written, keep it in the buffers:

    if(position + written < buffer.length) {

    buffers[0].position = position + written;

    break;

    }

    buffers.shift();

    }

    }


Audio data written using the mozWriteAudio() method needs to be written at a regular interval in equal portions, in order to keep a little ahead of the current sample offset (the sample offset that is currently being played by the hardware can be obtained with mozCurrentSampleOffset()), where "a little" means something on the order of 500ms of samples. For example, if working with 2 channels at 44100 samples per second, a writing interval of 100ms, and a pre-buffer equal to 500ms, one would write an array of (2 * 44100 / 10) = 8820 samples per interval, and a total of (currentSampleOffset + 2 * 44100 / 2) samples.
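The arithmetic above can be wrapped in a small helper. This is a sketch under the stated assumptions; the function name is illustrative, not part of the API:

```javascript
// Compute the per-interval write size and pre-buffer size, in samples,
// for a stream with the given channel count and sample rate.
function writeSizes(channels, sampleRate, writeIntervalMs, prebufferMs) {
  return {
    samplesPerWrite:  Math.floor(channels * sampleRate * writeIntervalMs / 1000),
    prebufferSamples: Math.floor(channels * sampleRate * prebufferMs / 1000)
  };
}

// 2 channels at 44100 samples/second, a 100ms interval, 500ms pre-buffer:
var sizes = writeSizes(2, 44100, 100, 500);
// sizes.samplesPerWrite is 8820, matching the example above.
```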

It's also possible to auto-detect the minimal duration of the pre-buffer, such that the sound is played without interruptions, and lag between writing and playback is minimal. To do this, start writing the data in small portions and wait for the value returned by mozCurrentSampleOffset() to be more than 0.

    var prebufferSize = sampleRate * 0.020; // Initial buffer is 20 ms

    var autoLatency = true, started = new Date().valueOf();

    ...

    // Auto latency detection

    if (autoLatency) {

    prebufferSize = Math.floor(sampleRate * (new Date().valueOf() - started) / 1000);

    if (audio.mozCurrentSampleOffset()) { // Play position moved?

    autoLatency = false;

    }

    }

    Complete Example: Creating a Web Based Tone Generator

    This example creates a simple tone generator, and plays the resulting tone.

    JavaScript Audio Write Example


    function AudioDataDestination(sampleRate, readFn) {

    // Initialize the audio output.

    var audio = new Audio();

    audio.mozSetup(1, sampleRate);

    var currentWritePosition = 0;

    var prebufferSize = sampleRate / 2; // buffer 500ms

    var tail = null, tailPosition;

    // The function called with regular interval to populate

    // the audio output buffer.

    setInterval(function() {

    var written;

    // Check if some data was not written in previous attempts.

if (tail) {
written = audio.mozWriteAudio(tail.subarray(tailPosition));

    currentWritePosition += written;

    tailPosition += written;

    if(tailPosition < tail.length) {

    // Not all the data was written, saving the tail...

    return; // ... and exit the function.

    }

    tail = null;

    }

// Check if we need to add some data to the audio output.

    var currentPosition = audio.mozCurrentSampleOffset();

    var available = currentPosition + prebufferSize - currentWritePosition;

    if(available > 0) {


    // Request some sound data from the callback function.

    var soundData = new Float32Array(available);

    readFn(soundData);

// Writing the data.

    written = audio.mozWriteAudio(soundData);

    if(written < soundData.length) {

    // Not all the data was written, saving the tail.

    tail = soundData;

    tailPosition = written;

    }

    currentWritePosition += written;

    }

    }, 100);

    }

    // Control and generate the sound.

    var frequency = 0, currentSoundSample;

    var sampleRate = 44100;

    function requestSoundData(soundData) {

    if (!frequency) {

    return; // no sound selected

    }

    var k = 2* Math.PI * frequency / sampleRate;

for (var i = 0, size = soundData.length; i < size; i++) {
  soundData[i] = Math.sin(k * currentSoundSample++);
}
}

var audioDestination = new AudioDataDestination(sampleRate, requestSoundData);

DOM Implementation

nsIDOMNotifyAudioAvailableEvent

Audio data is made available via the following event:

interface nsIDOMNotifyAudioAvailableEvent : nsIDOMEvent
{
  readonly attribute jsval frameBuffer;
  readonly attribute float time;
};

The frameBuffer attribute contains a typed array (Float32Array) with the raw, decoded audio sample data (32-bit float values), with the samples for all channels interleaved. All audio frames are normalized to a length of channels * 1024 by default, but could be any length between 512 and 16384 if the user has set a different length using the mozFrameBufferLength attribute.

    The time attribute contains a float representing the time in seconds of the first sample in the frameBuffer array

    since the start of the audio track.
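Consecutive time values are therefore spaced by the framebuffer's duration. A sketch of the relationship, assuming the framebuffer length is not changed mid-stream (the function name is illustrative):

```javascript
// Time in seconds of the first sample of the k-th MozAudioAvailable
// event, given a fixed framebuffer length (total across channels).
function frameBufferStartTime(eventIndex, frameBufferLength, channels, sampleRate) {
  var framesPerEvent = frameBufferLength / channels; // samples per channel
  return eventIndex * framesPerEvent / sampleRate;
}

// With 2 channels, a 2048-sample framebuffer, and 44100 Hz, the third
// event (index 2) starts at 2 * 1024 / 44100 seconds, roughly 46ms in.
```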

    nsIDOMHTMLMediaElement additions

Audio metadata is made available via three new attributes on the HTMLMediaElement. By default these attributes throw if accessed before the loadedmetadata event occurs. Users who need this info before the audio starts playing should not use autoplay, since the audio might start before a loadedmetadata handler has run.

    The three new attributes are defined as follows:

    readonly attribute unsigned long mozChannels;

    readonly attribute unsigned long mozSampleRate;

    attribute unsigned long mozFrameBufferLength;

    The mozChannels attribute contains the number of channels in the audio resource (e.g., 2). The

    mozSampleRate attribute contains the number of samples per second that will be played, for example 44100.

    Both are read-only.

    The mozFrameBufferLength attribute indicates the number of samples that will be returned in the framebuffer

    of each MozAudioAvailable event. This number is a total for all channels, and by default is set to be the

    number of channels * 1024 (e.g., 2 channels * 1024 samples = 2048 total).

The mozFrameBufferLength attribute can also be set to a new value, if users want lower latency, or larger amounts of data, etc. The size given must be a number between 512 and 16384. Using any other size will result in an exception being thrown. The best time to set a new length is after the loadedmetadata event fires, when the audio info is known, but before the audio has started or MozAudioAvailable events have begun firing.
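A sketch of the valid range check described above (the helper name is illustrative; the real attribute setter simply throws on out-of-range values):

```javascript
// mozFrameBufferLength accepts only values between 512 and 16384.
function isValidFrameBufferLength(length) {
  return length >= 512 && length <= 16384;
}

// Typical use: after loadedmetadata, request smaller framebuffers
// for lower latency, e.g. audio.mozFrameBufferLength = 1024;
isValidFrameBufferLength(1024); // true
isValidFrameBufferLength(256);  // false: below the 512 minimum
```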

    nsIDOMHTMLAudioElement additions

    The HTMLAudioElement has also been extended to allow write access. Audio writing is achieved by adding

    three new methods:

    void mozSetup(in long channels, in long rate);

    unsigned long mozWriteAudio(array); // array is Array() or Float32Array()

    unsigned long long mozCurrentSampleOffset();

The mozSetup() method allows an audio element to be set up for writing from script. This method must be called before mozWriteAudio() or mozCurrentSampleOffset() can be called, since an audio stream has to be created for the media element. It takes two arguments:

1. channels - the number of audio channels (e.g., 2)

2. rate - the audio's sample rate (e.g., 44100 samples per second)

    The choices made for channel and rate are significant, because they determine the amount of data you must

    pass to mozWriteAudio(). That is, you must pass an array with enough data for each channel specified in

    mozSetup().


    The mozSetup() method, if called more than once, will recreate a new audio stream (destroying an existing one

    if present) with each call. Thus it is safe to call this more than once, but unnecessary.

    The mozWriteAudio() method can be called after mozSetup(). It allows audio data to be written directly from

    script. It takes one argument, array. This is a JS Array (i.e., new Array()) or a typed float array (i.e., new

    Float32Array()) containing the audio data (floats) you wish to write. It must be 0 or N elements in length, where

    N % channels == 0, otherwise an exception is thrown.
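The length rule can be checked before writing. A sketch (the helper name is illustrative, not part of the API):

```javascript
// A buffer passed to mozWriteAudio() must contain a whole number of
// frames: 0 or N samples, where N % channels == 0.
function isValidWriteLength(sampleCount, channels) {
  return sampleCount === 0 || sampleCount % channels === 0;
}

isValidWriteLength(2048, 2); // true: 1024 stereo frames
isValidWriteLength(1023, 2); // false: would split a frame
```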

The mozWriteAudio() method returns the number of samples that were just written, which may or may not be the same as the number in array. Only the number of samples that can be written without blocking the audio hardware will be written.

    hardware will be written. It is the responsibility of the caller to deal with any samples that don't get written in the

    first pass (e.g., buffer and write in the next call).

    The mozCurrentSampleOffset() method can be called after mozSetup(). It returns the current position

    (measured in samples) of the audio stream. This is useful when determining how much data to write with

    mozWriteAudio().

All of mozWriteAudio(), mozCurrentSampleOffset(), and mozSetup() will throw exceptions if called out of order. mozSetup() will also throw if a src attribute has previously been set on the audio element (i.e., you can't do both at the same time).

    Security

Similar to the canvas element and its getImageData method, the MozAudioAvailable event's frameBuffer attribute protects against information leakage between origins.

The MozAudioAvailable event's frameBuffer attribute will throw if the origin of the audio resource does not match the document's origin. NOTE: this will affect users who have the security.fileuri.strict_origin_policy preference set, and are working locally with file:/// URIs.
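A defensive access pattern for the cross-origin case described above (the helper name is illustrative):

```javascript
// Return the framebuffer, or null if access throws (e.g., the media
// origin does not match the document's origin).
function tryGetFrameBuffer(event) {
  try {
    return event.frameBuffer;
  } catch (e) {
    return null;
  }
}
```

A MozAudioAvailable handler can then simply skip processing when tryGetFrameBuffer(event) returns null.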

    Compatibility with Audio Backends

    The current MozAudioAvailable implementation integrates with Mozilla's decoder abstract base classes, and

    therefore, any audio decoder which uses these base classes automatically dispatches MozAudioAvailable events.

    At the time of writing, this includes the Ogg, WebM, and Wave decoders.

    Additional Resources

    A series of blog posts document the evolution and implementation of this API: http://vocamus.net/dave/?cat=25

    Another overview by Al MacDonald is available here.

    Al has also written 2 very good tutorials and video demos of reading and writing audio with the API.

    The BBC Research and Development Blog has also done an excellent overview of the API here.

    Bug

    The work on this API is available in Mozilla bug 490705.


    Obtaining Builds

    Firefox trunk nightlies include the Audio Data API (starting with 2010-08-26 builds).

    JavaScript Audio Libraries

We have started work on a JavaScript library to make building audio web apps easier. Details are here and at https://github.com/corbanbrook/dsp.js.

audionode.js acts as a JavaScript bridge between the Web Audio API and the Audio Data API, allowing us to run the examples at http://weare.buildingsky.net/processing/audionode.js/examples/index.html.

Audio Data API Objects - A high-level abstraction (and a usage example) of the Audio Data API.

    dynamicaudio.js - An interface for writing audio with a Flash fall back for older browsers.

    Beat Detektor by Charles Cliffe, uses dsp.js to add beat detection.

    audiolib.js by Jussi Kalliokoski, a powerful audio tools library for JavaScript, compatible with the Audio

    Data API and Chrome's Audio API.

    Audiolet - A JavaScript library for real-time audio synthesis and composition from within the browser

XAudioJS - A JavaScript library that provides raw audio sample writing access to the Mozilla Audio Data and Web Audio APIs. Provides a basic write and callback system so the developer can be assured of gapless audio for these two APIs. Also provides a fallback WAV PCM data URI generator that is not guaranteed to be gapless.

    Music.js, library containing functions and data sets to generate notes, intervals, chords, and scales

    Javascript .MOD and .XM music player

    Working Audio Data Demos

    NOTE: we recently took down two servers that were hosting many of these demos. We are working to find

    a new home for them.

    A number of working demos have been created, including:

    Overview slideshow demo of various features (video here)

    Audio Visualizations

    http://tllabs.io/audiopaper/ paper.js audio visualization

    http://traction.untergrund.net/slamdown/

    http://www.nihilogic.dk/labs/pocket_full_of_html5/ (Demo by Jacob Seidelin)

    http://weare.buildingsky.net/processing/dsp.js/examples/fft.html

    http://www.storiesinflight.com/jsfft/visualizer/index.html (Demo by Thomas Sturm)

    http://www.grantgalitz.org/sound_test/ WAV Decoder & Visualizer (Pre-loaded)

    http://www.grantgalitz.org/wav_player/ WAV Decoder & Visualizer (Load in your own .wav)

    Applying Realtime Audio Effects

    Volume, pitch, etc. UI for audio - https://developer.mozilla.org/en-US/demos/detail/voron

    (homepage: http://kievII.net)

    JS IIR Filter http://weare.buildingsky.net/processing/dsp.js/examples/filter.html (video here)

Vocodes a formant with a carrier wave http://weare.buildingsky.net/processing/dsp.js/examples/vocoder.html

    Biquad Filter example http://weare.buildingsky.net/processing/dsp.js/examples/biquad.html

    Graphic EQ example http://weare.buildingsky.net/processing/dsp.js/examples/grapheq.html

    Delay effect http://code.almeros.com/code-examples/delay-firefox-audio-api/ (video of older

    version here)


    Reverb effect http://code.almeros.com/code-examples/reverb-firefox-audio-api/ (video here)

    Generating and Playing Audio

    http://bitterspring.net/blog/2012/01/25/morning-star-synth-0-1-released/

    http://onlinetonegenerator.com/

    mp3 decoder in js

    mp2 decoder in js

    Ambient techno machine

Music.js, library containing functions and data sets to generate notes, intervals, chords, and scales

HTML5 Guitar Tab Player

    Tone matrix using Audiolet.js

    Generating music in JS via audiolet.js, breakbeat demo

    sfxr.js - sound effect generator/editor for video games.

    JavaScript Chip Tracker app (demo by Jonathan Brodsky)

    JavaScript Audio Sampler http://weare.buildingsky.net/processing/dsp.js/examples/sampler.html

SamplePlayer, SampleLoader, Sequencer and Keyboard http://code.almeros.com/code-examples/sampler-firefox-audio-api/ (video here)

    Square Wave Generation http://weare.buildingsky.net/processing/dsp.js/examples/squarewave.html

    Random Noise Generation http://weare.buildingsky.net/processing/dsp.js/examples/nowave.html

    JS Multi-Oscillator Synthesizer http://weare.buildingsky.net/processing/dsp.js/examples/synthesizer.html (video here)

    Bloop http://async5.org/audiodata/examples/bloop-ea/bloop-audiodata.html

    JavaScript Text to Speech engine http://async5.org/audiodata/tts/index.html

    Toy Piano http://async5.org/audiodata/examples/piano.html (and the sample-based piano

    http://async5.org/audiodata/examples/piano-s/piano2.html)

    Csound Shaker Instrument http://async5.org/audiodata/csound/shaker.htm and Bar Instrument

    http://async5.org/audiodata/csound/bar.htm

    Canon Theremin Piano http://mtg.upf.edu/static/media/canon-theremin-piano.html (by Zacharias

    Vamvakousis [email protected]).

Manipulate music example using mouse and accelerometer http://blog.dt.in.th/stuff/audiodata/ (Thai Pangsakulyanont)

Tuning exploration, Wicki keyboard and Karplus-Strong synthesizer http://www.toverlamp.org/static/wickisynth/wickisynth.html (Piers Titus van der Torren)

    Modular Synthesizer with MIDI, control and audio ports. http://www.niiden.com/jstmodular/ (Jussi

    Kalliokoski)

    Dual-axis Theremin controlling pitch and volume with cursor position. http://stu.ie/?p=2599 (Stuart

    Gilbert)

    JavaScript "Image to Sound" generator http://zhangjw.bai-hua.org/audio_test6.html (ZhangJW)

    XAudioJS library test page http://www.grantgalitz.org/sound_test/

    Beat Detection (also showing use of WebGL for 3D visualizations)

http://people.mozilla.com/~prouget/demos/boomboom/index.html

http://cubicvr.org/CubicVR.js/bd3/BeatDetektor1HD.html (video here)

    http://cubicvr.org/CubicVR.js/bd3/BeatDetektor2HD.html (video of older version here)

    http://cubicvr.org/CubicVR.js/bd3/BeatDetektor3HD.html (video here)

    http://cubicvr.org/CubicVR.js/bd3/BeatDetektor3HDFX.html (same, but with more effects)

    http://cubicvr.org/CubicVR.js/bd3/BeatDetektor4HD.html (video here)

    http://cubicvr.org/CubicVR.js/bd_fluid_sim/BD_GPUFluid.html

    Writing Audio from JavaScript, Digital Signal Processing

    API Example: Inverted Waveform Cancellation


    API Example: Stereo Splitting and Panning

    API Example: Mid-Side Microphone Decoder

    API Example: Ambient Extraction Mixer

    API Example: Worker Thread Audio Processing

    Audio Games

    http://www.oampo.co.uk/labs/fireflies/

    http://www.oampo.co.uk/labs/siren-song/

    Demos Needing to be Updated to New API

    FFT visualization (calculated with js)

Experimental JavaScript port of Pure Data http://mccormick.cx/dev/webpd/ with demo http://mccormick.cx/dev/webpd/demos/processingjs-basic-example-with-audio/index.html

    http://ondras.zarovi.cz/demos/audio/

    http://code.bocoup.com/processing-js/3d-fft/viz.xhtml

    Visualizing sound using the video element

    http://bocoup.com/core/code/firefox-audio/whale-fft2/whale-fft.html (video here)

    Third Party Discussions

    A number of people have written about our work, including:

    http://ajaxian.com/archives/amazing-audio-sampling-in-javascript-with-firefox

    http://createdigitalmusic.com/2010/05/03/real-sound-synthesis-now-an-open-standard-in-the-browser/

    http://www.webmonkey.com/2010/05/new-html5-tools-make-your-browser-sing-and-dance/

http://www.wired.co.uk/news/archive/2010-05/04/new-html5-tools-give-adobe-flash-the-finger

http://hacks.mozilla.org/2010/04/beyond-html5-experiments-with-interactive-audio/

    http://schepers.cc/?p=212

http://createdigitalmusic.com/2010/05/27/browser-madness-3d-music-mountainscapes-web-based-pd-patching/

http://news.slashdot.org/story/10/05/26/1936224/Breakthroughs-In-HTML-Audio-Via-Manipulation-With-JavaScript

    http://ajaxian.com/archives/amazing-audio-api-javascript-demos

    http://www.webmonkey.com/2010/08/sampleplayer-makes-your-browser-sing-sans-flash/

    Retrieved from "https://wiki.mozilla.org/Audio_Data_API"

    This page was last modified on 21 August 2012, at 16:25. This page has been accessed 176,987 times.

    About MozillaWiki

    Disclaimers

    Privacy Policy
