A WAM consists of Controller and Processor parts as shown in Fig. 1. The Controller exposes a JavaScript API, interfaces with other Web APIs, and optionally provides the GUI. The Processor implements signal processing algorithms in JavaScript or cross-compiled C/C++. As dictated by the W3C Web Audio API spec (R-01), the parts shall reside in separate threads and communicate through the datachannel using asynchronous events (R-02). In most cases, the events flow in one direction (from the Controller to the Processor), and request-reply communication is only needed during the initialization phase. The events are parsed and translated into method invocations at the Processor end in the wrapper API, which is exposed as a JS prototype or a C/C++ header file. There is no traditional plugin host concept in the API. Instead, the Controller hosts the Processor directly, and all interaction between the WAM and application code happens through the Controller. This resolves synchronization issues present, for instance, in VST3 (G-02).
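To make the event flow concrete, here is a minimal sketch of the Controller-to-Processor dispatch described above. The class and event names are illustrative only (the actual WAM wrapper API is not reproduced here), and the thread boundary is elided: in a real WAM the event crosses the datachannel asynchronously, whereas this sketch calls the Processor directly.

```javascript
// Hypothetical sketch: events posted by the Controller are parsed and
// translated into method invocations at the Processor end.
class Processor {
  constructor() { this.params = {}; }
  // Wrapper entry point: translate an incoming event into a method call.
  handleEvent(event) {
    switch (event.type) {
      case "setParameter":
        this.setParameter(event.key, event.value);
        break;
      // MIDI, patch, and custom message events would be dispatched here too.
    }
  }
  setParameter(key, value) { this.params[key] = value; }
}

class Controller {
  constructor(processor) { this.processor = processor; }
  // In a real WAM this crosses a thread boundary asynchronously;
  // here the Processor is invoked directly for illustration.
  postEvent(event) { this.processor.handleEvent(event); }
  setParameter(key, value) {
    this.postEvent({ type: "setParameter", key, value });
  }
}

const proc = new Processor();
const ctrl = new Controller(proc);
ctrl.setParameter("cutoff", 0.5);
console.log(proc.params.cutoff); // 0.5
```

Note how the Processor never calls back into the Controller: the one-directional flow keeps the DSP side free of synchronization concerns.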

Fig 1. WAM Architecture

The division of functionality between the parts is as follows. The Controller holds the state (e.g., parameter values, loading and saving them from/into patches), while the Processor implements the DSP (reflecting the parameter values as properly scaled synthesis parameters). The parameter space and audio+event I/O configuration may be declared as a JSON descriptor at initialization time, either on the Controller side or the Processor side (the latter is preferred). The format of the JSON descriptor is described in the Controller parameter handling section below. GUIs are outside the scope of this proposal, although they benefit from the routines defined in this document.
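For illustration, a descriptor of this kind might look as follows. The exact schema is defined in the Controller parameter handling section, so every field name below is an assumption, not part of the spec.

```javascript
// Hypothetical JSON descriptor declaring a parameter space and
// audio I/O configuration. Field names are illustrative only.
const descriptorJSON = `{
  "name": "MyWAM",
  "audio": { "inputs": 0, "outputs": 1 },
  "parameters": [
    { "id": "cutoff", "type": "float", "min": 0, "max": 1, "default": 0.5 },
    { "id": "resonance", "type": "float", "min": 0, "max": 1, "default": 0.1 }
  ]
}`;

// The Controller would parse the descriptor during initialization and
// build its parameter state from it.
const descriptor = JSON.parse(descriptorJSON);
console.log(descriptor.parameters.length); // 2
```

Declaring the descriptor on the Processor side keeps the parameter definitions next to the DSP code that consumes them, which is why the proposal prefers that option.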

WAM lifecycle

Fig. 2 shows a sequence diagram of a WAM’s lifecycle. First, setup(.) loads the mywam-processor.js script (which contains the DSP code in vanilla JS or in asm.js), creates the Processor instance, initializes it, and enters the runtime stage of the lifecycle. The runtime stage is ended by invoking terminate(.), which disconnects the virtual AudioNode from the WAA audio graph, disconnects the MIDI ports, and blocks the datachannel between the Controller and the Processor. The Processor is eventually disposed of by garbage collection. Runtime functionality is available through the set/getParameter, setPatch, postMidi, and postMessage functions, and through the onMessage callback invoked from the Processor side.
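The lifecycle above can be sketched as a Controller skeleton. The method bodies are stubs and the class name is hypothetical; in particular, the real setup(.) is asynchronous (it loads mywam-processor.js and initializes the Processor in another thread), whereas this sketch only tracks the lifecycle stage.

```javascript
// Hypothetical Controller skeleton tracing the lifecycle stages.
class MyWAMController {
  setup(processorScriptUrl) {
    // Real WAM: asynchronously load mywam-processor.js, create and
    // initialize the Processor, then enter the runtime stage.
    this.running = true;
  }
  // Runtime-stage functionality, relayed to the Processor as events:
  setParameter(key, value) { /* post a parameter event */ }
  getParameter(key) { /* return the stored parameter value */ }
  setPatch(patch) { /* load a set of parameter values */ }
  postMidi(msg) { /* post a MIDI event */ }
  postMessage(msg) { /* post a custom message */ }
  onMessage(msg) { /* callback invoked from the Processor side */ }
  terminate() {
    // Disconnect the virtual AudioNode and MIDI ports and block the
    // datachannel; the Processor is then left to garbage collection.
    this.running = false;
  }
}

const wam = new MyWAMController();
wam.setup("mywam-processor.js");
wam.setParameter("gain", 0.8);
wam.terminate();
console.log(wam.running); // false
```

After terminate(.), no further events reach the Processor, so no explicit destructor is needed on the DSP side.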

Fig 2. WAM lifecycle

