Example : JavaScript Web Audio Module 2.0 Processor with Plugin Automation

Specifications :

In this example, we will show how to automate the parameters of a plugin. We will automate the tone parameter of the Big Muff plugin: the tone will ramp from its minimum value up to its maximum value and, at the halfway point of the track, ramp back down from the maximum to the minimum.

Prerequisites :

The prerequisites for this example are the same as for example 3. We will only add the event handlers for the automation to the processor.

Simulating the automation :

We will retrieve the plugin's parameter information in order to add automation to it, and then use this information to simulate an up-and-down automation of the tone parameter.

// automation.js
import {normalizePoints} from "./utils.js";

export default async function applyAutomation(node, plugin, duration) {
    // Retrieve the descriptors (range, label, id, ...) of all the parameters exposed by the plugin.
    let info = await plugin._audioNode.getParameterInfo();

    // The tone parameter of the Big Muff is the fifth entry of the returned object.
    const {minValue, maxValue, label, id} = Object.values(info)[4];
    console.log(minValue, maxValue, label, id);

    let parameterPoints = [
        {value: minValue, time: 0},
        {value: maxValue, time: duration/2},
        {value: minValue, time: duration}
    ];
    let normalizedPoints = normalizePoints(parameterPoints, minValue, maxValue, 0.5, duration, 0.01);

    // Turn every normalized point into a WAM automation event; the times are expressed
    // on the clock of the host AudioContext (audioCtx, created in index.js).
    let events = [];
    for (let i = 0; i < normalizedPoints.length; i++) {
        events.push({type: 'wam-automation', data: {id: id, value: normalizedPoints[i].value}, time: audioCtx.currentTime + normalizedPoints[i].time});
    }
    node.scheduleEvents(...events);
}

The parameterPoints array contains 3 points for the tone parameter: one at the beginning of the song with the minimum value, a second at the halfway point with the maximum value, and the last one at the end of the song back at the minimum value. We then use a function to normalize the points, that is, to fill in the values between the user's points; to do this, we compute the linear function through each pair of successive points.

To achieve this, we use these two simple functions :

// utils.js (imported in automation.js)

// From the linear function through points a and b, get the value of the parameter at time t.
function getXFromY(a, b, t) {
    let gradient = (b.value - a.value)/(b.time - a.time);
    let intercept = b.value - gradient * b.time;
    return gradient * t + intercept;
}

export function normalizePoints(points, minValue, maxValue, defValue, duration, step) {
    // If there is no point defined by the users.
    if (points.length === 0) {
        points.push({value: defValue, time: 0});
    }
    let firstPoint = points[0];
    let lastPoint = points[points.length-1];

    // If the first user point isn't set at the beginning of the audio.
    if (firstPoint.time !== 0) {
        points.unshift({value: defValue, time: 0});
    }
    // If the last user point isn't set at the end of the audio.
    if (lastPoint.time !== duration) {
        points.push({value: lastPoint.value, time: duration});
    }

    let normalizedPoints = [];
    let pointIndex = 0;
    for (let t = 0; t < duration; t += step) {
        if (t > points[pointIndex+1].time) {
            pointIndex++;
        }
        let valueAtT = getXFromY(points[pointIndex], points[pointIndex+1], t);
        // If two successive points share the same time, the interpolation is undefined (NaN); reuse the previous value.
        if (isNaN(valueAtT)) {
            valueAtT = normalizedPoints[normalizedPoints.length-1].value;
        }
        normalizedPoints.push({value: valueAtT, time: t});
    }
    // Add the last point.
    normalizedPoints.push(points[pointIndex+1]);
    return normalizedPoints;
}

With this function, we can provide just 3 points and, with a precision of one step (here 0.01 second), turn them into a dense list of normalized points. Once the points are normalized, we can send them to the processor and queue them as events thanks to the Web Audio Modules API.
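
For instance, here is a quick sketch (not part of the example files) showing how three points over a hypothetical 10-second track, with a parameter ranging from 0 to 1, expand into a dense list of points spaced 10 ms apart :

// Quick sketch: expanding three points over a hypothetical 10-second track.
const demoPoints = [
    {value: 0, time: 0},
    {value: 1, time: 5},
    {value: 0, time: 10}
];
const dense = normalizePoints(demoPoints, 0, 1, 0.5, 10, 0.01);
console.log(dense.length); // about a thousand intermediate points plus the final one
console.log(dense[250]);   // roughly {value: 0.5, time: 2.5}, halfway up the first ramp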

Connecting the plugins to the WAM events :

To let the plugins handle the parameter events coming from the processor, we need to connect them to the Web Audio Modules event system.

// index.js

(async () => {
    // Keep the audio context suspended while the host and the automation are set up.
    audioCtx.suspend();

    /* Code of the host
       ...
    */

    // Route the events emitted by our node to both plugin instances.
    node.connectEvents(pluginInstance1._audioNode.instanceId);
    node.connectEvents(pluginInstance2._audioNode.instanceId);

    // Schedule the tone automation (here on pluginInstance2) over the whole duration of the track.
    await applyAutomation(node, pluginInstance2, audioBuffer.duration);

    /*
     ...
    */

})();

After connecting the plugins, we can apply the automation by emitting events to the processor. Each event is composed of a type, the parameter identifier, the parameter value, and a time.
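
Concretely, each element pushed into the events array has the following shape (the identifier and the values below are hypothetical) :

// One automation event (hypothetical values).
{
    type: 'wam-automation',              // kind of WAM event
    data: {id: 'tone', value: 0.42},     // parameter identifier and value
    time: audioCtx.currentTime + 12.3    // absolute time on the audio context clock
}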

Now that the automation has been scheduled, we need to be careful with the audio context behaviour: since the events are scheduled against the audio context clock, we must control exactly when the context is suspended and when it is resumed.
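
A minimal sketch of this pattern, assuming a hypothetical playButton element in the host page: the context stays suspended while the automation is scheduled, and its clock only starts running when playback begins.

// index.js (sketch, assuming a hypothetical <button id="playButton"> in the host page)
const playButton = document.querySelector("#playButton");
playButton.onclick = async () => {
    if (audioCtx.state === "suspended") {
        // The audio context clock starts running here, so the scheduled times line up with playback.
        await audioCtx.resume();
    } else {
        // Suspending freezes audioCtx.currentTime, which also pauses the scheduled automation.
        await audioCtx.suspend();
    }
};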

Handling events in the processor :

We first need to enable the automation events on the WAM node. To achieve this, simply add this line to the audio node constructor :

this._supportedEventTypes = new Set(['wam-automation']);

To see all the supported event types, refer to the WAM API documentation.
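
In context, the audio node might look roughly like this (a sketch only; the class name and constructor arguments follow the previous examples and may differ in your code) :

// wam-audio-player-node.js (sketch; WamNode comes from the WAM SDK, as in the previous examples)
export default class MyWamNode extends WamNode {
    constructor(module, options) {
        super(module, options);
        // Declare which WAM event types this node accepts; here, only parameter automation.
        this._supportedEventTypes = new Set(['wam-automation']);
    }
}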

The processor needs to handle the scheduled events. To do that, we add the following methods to the processor code :

// wam-audio-player-processor.js

class MyWamProcessor extends ModuleScope.WamProcessor {

    async _onMessage(e) {
        // Let the base WamProcessor handle its own messages (including the scheduled events).
        await super._onMessage(e);
        if (e.data.audio) {
            // Decoded audio channels sent by the main thread.
            this.audio = e.data.audio;
        } else if (typeof e.data.position === "number") {
            // The position is given in seconds; convert it to a sample index.
            this.playhead = e.data.position * sampleRate;
        } else if (e.data.restart) {
            this.playhead = 0;
        }
    }

    /**
     * Called when a scheduled event is due: re-emit it so that
     * the connected plugins receive it.
     */
    _processEvent(event) {
        this.emitEvents(event);
    }

    /**
     * The base WamProcessor expects this method to exist; there is nothing
     * more to do here, the automation events are already handled above.
     */
    _process() {}
}

By calling _onMessage on the superclass, the processor lets the base WamProcessor schedule the events emitted from the main thread. When a previously scheduled event fires during the song, _processEvent emits it to the WAM group, and the plugins connected to that group receive the event.
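
For reference, the custom messages handled by _onMessage above are simply posted from the main thread through the node's MessagePort; here is a sketch with illustrative values :

// index.js (sketch): the main-thread counterpart of _onMessage above
node.port.postMessage({audio: decodedChannels}); // decoded channel data of the track (hypothetical variable)
node.port.postMessage({position: 42});           // move the playhead to 42 seconds
node.port.postMessage({restart: true});          // rewind the playhead to the beginning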

Conclusion :

You have seen how to use parameter automation in a WAM plugin. For further details, please see the full code in the GitHub repository. In the next example, we will see how to implement a simple VU meter.