Synth
The Synth element is a container for Voice elements. It is used to respond to incoming events from virtual keyboards or via MIDI. A Synth element can contain multiple Voice elements so that different notes trigger different sounds. If the Synth has two Voice elements, the first incoming event triggers the first Voice element and the second event the second Voice element. The third event uses the first Voice again, the fourth event the second Voice, and so forth. The Synth element does not make any sound until it receives an incoming event. If it contains Envelope elements controlling a GainNode, it uses them to trigger sounds. If there are no Envelope elements inside the Voice, it turns the gain of the Voice element on and off when receiving Note On and Note Off events. The Synth element sends the incoming MIDI notes to all subsequent elements with the "follow" attribute set to "MIDI", which makes it possible to control the frequency of an OscillatorNode to match the played MIDI note.
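The round-robin voice allocation described above can be sketched in plain JavaScript. This is an illustrative model only, not WebAudioXML's actual implementation; the class and method names are made up for the example:

```javascript
// Minimal sketch of round-robin voice allocation, as described above:
// incoming events cycle through the Voice elements in order.
// SynthSketch and noteOn are hypothetical names for illustration.
class SynthSketch {
  constructor(voices) {
    this.voices = voices; // array of Voice elements (here, just labels)
    this.next = 0;        // index of the Voice that handles the next event
  }
  noteOn(note) {
    const voice = this.voices[this.next];
    this.next = (this.next + 1) % this.voices.length; // wrap around
    return voice;
  }
}

const synth = new SynthSketch(["VoiceA", "VoiceB"]);
console.log(synth.noteOn(60)); // VoiceA
console.log(synth.noteOn(62)); // VoiceB
console.log(synth.noteOn(64)); // VoiceA again
```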
The Synth element can have the following attributes:
The "follow"-attribute is the key to making the Synth react to events. It is specified as three comma- or space-separated values:
- The CSS-selector used to target the object to listen for
- The event name used by the target object
- The name of the property on the event object sent to the target object's event listener
index.html:
...
<webaudio-keyboard keys="25" min="48" width="660" height="250"></webaudio-keyboard>
<script src="https://g200kg.github.io/webaudio-controls/webaudio-controls.js" ></script>
...
WebAudioXML:
...
<Synth follow="webaudio-keyboard, change, e.note"></Synth>
...
- The first follow-value in this example uses the CSS-selector "webaudio-keyboard" to target the webaudio-keyboard HTML element. (webaudio-keyboard is a custom HTML element in the webaudio-controls library.)
- The second value is set to "change" so the Synth is triggered by the webaudio-keyboard element's change-event.
- The third value is set to "e.note" (it's possible to use either "e" or "event" as the keyword) to read the "note"-property of the event sent to the event listener. This means that the example above results in the following JavaScript being executed by WebAudioXML:
document.querySelector("webaudio-keyboard").addEventListener("change", e => {
  webAudioXML.querySelector("Synth").play(e.note);
});
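The three parts of a "follow" value can be split with a simple pattern. The sketch below shows one way to do it, assuming only what the description above states (comma- or space-separated values); the function name is illustrative, not part of WebAudioXML's API:

```javascript
// Sketch: splitting a "follow" value into its three parts
// (comma- or space-separated, per the description above).
// parseFollow is a hypothetical helper for illustration.
function parseFollow(follow) {
  const [selector, eventName, prop] = follow.split(/[\s,]+/).filter(Boolean);
  return { selector, eventName, prop };
}

console.log(parseFollow("webaudio-keyboard, change, e.note"));
// { selector: "webaudio-keyboard", eventName: "change", prop: "e.note" }
```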
The "voices" attribute sets the maximum number of Voices of the Synth. The more Voices, the more CPU is required.
Controls the overall output volume of the Synth. It can be set as a dB or power value.
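For reference, a dB value maps to a linear gain factor with the standard formula gain = 10^(dB/20). The sketch below assumes WebAudioXML applies this conventional conversion; the function name is illustrative:

```javascript
// Standard conversion from decibels to a linear gain factor.
// Assumption: a dB volume setting maps to GainNode.gain this way.
function dBToGain(db) {
  return Math.pow(10, db / 20);
}

console.log(dBToGain(0));  // 1 (unity gain)
console.log(dBToGain(-6)); // ≈ 0.501 (roughly half amplitude)
```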
The attribute "portamento" can be set to any value to control the transition time for oscillator frequency changes. This makes it possible for OscillatorNodes to glide between notes. The portamento attribute is inherited by all child nodes but only affects the transition time for frequency changes on OscillatorNodes.
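A Synth with portamento might look like the sketch below. The attribute value is illustrative; its unit follows the document's timeUnit setting:

```xml
<Synth follow="webaudio-keyboard, change, e.note" portamento="100">
  <OscillatorNode type="sawtooth">
    <frequency follow="MIDI"></frequency>
  </OscillatorNode>
</Synth>
```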
<?xml version="1.0" encoding="UTF-8"?>
<Audio version="1.0" timeUnit="ms">
<Synth follow="webaudio-keyboard, change, e.note" voices="8">
<Voice>
<Chain>
<OscillatorNode type="sawtooth">
<frequency follow="MIDI"></frequency>
</OscillatorNode>
<GainNode>
<gain>
<Envelope adsr="100, 200, 50, 200" max="1"></Envelope>
</gain>
</GainNode>
</Chain>
</Voice>
</Synth>
</Audio>
This example shows an 8-note polyphonic Synth with one Voice element used to make the sound for all events received by the Synth (read more about the Voice element on its separate page). The "follow"-attribute is set to listen to the event "change" on the webaudio-keyboard HTML element. It reads the "note"-property of the event object to retrieve the MIDI note played by the keyboard. This setup uses the webaudio-keyboard element provided by webaudio-controls. If MIDI in is activated for the webaudio-keyboard, it will be passed into WebAudioXML as well.
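When the frequency element follows "MIDI", the incoming note number has to be turned into a frequency in Hz. The standard equal-temperament formula is shown below; it is presumably what WebAudioXML applies, and the function name is illustrative:

```javascript
// Standard equal-temperament conversion: MIDI note 69 is A4 = 440 Hz,
// and each semitone multiplies the frequency by 2^(1/12).
function midiToFrequency(note) {
  return 440 * Math.pow(2, (note - 69) / 12);
}

console.log(midiToFrequency(69)); // 440
console.log(midiToFrequency(60)); // ≈ 261.63 (middle C)
```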
Please follow my research journey at http://hans.arapoviclindetorp.se and https://www.facebook.com/hanslindetorpresearch/
- Collaborative music-making: special educational needs school assistants as facilitators in performances with accessible digital musical instruments (Frontiers in Computer Science 2023)
- Playing the Design: Creating Soundscapes through Playful Interaction (SMC2023)
- Accessible sonification of movement: A case in Swedish folk dance (SMC2023)
- Evaluating Web Audio for Learning, Accessibility, and Distribution (JAES2022)
- Audio Parameter Mapping Made Explicit Using WebAudioXML (SMC2021)
- Putting Web Audio API To The Test: Introducing WebAudioXML As A Pedagogical Platform (WAC2021)
- Sonification for everyone everywhere – Evaluating the WebAudioXML Sonification Toolkit for browsers (ICAD2021)
- WebAudioXML: Proposing a new standard for structuring web audio (SMC2020)