Jean Bresson edited this page Jul 30, 2019 · 12 revisions

OM-Chant project

OM-Chant is a library for the control and integration of the Chant synthesiser in OpenMusic. This integration raises specific issues related to the "phrase-oriented" and continuous aspects of Chant control, and enables innovative approaches and sound-creation processes to be implemented in the computer-aided composition environment.

*(Figure: sonagram)*

About Chant

The Chant synthesiser was developed by Xavier Rodet and his team at IRCAM in the early 1980s, originally to simulate and reproduce sung-voice sounds, but also to create more abstract sounds using the same production model.

Chant implements the technique of "formant wave functions" (FOF, for Fonctions d'Ondes Formantiques in French). FOF synthesis models the vocal production system as an excitation signal and its response to a resonant system. The FOF synthesiser generates trains of small signals (simple sinusoids multiplied by an exponential envelope) at a given frequency (called the fundamental frequency, f0), each of which contributes a "formant" to the overall spectrum of the resulting signal (a FOF synthesiser is composed of several FOF generators in parallel, each producing one formant).
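The idea can be illustrated with a minimal numerical sketch, not the actual Chant implementation: each grain is a sinusoid at the formant centre frequency, shaped by a smooth attack and an exponential decay, and grains are triggered at the period 1/f0. All parameter values and the `attack`/`grain_dur` constants below are illustrative assumptions.

```python
import math

def fof_grain(t, fc, amp, bw, attack=0.003):
    """One FOF grain: a sinusoid at the formant centre frequency fc, shaped
    by a raised-cosine attack and an exponential decay tied to the bandwidth bw."""
    if t < 0:
        return 0.0
    env = math.exp(-math.pi * bw * t)        # wider bandwidth -> faster decay
    if t < attack:                           # smooth attack segment
        env *= 0.5 * (1.0 - math.cos(math.pi * t / attack))
    return amp * env * math.sin(2.0 * math.pi * fc * t)

def fof_train(duration, f0, formants, sr=44100, grain_dur=0.02):
    """Sum one train of grains per formant, triggered at period 1/f0."""
    n = int(duration * sr)
    out = [0.0] * n
    period = 1.0 / f0
    for fc, amp, bw in formants:
        onset = 0.0
        while onset < duration:
            start = int(onset * sr)
            for i in range(start, min(start + int(grain_dur * sr), n)):
                out[i] += fof_grain((i - start) / sr, fc, amp, bw)
            onset += period
    return out

# Two parallel formant generators with illustrative (fc, amp, bw) values
sig = fof_train(0.05, f0=110.0, formants=[(600.0, 1.0, 80.0), (1200.0, 0.5, 90.0)])
```

Each formant of the resulting spectrum comes from one such grain train; summing the trains gives the full signal.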

Formants are modulations of the spectrum that characterise vocal signals. They can be described with a few simple and intuitive parameters (central frequency, amplitude, bandwidth, etc.) which map directly onto the parameters of the FOFs.

The Chant synthesiser is initialised by connecting several synthesis or processing modules. These modules can be FOF generators, but also noise generators, sounds, or resonant filter banks. Control is performed by independent processes that periodically set and update the values of the different parameters of these modules.

A set of rules was defined and integrated in early implementations of the synthesiser, making it possible to obtain more "natural" sounds by computing or linking the value of some parameters to the evolution of others (for instance, the formant frequencies may vary slightly with the amplitude and the fundamental frequency). Standard controls such as jitter (parametrised random perturbations around a given parameter value) or vibrato (sinusoidal modulation) also add quality and realism to the synthesised sounds.
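Jitter and vibrato as described here can be sketched in a few lines of Python (an illustration of the two modulation types, not OM-Chant code; the rate and depth values are assumed defaults):

```python
import math
import random

def vibrato(f0, t, rate=5.0, depth=0.03):
    """Sinusoidal modulation of f0: 'depth' is a fraction of f0 (here 3%)."""
    return f0 * (1.0 + depth * math.sin(2.0 * math.pi * rate * t))

def jitter(value, depth=0.01, rng=None):
    """Random perturbation bounded by +/- depth * value (here 1%)."""
    rng = rng or random.Random()
    return value * (1.0 + depth * rng.uniform(-1.0, 1.0))

# A perturbed f0 curve sampled at 100 Hz over one second
rng = random.Random(0)
f0_curve = [jitter(vibrato(220.0, i / 100.0), rng=rng) for i in range(100)]
```

Applying both to the fundamental frequency, as above, is the typical case, but the same modulations can be applied to any control parameter.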

Chant was integrated in several environments and projects focusing on the control of sound synthesis. In Formes, high-level processes could be defined in a context integrating the control rules into advanced temporal structures. The Diphone software, more than ten years later, made it possible to combine static parameter blocks graphically in order to generate continuous phrases by interpolation, but at the same time abandoned the "rule-based" synthesis aspects. Very few possibilities remained for using Chant until its recent integration into OpenMusic.

OM-Chant

The OM-Chant library makes it possible to create Chant control-parameter data (formatted as SDIF files) in visual programs and compositional processes developed in OM; these files are then sent to the Chant synthesizer to produce sounds. The notion of synthesis events, inspired by a similar concept in the OMChroma framework, is used to define high-level control structures in the form of parameter matrices [parameters x components].

A synthesis event is a symbolic entity that can be created and manipulated in high-level compositional processes, and composed in temporal structures. Each synthesis event (or matrix) has a number of parameters (e.g. frequency, amplitude, bandwidth) defined for a number of "components" (typically, the different formants in a Chant synthesis process). Synthesis events have an "onset" (also called "action time") and a duration, allowing them to be situated in the time structure. As a result, each event produces at least two control frames, corresponding to the state of a module at the event onset and at its end (onset + duration).
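A hypothetical model of such an event, sketched in Python rather than in OM's visual language, shows the [parameters x components] matrix plus onset and duration, and the minimal two-frame rendering described above (the class name, field layout, and parameter values are all assumptions for illustration):

```python
from dataclasses import dataclass

@dataclass
class SynthesisEvent:
    """Hypothetical model of a synthesis event: an onset ('action time'),
    a duration, and a [parameters x components] matrix represented here
    as one parameter dict per component (e.g. per formant)."""
    onset: float        # seconds
    duration: float     # seconds
    components: list    # e.g. [{"freq": ..., "amp": ..., "bw": ...}, ...]

    def control_frames(self):
        """Minimal rendering: one frame at the event onset and one at its
        end, each carrying the full parameter matrix."""
        return [(self.onset, self.components),
                (self.onset + self.duration, self.components)]

# A FOF-like event with two formant components (illustrative values)
ev = SynthesisEvent(onset=0.0, duration=1.5, components=[
    {"freq": 609.0, "amp": 1.0, "bw": 78.0},
    {"freq": 1000.0, "amp": 0.45, "bw": 88.0},
])
frames = ev.control_frames()
```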

Different types (or classes) of Chant synthesis events have been defined to control the FOF module parameters (classes CH-FOF and CH-F0), the resonant filters (class CH-FLT) or the noise generators (CH-NOISE). CH-F0 and CH-NOISE are not matrices but scalar controllers.

Continuous Control

The Chant synthesis events created in an OM patch are translated into a continuous stream of SDIF frames controlling the synthesiser. This transfer from the discrete domain of composition to the continuous domain of sound synthesis is one of the major challenges and interests of the OM-Chant library.

The control and internal structure of the OM-Chant framework allow for a "continuous" approach to sound synthesis processes with Chant. While the compositional framework in OM is discrete and "event-based", Chant reads the control sequence to produce a monophonic "phrase" out of the specified states and variations of the parameters.

The evolution of the parameters can be controlled at different levels in the control processes:

  • Inside events: each "cell" in an event can be defined either as a constant parameter value or as an evolution of this parameter, using BPF objects. Every point in the BPF is then considered as a state in the control file. Several tools in the library make it possible to define such "continuous" behaviour using modulations such as jitter or vibrato. In the (frequent) case where several parameters change during the same event, and in order to reduce computation costs, it is possible to specify a common sample rate for the control sequence, at which all parameters in the event will be sampled.

  • Between events: because of the same "monophonic" characteristics, possible overlaps or gaps between events must be handled explicitly in the control process, in order to determine the parameter state sequence corresponding to a transition, a sequencing, or any other behaviour related to the overlapping intervals. A higher-order function in OM-Chant (ch-transitions) makes it possible to process successive pairs of events in a temporally-sorted list and generate a new sequence including user-defined transitions (e.g. silence between the events, cross-fading of the parameters, interpolation, or any other transition process).
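Both levels can be sketched numerically; the following Python is an illustration of the two ideas, not OM-Chant's actual BPF or ch-transitions implementation, and all function names and values are assumptions. Inside an event, every parameter BPF is sampled at one common control rate; between events, a cross-fade is one possible user-defined transition over the overlap.

```python
def sample_bpf(points, t):
    """Piecewise-linear interpolation in a break-point function
    given as a list of (time, value) pairs sorted by time."""
    t0, v0 = points[0]
    if t <= t0:
        return v0
    for t1, v1 in points[1:]:
        if t <= t1:
            return v0 + (v1 - v0) * (t - t0) / (t1 - t0)
        t0, v0 = t1, v1
    return v0

def sample_event(bpfs, onset, duration, control_rate=100.0):
    """Inside an event: sample every parameter BPF at one common control
    rate, yielding (time, {param: value}) control states."""
    frames = []
    for i in range(int(duration * control_rate) + 1):
        t = i / control_rate
        frames.append((onset + t, {p: sample_bpf(b, t) for p, b in bpfs.items()}))
    return frames

def crossfade(params_a, params_b, x):
    """Between events: one possible transition, blending the parameter
    states of two overlapping events as x goes from 0 to 1."""
    return {p: (1.0 - x) * params_a[p] + x * params_b[p] for p in params_a}

bpfs = {"freq": [(0.0, 600.0), (1.0, 800.0)],
        "amp":  [(0.0, 0.0), (0.2, 1.0), (1.0, 0.8)]}
frames = sample_event(bpfs, onset=0.0, duration=1.0, control_rate=10.0)
mid = crossfade({"freq": 800.0}, {"freq": 400.0}, 0.5)
```

Silence, interpolation, or any other transition process would replace `crossfade` with a different function over the same pair of parameter states.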

Further reading

Computer Music Journal 8(3), 1984, contains three reference articles about the CHANT project and the original work of Xavier Rodet and the Analysis/Synthesis team:

  • Rodet, X. Time-domain Formant-wave Function Synthesis
  • Rodet, X., Potard, Y., Barrière, J.-B. The CHANT Project : From the Synthesis of the Singing Voice to Synthesis in General
  • Rodet, X., Cointe, P. Formes: Composition and Scheduling of Processes

About the integration of Chant in OpenMusic:

More on this wiki: