PROJECT TOOLS



i-Maestro Sound and Gesture Practice Training Tools


The i-Maestro Sound and Gesture Practice Training Tools are a set of technologies that provide teachers and students with innovative means in the field of music pedagogy. The toolbox consists of modular components that can be integrated into any pedagogical workflow and that favour music embodiment based on gesture-sound interaction.

Musical gesture

The Sound and Gesture Practice Training Tools allow a rethinking of gestures and movements in music practice. Gesture sensing technology, in conjunction with interactive real-time audio processing and gesture following, enables new forms of interaction and performance with recorded, live, or electronic sounds.

Students creating electronic sounds in real time with gestures accompanying a movie

Using gesture following, students and teachers can reinterpret a recorded sound or piece of music with free gestures, taking the perspective of either a listener or a performer. In a first step, a gesture is recorded while listening to a piece of music. In a second step, the music can be replayed, with the gestures varying the sound. Any kind of gesture can be employed, such as conducting patterns, instrumental gestures, or free choreography. Because it allows great freedom in the timing of musical phrases, this approach particularly helps students focus on musical phenomena such as rubato, cadenzas, and phrasing.
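The two-step record/replay idea can be sketched in a few lines of code. The snippet below is a minimal illustration, not the actual i-Maestro implementation: it assumes (as a strong simplification) that a gesture has been reduced to a series of beat timestamps, pairs them with music time during the recording step, and then maps a live gesture time to a music playback position by linear interpolation, so that slower or faster gestures stretch or compress the music.

```python
# Minimal sketch of the two-step "record then replay" idea, assuming
# gestures are reduced to beat timestamps (a strong simplification).

def record_phase(gesture_beats, music_beats):
    """Step 1: pair each recorded gesture beat with a music time."""
    assert len(gesture_beats) == len(music_beats)
    return list(zip(gesture_beats, music_beats))

def replay_position(pairs, live_time):
    """Step 2: map a live gesture time to a music playback position
    by linear interpolation between the recorded anchor points."""
    if live_time <= pairs[0][0]:
        return pairs[0][1]
    for (g0, m0), (g1, m1) in zip(pairs, pairs[1:]):
        if g0 <= live_time <= g1:
            frac = (live_time - g0) / (g1 - g0)
            return m0 + frac * (m1 - m0)
    return pairs[-1][1]

# Gesture beats at 0, 1, 2 s were recorded against music beats at 0, 2, 4 s.
pairs = record_phase([0.0, 1.0, 2.0], [0.0, 2.0, 4.0])
print(replay_position(pairs, 1.5))  # halfway between beats -> 3.0
```

A live gesture that arrives later than the recorded one simply maps to an earlier music position, which is exactly the freedom in phrase timing described above.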

Student reinterpreting a recording of Beethoven's 5th symphony

Accompaniment

Accompaniment with recorded or electronic sounds can create a musical context for a soloist voice, in music practice as well as in performance situations. Especially in individual practice, the accompaniment can provide a dynamic harmonic and rhythmic reference and/or counterpart, helping the student focus on a given aspect of their playing.

In this context, gesture following or score following allows a soloist's performance to be accompanied by a second voice or an orchestral part that follows the soloist's tempo variations and rubato. While score following is based on the symbolic music representation of the performed piece, gesture following takes the performer's gesture into account in its continuity and fine articulation, including anticipatory elements.

Student accompanied by an audio/video recording of her teacher using gesture-following

In collective situations, traditional instrument players can play together with an accompaniment of players performing recorded and electronic sounds using gesture-following.

Students performing with i-Maestro technology in a public recital

Recording and Inspection

The toolbox provides a set of tools for recording the sound and gestures of a musical performance. The captured sound and gestures can be visualised, sonified, and interactively browsed and "inspected".

Teacher and student in a violin class

Several analysed audio descriptors (such as pitch, energy, and timbre) can be visualised in alignment with the audio waveform and the captured gesture parameters. Advanced audio rendering allows "scrubbing" and browsing through these various representations of musical data. Visualisation as well as sonification of these parameters can create different perspectives on the recorded performance, enabling higher-level inspection. Specific aspects of sound and gesture can be magnified, such as the stiffness of a movement or difficulties in articulation or intonation.

Score following is used here to create a temporal correspondence between the recording and the score of the performed piece, allowing synchronised, aligned visualisation and inspection.

Technology and Components

Gesture Sensing System

The lightweight gesture sensing module is composed of a microprocessor and a wireless transmitter that condition and transmit the data of attached motion sensors capturing acceleration and rotation. The module can be attached to the body or held in the hand. For a violin player, the module is attached to the wrist and the sensors are mounted on the frog of the bow. The captured gesture data is wirelessly transmitted to a second module connected to the computer, where it can be recorded and processed in real time.
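On the computer side, the receiving module delivers raw sensor packets that must be decoded before processing. The sketch below is purely illustrative: the packet layout (six little-endian 16-bit integers) and the scale factors are assumptions for the example, not the actual i-Maestro wireless protocol.

```python
# Illustrative sketch (assumed packet layout, NOT the actual i-Maestro
# protocol): decode a packet carrying 3-axis acceleration and 3-axis
# rotation, each as a signed little-endian 16-bit integer.
import struct

def decode_packet(packet):
    """Unpack ax, ay, az, gx, gy, gz and scale to physical units
    (scale factors are assumptions chosen for the example)."""
    ax, ay, az, gx, gy, gz = struct.unpack("<6h", packet)
    accel = tuple(v / 1024.0 for v in (ax, ay, az))   # assumed g per LSB
    gyro = tuple(v / 16.4 for v in (gx, gy, gz))      # assumed deg/s per LSB
    return accel, gyro

# A module at rest, z-axis up, with a slow rotation around x:
raw = struct.pack("<6h", 0, 0, 1024, 164, 0, 0)
accel, gyro = decode_packet(raw)
print(accel, gyro)  # accel z-axis reads 1.0 g; gyro x about 10 deg/s
```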

Gesture sensing system mounted on a violin bow

Gesture Following

The gesture-following component allows for the recognition and alignment of gestures based on a set of recorded gestures. This technology can be used to control variable-speed audio playback through variations of previously recorded gestures. Further applications are the recognition and classification of violin bow strokes and other musical gestures.
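To illustrate the alignment principle, the sketch below uses dynamic time warping (DTW) on one-dimensional gesture traces. This is an assumed stand-in: the actual i-Maestro follower works in real time on multidimensional sensor data with a different (probabilistic) method, but DTW shows the core idea that the same gesture shape performed at a different speed still aligns at low cost.

```python
# Hedged sketch: dynamic time warping (DTW) as a stand-in for gesture
# alignment. A gesture replayed at a different speed keeps a low cost.

def dtw_cost(ref, live):
    """Minimal alignment cost between two scalar sequences."""
    n, m = len(ref), len(live)
    INF = float("inf")
    cost = [[INF] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(ref[i - 1] - live[j - 1])
            cost[i][j] = d + min(cost[i - 1][j],      # ref waits
                                 cost[i][j - 1],      # live waits
                                 cost[i - 1][j - 1])  # both advance
    return cost[n][m]

# A half-speed copy of the gesture still aligns with zero cost:
ref = [0, 1, 2, 3, 2, 1, 0]
slow = [0, 0, 1, 1, 2, 2, 3, 3, 2, 2, 1, 1, 0, 0]
print(dtw_cost(ref, slow))  # 0.0 -- identical shape, different speed
```

Tracing back through the cost matrix would yield the warping path itself, i.e. where the live gesture currently sits inside the reference, which is what drives the variable-speed playback.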

Score Following

The score-following component aligns a captured performance in real time to a symbolic representation of the performed score. This technique can be used for automatic accompaniment, where a precomposed or recorded accompaniment is synchronised in real time with the performance. In i-Maestro, the alignment is also used to create a correspondence between interactive graphical representations of the score and recorded sounds and gestures.
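The essence of score following can be shown with a deliberately simplified sketch: the score is a list of expected MIDI pitches, the performance is a stream of detected pitches, and the follower advances its position only when the next expected note is heard, ignoring repeats and spurious detections. Real score followers are far more robust than this.

```python
# Deliberately simplified sketch of score following: advance through a
# symbolic note list as expected pitches are detected in the stream.

def follow(score, detected, tolerance=0):
    """Return, for each detected pitch, the current score position."""
    pos, trace = 0, []
    for pitch in detected:
        if pos < len(score) and abs(pitch - score[pos]) <= tolerance:
            pos += 1          # expected note heard: advance
        trace.append(pos)     # otherwise hold the current position
    return trace

score = [60, 62, 64, 65]            # C D E F as MIDI note numbers
detected = [60, 60, 62, 61, 64]     # a repeat and a spurious 61
print(follow(score, detected))      # [1, 1, 2, 2, 3]
```

The position trace is exactly what an accompaniment engine needs: it tells the system, at every detection, how far the performer has progressed in the score.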

Audio Analysis and Visualisation of Sound and Gesture Data

The toolbox includes a set of components for the extraction of audio descriptors such as pitch and loudness, as well as timbre parameters such as noisiness and brilliance. These descriptors can be visualised and sonified in conjunction with movement parameters, allowing for detailed inspection of recorded performances.
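Two of the simplest frame-wise descriptors can be sketched directly. The example below computes RMS energy (a loudness proxy) and zero-crossing rate (a crude noisiness cue) per frame; these are assumed illustrative choices, as the actual toolbox extracts richer descriptors such as pitch and brilliance.

```python
# Sketch of two simple frame-wise audio descriptors: RMS energy as a
# loudness proxy and zero-crossing rate as a crude "noisiness" cue.
import math

def frames(signal, size, hop):
    """Split a signal into overlapping analysis frames."""
    return [signal[i:i + size] for i in range(0, len(signal) - size + 1, hop)]

def rms(frame):
    return math.sqrt(sum(x * x for x in frame) / len(frame))

def zero_crossing_rate(frame):
    crossings = sum(1 for a, b in zip(frame, frame[1:]) if a * b < 0)
    return crossings / (len(frame) - 1)

# A pure 440 Hz sine at 44.1 kHz: steady energy, few zero crossings.
sine = [math.sin(2 * math.pi * 440 * n / 44100) for n in range(512)]
for f in frames(sine, 256, 128):
    print(round(rms(f), 3), round(zero_crossing_rate(f), 3))
```

Plotting such per-frame values against the waveform gives exactly the kind of aligned visualisation described above.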

Interactive Audio Processing

A set of interactive audio processing tools has been integrated into the toolbox. The Modular Audio Processing Framework allows for the rapid composition of audio synthesis and transformation components (sound generators, filters, audio effects), as well as the creation of mappings for their real-time control by gestures or extracted audio parameters. For variable-speed playback and the exploration of recordings through high-quality "scrubbing" and freezing, an advanced phase vocoder and a granular synthesis module have been included in the toolbox.
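The principle behind granular variable-speed playback can be sketched compactly: short windowed grains are read from the source at one rate and overlap-added into the output at another, changing duration without changing pitch. This is a bare-bones illustration under assumed parameters, far simpler than the toolbox's actual phase vocoder and granular modules.

```python
# Bare-bones sketch of granular time-stretching by overlap-add of
# short Hann-windowed grains (assumed grain/hop sizes for illustration).
import math

def stretch(signal, factor, grain=256, hop=64):
    """Stretch `signal` by `factor` (>1 = slower) without changing pitch:
    grains are read at hop/factor spacing but written at hop spacing."""
    out_len = int(len(signal) * factor)
    out = [0.0] * (out_len + grain)
    win = [0.5 - 0.5 * math.cos(2 * math.pi * n / grain) for n in range(grain)]
    write = 0
    while write < out_len:
        read = int(write / factor)            # slower read position
        for n, x in enumerate(signal[read:read + grain]):
            out[write + n] += x * win[n]      # overlap-add the grain
        write += hop
    return out[:out_len]

tone = [math.sin(2 * math.pi * 220 * n / 8000) for n in range(2000)]
slow = stretch(tone, 2.0)   # twice as long, same pitch
print(len(slow))            # 4000
```

Driving `factor` from a gesture parameter in real time yields exactly the "scrubbing" and freezing behaviour described above (a factor approaching infinity freezes on one grain).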

The Sound and Gesture Lab Prototype Application

The above technology components have been integrated into a prototype application that allows experimentation with teachers and students in a pedagogical context. The prototype application represents an open framework for the exploration of these technologies in music pedagogy and also provides a set of predefined ready-to-use scenarios with simple instructions.

Screenshot of the Sound and Gesture Lab prototype application
ADDITIONAL RESOURCES
VIDEO 1

[JPO2.mp4, 2'33]

Forbidden Planet,
Paris, 2006/2007.

This video shows three students participating in a creative project using i-Maestro technology. While viewing a scene from the science-fiction movie "Forbidden Planet" without its soundtrack, they improvise the musical environment with their gestures, controlling electronic treatments using sensors. The pedagogical roadmap of this project was created in collaboration with teachers and students from multiple schools. Since the movie is based on Shakespeare's play "The Tempest", the French and English teachers gave special lessons focusing on the characters. A participating music teacher also gave special lessons on electronic music.

The project included the following steps:

  • Study early electronic music and build electronic musical processes
  • Study various musical gestures (both instrumental and free gestures)
  • Put the students in the position of a movie-theatre pianist from the silent-film period: project scenes of the movie without the soundtrack, and ask student teams to rebuild the soundscape of each scene using sensors controlling audio and musical processes.
  • Give the resulting sound files to fine-art teachers, without showing the images from which they were made, and ask the students to create images for those sounds.
  • Hold an award ceremony at the prestigious Lycée Henri IV in June 2007 to reward the most innovative relationship between sound and image
  • Analyse the results with students and teachers:
    • Is there a link between the original sound and the sound created by the students?
    • Is there a link between the original images and the images created by the students?
    • Is there a link between the original relationship between sound and image and the relations made by the students?
    • Is there any information that survived all those transformations, preserved by the students consciously or unconsciously?


VIDEO 2

[Berenice2.mov, 1'33]

Beethoven 5th symphony
Atelier des Feuillantines, 2007/2008.

This experiment followed a first one using the gesture follower and the sensor module in a pedagogical context, namely a music theory class. That first attempt targeted a very well defined gesture: the beating of 3/4 and 4/4 metres. The prerequisite skills were both theoretical and physical. This second experiment was conducted by a student with dance experience, in the context of a creative project involving 15- to 16-year-old pupils and their teacher. In the experiment, the student improvises gestures and dance movements while listening to the second movement of Beethoven's fifth symphony. The improvisation scheme tries to map expressive gestures onto the phrasing perceived in real time. In a first step, the sensor values are recorded while the sound file is played normally. In a second step, the recording is processed using a granular synthesis algorithm: the sound is slowed down and sped up following the gesture follower's output. While continuous gestures are easily followed, a clear "break" between different "continuities" allows the student to "jump" to a different part of the recording, after having decided on the segmentation of the music during the first "learning" phase.


VIDEO 3

[Demo_follower.mov, 2'23]

[Demo_follower2.mov, 1'34]

[Berio-premiere-version.mp4, 0'59]

[Berio-deuxieme-version.mp4, 1'13]

Luciano Berio's "Béla" duet
Public audition, French Senate, June 30, 2007, with music students of the Atelier des Feuillantines

This work is based on the exploration of a timbral space from ordinario to sul tasto. While one of the players uses a given technique or playing mode, the second seems to play a complementary sound. The two violinists seem to control an additive synthesis process by sharing a common spectrum, using a subset of Bartók's favourite playing modes.

In this experiment, a violin student uses the gesture follower in combination with the augmented violin. The applied sound processing technique is variable-speed sound playback controlled by the student's gesture. In parallel, a movie of the teacher playing the second voice is also controlled by the student's gesture. The piece is played twice, with different rubato.


VIDEO 4

[direction.mp4, 4'19]

Second movement of Chopin's concerto in F minor.
Public audition, French Senate, June 30, 2007, with music students of the Atelier des Feuillantines

Based on a nocturne-like compositional scheme, this excerpt of Chopin's concerto repeatedly exposes a theme with a growing number of ornaments at each occurrence. The timing and phrasing are constantly rubato, forcing the conductor to follow the soloist closely and adapt to the varying tempo.

In this public audition context, the students use the gesture follower in combination with a gesture capture system (the same technology as the augmented violin) containing a set of sensors (accelerometers and gyroscopes). In a first step, the movements of the conductor are recorded and synchronised with the audio recording of the orchestra. In a second step, the recording is controlled in real time by the conducting student, who follows the pianist. The gesture controls variable-speed sound playback. The overall musical quality depends on both sets of skills: the conductor may vary his beating gesture, smoothly transforming the rhythmic frame. This experiment follows an earlier one conducted in a solfège lesson with the same students, based on an ear-training exercise using the same technology.


VIDEO 5

[canons-performance.mp4, 10'36]

Using i-Maestro technology in a creative context (1): A canon between gesture and sound.
Public audition, French Senate, June 30, 2007, with music students of the Atelier des Feuillantines

Each student of a solfège class created a musical sequence based on his name, following the method of Bach and other composers. The creation involves several students: three instrumentalists (piano, violin, and cello), two students using sensor modules, and one student with a microphone.

The work is semi-improvised, following these steps:

  • An instrumentalist improvises on a sequence composed from the name of another student
  • The student holding the microphone captures the sound of this improvisation
  • A student with sensors controls a loop-replay and treatment process by imitating the gestures and movements of the instrumentalist whose sequence was recorded
  • The instrumentalist, hearing how his sound is altered by the sensor-controlled movements, tries to imitate the modified sequence, which is recorded again, and so on.
  • The process is iterated over each possible combination of instrument players and sensor players.

The pedagogical issue: The student holding the sensors has to develop accurate observation and some kind of analysis of the movement he is going to imitate. This situation may lead to better imitation of the teacher showing the right movements in a regular lesson. The student playing his instrument must focus his perception on the separation between his original sound and the treatment applied to it, developing ear-training-like skills. After having perceived a given effect, having to imitate it allows him to explore his instrument's capabilities in a way not tied to a given piece, yet in an organised manner.

The musical issue: The particular combination of imitative rules, improvisation, and gestures among the students leads to a converging musical intention. After a number of interactions between the students, an interesting musical convergence arises.


ADDITIONAL VIDEOS

[Pia-Chopin 17-12-01.mp4, 2'26]

Using i-Maestro technology in a creative context (2): Showing structural aspects of a composition aligned with its real time performance.
Public audition, French Senate, June 30, 2007, with music students of the Atelier des Feuillantines

The E minor prelude Op. 28 by Frédéric Chopin is based on a specific gesture involving gravity and adherence. Each hand seems to mimic a different slow falling movement: the right hand, much slower than the left, hesitates inside a small descending interval. The left hand repeatedly plays chords, sustaining certain notes while others move downwards. The sustained notes are then drawn down to a climax point corresponding to the dominant.

A visual rendering of both the structure of the work and its interpretation was created using the video projection of an image of Chopin's draft of the piece. At the beginning of the performance, this image appears completely covered with dark water, making the manuscript invisible. While the student plays, the pitch and amplitude, analysed in real time, control the movement of the water. As the water moves, it progressively uncovers the manuscript, making it completely readable at the climax of the piece. Depending on how the prelude is played, the movement of the water uncovers the manuscript in different ways and at different speeds. This poetic form of restitution illustrates a compositional phenomenon revealed by the player's gesture. The prelude is performed by a student at the end of their second year of piano classes.


Sound and Gesture Flyer

Download the Sound and Gesture Flyer (pdf)

Quick Guide

Quick Guide to the S&GLab Scenarios (pdf)