Master 2013-2014
Internships in the SAR specialization
Motion Variation Tracking for Expressive Interaction Design


Site: Embodied Audio-Visual Interaction group
Location: Goldsmiths, University of London, New Cross, London SE14 6NW
Supervisors: Dr Baptiste Caramiaux, Prof. Atau Tanaka
Dates: 03/03/2014 to 03/08/2014
Remuneration: Yes, paid according to the regulations of Goldsmiths, University of London
Keywords: ATIAM track: Acoustics; ATIAM track: Music Informatics; ATIAM track: Signal Processing

Description

>> Introduction

Body movements and gestures are an efficient medium for non-verbal interaction, in one-to-one or one-to-many communication as well as in interaction with computers. While human-human interaction involves very rich, complex and nuanced gestures, the gestures involved in human-computer interaction remain rather simplistic (2-dimensional shapes, 3-dimensional postures, etc.). An interesting context in which the gestures involved in human-computer interaction are deliberately nuanced is musical performance with new interfaces for musical expression (NIME). In this particular context, gesture expressivity plays a predominant role. However, the often-noted drawbacks are, on the one hand, the idiosyncratic nature of the interaction design (based on a performer's personal gesture vocabulary) and, on the other hand, the heuristic approach to processing gestural data to extract expressive content.

In our research, the approach has been to design tools that capture the expressive content of gestures, defined as dynamic variations during gesture execution. We have developed tools able to recognize a gesture (which gesture is performed) and to track its temporal execution and variations (how the gesture is performed). These two concurrent aspects (recognition and adaptation) are critical for the applications considered.

As an example, an ongoing project tracks a pianist's live gesture variations between performances and uses this information to manipulate sound and effects. The project is a standalone application embedding movement analysis, machine learning and audio processing. Its application context is both pedagogical and performative.

>> Internship

The internship will focus on the implementation of a gesture variation tracking system based on a probabilistic model. An existing system [1] uses particle filtering for gesture recognition and for tracking gesture variations (such as speed, size, orientation, etc.). The system is template-based, which means that it relies on a single gesture example for recognition and variation estimation. While this system has been shown to be effective at estimating variations in simple 2-dimensional and 3-dimensional shapes, limitations appear with more complex gestures, such as those used in musical performance: variations are not isotropic (not equal in every dimension) and are highly dynamic.
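To make the template-based idea concrete, here is a minimal, illustrative Python sketch of particle filtering over a single gesture template. The three-dimensional state (phase, speed, scale), the noise parameters and all names are simplifying assumptions for exposition only; the actual system in [1] tracks a richer set of variations.

import numpy as np

def track_gesture(template, observations, n_particles=500, sigma_obs=0.1,
                  state_noise=(1e-4, 1e-3, 1e-3)):
    """Illustrative template-based particle filter.

    template:     (T, D) array, one recorded example of the gesture
    observations: iterable of (D,) input frames
    Each particle hypothesizes a state [phase, speed, scale]; the
    posterior mean of this state is the tracked gesture variation.
    """
    T = len(template)
    particles = np.tile([0.0, 1.0, 1.0], (n_particles, 1))
    weights = np.full(n_particles, 1.0 / n_particles)

    for obs in observations:
        # Propagate: advance each phase by its speed, diffuse the state.
        particles[:, 0] += particles[:, 1] / T
        particles += np.random.randn(n_particles, 3) * np.sqrt(state_noise)
        particles[:, 0] = np.clip(particles[:, 0], 0.0, 1.0)

        # Weight: compare the scaled template, sampled at each particle's
        # phase, with the incoming observation.
        idx = (particles[:, 0] * (T - 1)).astype(int)
        predicted = particles[:, 2:3] * template[idx]
        err = np.linalg.norm(predicted - obs, axis=1)
        weights *= np.exp(-0.5 * (err / sigma_obs) ** 2)
        weights /= weights.sum()

        # Resample when the effective sample size collapses.
        if 1.0 / np.sum(weights ** 2) < n_particles / 2:
            u = (np.arange(n_particles) + np.random.rand()) / n_particles
            particles = particles[np.searchsorted(np.cumsum(weights), u)]
            weights = np.full(n_particles, 1.0 / n_particles)

        # Posterior mean: estimated phase, speed and scale at this frame.
        yield weights @ particles

The anisotropy limitation mentioned above is visible here: a single scalar scale and isotropic state noise cannot capture variations that differ across dimensions.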

The goal of the internship is to implement various strategies for more robust and more accurate estimation of gesture variation in performance. The implemented techniques will be tested on existing databases and in live musical performances.

The first task will be the implementation of a variable-memory filter based on exponentially weighted averaging of the online estimates, which will make the estimates more robust to artefacts such as noise and lead to more accurate prediction.
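As a point of departure, the sketch below shows one possible form of such a filter: an exponentially weighted moving average whose forgetting factor adapts to the size of the innovation. The adaptation rule and every parameter name are hypothetical; choosing and validating the actual rule is part of the task.

import numpy as np

class VariableMemoryFilter:
    """Exponentially weighted moving average whose forgetting factor
    alpha is adapted online: long memory (small alpha) while the signal
    is stable, to suppress noise; short memory (large alpha) when the
    innovation jumps, to follow genuine variation changes."""

    def __init__(self, alpha_min=0.05, alpha_max=0.5, sensitivity=4.0):
        self.alpha_min, self.alpha_max = alpha_min, alpha_max
        self.sensitivity = sensitivity  # how fast alpha reacts to jumps
        self.mean = None
        self.var = 1e-6  # running variance of the innovation norm

    def update(self, x):
        x = np.asarray(x, dtype=float)
        if self.mean is None:  # first sample initializes the state
            self.mean = x.copy()
            return self.mean
        innovation = x - self.mean
        # Normalized innovation: large z signals an unexpected jump.
        z = np.linalg.norm(innovation) / np.sqrt(self.var)
        alpha = self.alpha_min + (self.alpha_max - self.alpha_min) * (
            1.0 - np.exp(-z / self.sensitivity))
        self.mean = self.mean + alpha * innovation
        self.var = (1 - alpha) * self.var + alpha * float(innovation @ innovation)
        return self.mean

In use, each per-frame variation estimate coming out of the tracker would be passed through update(), and the filtered value would feed the downstream mapping.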

A second task will be to implement a modified version of the Expectation-Maximization (EM) algorithm as a means to optimise the parameterization of the recognition algorithm, based on a small number of examples.
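For reference, the generic E/M alternation is sketched below on a diagonal-covariance Gaussian mixture. This is purely illustrative: the internship would adapt the same alternation to the parameters of the recognition model, not fit a plain mixture.

import numpy as np

def em_gaussian_mixture(X, k, n_iter=50, seed=0):
    """Generic EM for a k-component Gaussian mixture with diagonal
    covariances. X is an (n, d) array of feature vectors."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    mu = X[rng.choice(n, k, replace=False)]   # component means
    var = np.full((k, d), X.var(axis=0))      # diagonal variances
    pi = np.full(k, 1.0 / k)                  # mixing weights

    for _ in range(n_iter):
        # E-step: responsibilities r[i, j] = p(component j | x_i).
        log_p = -0.5 * (((X[:, None, :] - mu) ** 2 / var).sum(-1)
                        + np.log(var).sum(-1) + d * np.log(2 * np.pi))
        log_p += np.log(pi)
        log_p -= log_p.max(axis=1, keepdims=True)  # numerical stability
        r = np.exp(log_p)
        r /= r.sum(axis=1, keepdims=True)

        # M-step: re-estimate weights, means and variances in closed form.
        nk = r.sum(axis=0)
        pi = nk / n
        mu = (r.T @ X) / nk[:, None]
        var = (r.T @ X ** 2) / nk[:, None] - mu ** 2 + 1e-8
    return pi, mu, var

With few examples, as here, the M-step is where the "modified" version would intervene, e.g. by regularizing or tying parameters so the estimates do not overfit.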

The results from the internship are meant to lead to a conference and/or journal publication.

>> Requirements

• Theory: notions of machine learning and signal processing
• Software: Matlab, C++, a scripting language (Python, shell)
• The internship will take place at Goldsmiths, University of London

>> Context

This internship will take place in the Embodied Audio-Visual Interaction group in the Department of Computing, Goldsmiths, University of London (http://eavi.goldsmithsdigital.com).

It is funded by the European Research Council project MetaGesture Music (MGM) and supervised by Dr Baptiste Caramiaux and Prof. Atau Tanaka. The research project will be defined in the context of the larger MGM project, and the internship outputs must contribute, in publications and knowledge generated, to MGM.

The internship will be paid on the scale defined by the Goldsmiths internship program. Reasonably priced accommodation on the Goldsmiths campus is possible.

Bibliography

1. B. Caramiaux, F. Bevilacqua, A. Tanaka. "Beyond Recognition: Using Gesture Variation for Continuous Interaction." ACM CHI '13, alt.chi, pp. 2109-2118. Paris, France, 2013.
2. B. Caramiaux. "Studies on the Relationship between Gesture and Sound in Musical Performance." PhD thesis, Université Paris VI, Ircam - Centre Pompidou, 2012.
3. J. Françoise, N. Schnell, F. Bevilacqua. "A Multimodal Probabilistic Model for Gesture-Based Control of Sound Synthesis." Proceedings of the 21st ACM International Conference on Multimedia, ACM, 2013.