Evaluation of Affective User Experience

affective/physiological computing · computer vision · visualization

This research program originated in an NSERC/Canada Council for the Arts New Media Initiative, continued in collaborations with Jordan Deitcher, Hal Myers (Thought Technology), Stephen McAdams (Schulich School of Music) and Robert Zatorre (Montreal Neurological Institute), and is now being funded by a Networks of Centres of Excellence on Graphics, Animation, and New Media (GRAND).

Participants include Regan Mandryk (Saskatoon), Alissa Antle (SFU), Magy Seif El-Nasr (SFU), Bernhard Riecke (SFU), Gitte Lindgaard (Carleton), Heather O’Brien (UBC), Karon MacLean (UBC), Frank Russo (Ryerson), and Diane Gromala (SFU).

Project goal

The goal of this project is to develop and validate a suite of reliable, valid, and robust evaluation methods, both quantitative and qualitative, objective and subjective, for computer game, new media, and animation environments, addressing the unique evaluation challenges these technologies pose.

Our work at McGill spans biological and neurological processes involved in human psychological and physiological states, pattern recognition of biosignals for automatic psychophysiological state recognition, biologically inspired computer vision for automatic facial expression recognition, physiological responses to music, and stress/anxiety measurement using physiological data.

Biosignals analysis for automatic psychophysiological state recognition

Our interest in the use of biosignals began with explorations of live emotional mapping to a multimedia display, providing an external and directly accessible manifestation of the individual’s emotions.

Applications include augmented theatrical performance, therapy, and videoconference communication.
The representation may be either connected with social conventions (e.g., anger shown as red, calm as waves) or more abstract (storm clouds, gentle sounds).
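As a concrete illustration, such a mapping can be as simple as interpolating a display colour from normalized biosignal features. The sketch below is hypothetical: the feature names, resting ranges, and colour convention are assumptions for illustration, not the project's actual mapping.

```python
# Illustrative sketch (not the project's actual pipeline): map two
# hypothetical biosignal features -- skin conductance level (SCL) and
# heart rate (HR) -- to an RGB colour, following the convention that
# high arousal tends toward red and calm toward blue.

def normalize(value, low, high):
    """Clamp value into [low, high] and scale to [0, 1]."""
    return max(0.0, min(1.0, (value - low) / (high - low)))

def arousal_to_rgb(scl_microsiemens, hr_bpm):
    """Combine normalized SCL and HR into a single arousal score,
    then interpolate from blue (calm) to red (excited)."""
    # Resting/active ranges below are assumed for illustration only.
    scl = normalize(scl_microsiemens, 2.0, 20.0)
    hr = normalize(hr_bpm, 60.0, 120.0)
    arousal = 0.5 * (scl + hr)
    red = int(255 * arousal)
    blue = int(255 * (1.0 - arousal))
    return (red, 0, blue)
```

In a live display, the resulting colour would drive the multimedia rendering each time a new window of sensor samples arrives.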

This work was also applied to characterizing individuals’ excitement and stress levels in activities ranging from opera singing to computer interaction.
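Characterizing excitement and stress from biosignals typically starts with per-window feature extraction before any classification. The following sketch is hypothetical (the threshold and peak heuristic are assumptions, not the project's actual feature set):

```python
# Illustrative feature-extraction sketch for psychophysiological state
# recognition (not the project's actual code). From one window of
# galvanic skin response (GSR) samples it computes two commonly used
# features: the mean conductance level and a count of response peaks.

def gsr_features(samples, peak_threshold=0.05):
    """Return (mean_level, peak_count) for one analysis window.

    A 'peak' is counted whenever the sample-to-sample increase
    exceeds peak_threshold -- a crude stand-in for proper skin
    conductance response detection.
    """
    mean_level = sum(samples) / len(samples)
    peaks = sum(
        1 for prev, cur in zip(samples, samples[1:])
        if cur - prev > peak_threshold
    )
    return mean_level, peaks
```

Feature vectors like these, accumulated over many windows, would then feed a pattern-recognition stage that labels the user's state.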

Related Publications (biosignals)

Psychophysiological and Neurological Correlates of Chilling Music

We applied biosignals analysis to study the characterization of intensely pleasurable feelings induced by music, comparable to other rewarding stimuli.

This work investigated:

  • The body’s physiological reactions to pleasurable music
  • Whether reward areas of the brain are activated in response to music
  • Whether music alone can trigger the release of rewarding neurochemicals

Related Publications (music & neuro)

  • Benovoy, M., Salimpoor, V.N., Longo, G., Zatorre, R.J., and Cooperstock, J.R.
    Pleasurable Affective State Recognition Using Non-Linear Feature Transformation, submitted to IEEE Transactions on Affective Computing.
  • Salimpoor, V.N., Benovoy, M., Longo, G., Larcher, K., Dagher, A., Cooperstock, J.R., and Zatorre, R.J. (2009).
    The Rewarding Aspects of Music Listening Involve the Dopaminergic Striatal Reward Systems of the Brain: An Investigation with [11C]Raclopride PET and fMRI,
    OHBM, San Francisco, June 18–23.
  • Salimpoor, V., Benovoy, M., Longo, G., Cooperstock, J.R., and Zatorre, R.J. (2009).
    The Rewarding Aspects of Music Listening are Related to Degree of Emotional Arousal,
    PLoS ONE, 4(10): e7487.

Recognition of Facial Expressions

Since biosignal sensors are not always appropriate, we also investigated video-based analysis of facial expressions.

Applications include treatment of autism spectrum disorder (ASD), in which individuals often struggle to express their own feelings or to recognize emotions in others. Feedback systems that recognize facial expressions can support such expressivity training.

Using recognition of five prototypical facial expressions (surprise, happy, serious, sad, disgust), we created an “emotional mirror” in which a character such as SpongeBob mimics the user’s expression in real time.
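A practical concern in such a real-time mirror is that per-frame classifier output can flicker between labels. One common remedy, sketched below with hypothetical names (the classifier itself and the sprite filenames are assumptions, not the project's implementation), is a short majority-vote window over recent predictions:

```python
# Minimal sketch of an "emotional mirror" display loop. The video-based
# expression classifier is assumed to exist upstream and to emit one
# label per frame; sprite filenames are hypothetical.
from collections import Counter, deque

EXPRESSIONS = ("surprise", "happy", "serious", "sad", "disgust")

class EmotionalMirror:
    """Smooth per-frame expression labels with a short majority-vote
    window so the mirrored character does not flicker between frames."""

    def __init__(self, window=5):
        self.recent = deque(maxlen=window)

    def update(self, label):
        """Feed one per-frame label; return the character frame to show."""
        if label in EXPRESSIONS:
            self.recent.append(label)  # ignore unrecognized labels
        if not self.recent:
            return "character_serious.png"  # neutral fallback at start
        majority, _ = Counter(self.recent).most_common(1)[0]
        return f"character_{majority}.png"
```

Each classifier output updates the window, and the character sprite shown is the majority label over the last few frames.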

Last update: 18 August 2010