Jeremy R. Cooperstock - Director, Shared Reality Lab
Photo credit: M. Mostyn

My lab is broadly concerned with human-computer interaction technologies, emphasizing multimodal sensory augmentation for communication in both co-present and distributed contexts. Our research tackles the full pipeline of sensory input, analysis, encoding, data distribution, and rendering, as well as interaction capabilities and quality of user experience. Applications of these efforts include distributed training of medical and music students, augmented environmental awareness for the blind community, treatment of amblyopia (lazy eye), low-latency uncompressed HD videoconferencing, and a variety of multimodal immersive simulation experiences. Most of our research takes place within the Shared Reality Environment, a facility that includes two different configurations of multi-projector displays, camera and loudspeaker arrays, and a high-fidelity floor with vibrotactile sensing and actuation.

BIO · Jeremy Cooperstock (Ph.D., University of Toronto, 1996) is a professor in the Department of Electrical and Computer Engineering, a member of the Centre for Intelligent Machines, and a founding member of the Centre for Interdisciplinary Research in Music Media and Technology at McGill University. He directs the Shared Reality Lab, which focuses on computer mediation to facilitate high-fidelity human communication and the synthesis of perceptually engaging, multimodal, immersive environments. He led the development of the Intelligent Classroom, the world's first Internet streaming demonstrations of Dolby Digital 5.1, multiple simultaneous streams of uncompressed high-definition video, a high-fidelity orchestra rehearsal simulator, a simulation environment that renders graphic, audio, and vibrotactile effects in response to footsteps, and a mobile game treatment for amblyopia. Cooperstock's work on the Ultra-Videoconferencing system was recognized with an award for Most Innovative Use of New Technology from ACM/IEEE Supercomputing and a Distinction Award from the Audio Engineering Society. The research he supervised on the Autour project earned the Hochhausen Research Award from the Canadian National Institute for the Blind and an Impact Award from the Canadian Internet Registration Authority, and his Real-Time Emergency Response project won the Gold Prize (brainstorm round) of the Mozilla Ignite Challenge. Cooperstock has worked with IBM at the Haifa Research Center in Israel and the T.J. Watson Research Center in Yorktown Heights, New York, and with the Sony Computer Science Laboratory in Tokyo, Japan, and was a visiting professor at Bang & Olufsen in Denmark, where he conducted research on telepresence technologies as part of the World Opera Project. He led the Enabling Technologies theme of GRAND, the Network of Centres of Excellence on Graphics, Animation, and New Media, and is currently an associate editor of IEEE Transactions on Haptics, Frontiers in Virtual Reality (specialty section on Haptics), IEEE World Haptics Conference, IEEE Haptics Symposium, and, from 2008 to 2022, the Journal of the Audio Engineering Society, as well as editor of a special issue of Multimodal Technologies and Interaction on Multimodal Medical Alarms. (FULL CV AVAILABLE)

research
IMAGE: Interactive Multimodal Access for Graphics Exploration uses rich audio (sonification) together with the sense of touch (haptics) to provide a faster and more nuanced experience of graphics on the web for people who are blind, low-vision, or deaf-blind.
Autour is an eyes-free mobile system designed to give blind users a better sense of their surroundings. We are presently adding new functionality to support intersection crossing, indoor exploration, and dialogue, as well as advancing towards an Android release of the platform.
Simulation and Synthesis of Multi-Modal Hallucinations involves designing a VR environment to address a number of open problems related to avatar therapy for schizophrenia. To date, we have developed several novel ML-based tools to help patients design avatars that accurately reflect the graphical appearance and voice of their hallucinations.
Mimic is a mobile remote implicit communication system that uses vibrotactile patterns to convey activity-related information between two people on an ongoing basis.
Multimodal Medical Alarms: To reduce the problem of auditory sensory overload in the clinical environment, we are exploring the use of a multimodal alarm system in operating rooms and intensive care units.
MORE PROJECTS
"I do remember students who took your classes. They were clearly divided as those who complained saying that you were a hard grader and expected them to do work, and those who were appreciative for the hard work and what they learned. All those who were willing to do work thought you were a great educator."
fall 2023
ECSE424/542 · Human-Computer Interaction
Monday & Wednesday, 8:35 – 9:55 ·  ENGMC 11
ECSE526 · Artificial Intelligence
Monday & Wednesday, 11:35 – 12:55 ·  FERR 456
winter 2022
ECSE421 · Embedded Systems
Tuesday & Thursday, 8:35 – 9:55 ·  ENGTR 2100
ECSE618 · Haptics
Monday, 13:05 – 15:55 ·  Online
EARLIER COURSES
videos
UltraVideo
EcoTile
Natural Interactive Walking
Real-Time Emergency Response
mosaicing
interpolation
projection