Oct 16-19, 2016
Congratulations to Jeff and Pascal for their entry, which not only presented a compelling demo of electro-muscular stimulation, but did so in the context of an intriguing research question. More to come soon!
Sep 21-23, 2016
Our lab presented a paper and a demo, and participated in a panel session on VR and AR, at the IEEE Workshop on Multimedia Signal Processing.
Sep 1, 2016
New students
We are delighted to welcome our new students, Taeyong Kim, Parisa Alirezaee, and Roger Gigris to the lab, and to have Pascal Fortin fast-tracking to the Ph.D. program.
Aug 26, 2016
With support from the National Academies of Sciences, Engineering, and Medicine, our lab joins collaborators from UT-Austin, Drexel, and Stanford in building technology that will allow the user to feel what others feel.
Jul 29, 2016
Our app supporting environmental awareness for the blind and visually impaired has launched across Canada, and was featured on CTV and TVA news.
About us
The goal of Shared Reality is to achieve high-fidelity distributed interaction, with both real and virtual data, at levels of presence that support the most demanding applications, and to do so in spite of sensor and bandwidth limitations. Our lab works with audio, video, and haptic technologies, building systems that leverage their capabilities to facilitate and enrich both human-computer and computer-mediated human-human interaction.
For questions about the lab, please contact Prof. Jeremy Cooperstock.
Mozilla Ignite, Google Research, HP Labs
Credit for the CSS of this website goes to the GRAND NCE. Credit for the SRL logo goes to M.Eng. student Naoto Hieda.