We are pleased to announce the following positions for postdoctoral researchers in the Shared Reality Lab at McGill University. Instructions to apply can be found below.
These positions will explore the design space for wearable haptics as a platform for human-computer interaction, supporting improved situational awareness in both everyday conditions and critical environments. Our objectives are to facilitate novel forms of information delivery for communication, training, and education, and to enhance user experience in entertainment and mixed reality simulations. Our team works on architectures to manage sensing and actuation, the design of haptic feedback strategies tailored to the demands of specific users, and the evaluation of perceptual responses to such stimuli using biosensing techniques. Example applications include haptic-instrumented shoes to improve dance training or rehabilitation therapy, haptic feedback strategies to convey patients' vital signs to clinicians, immersive multimodal interaction with simulated ground surfaces, and mobile implicit communication between individuals.
We are seeking candidates who will participate in one or more areas of this research program, including the design and testing of wearable and haptic-enabled systems, investigation of the information delivery capacity of various architectures, and studies of human perception of the stimuli. Preference will be given to applicants with demonstrated experience in developing innovative haptic systems and/or interaction solutions addressing real-world user needs, enthusiasm for mentoring students, and a solid track record of publications in top haptics or HCI venues. Successful candidates will contribute to various aspects of the project, help supervise and mentor graduate students, play a major role in the publication of results of this research, and will also have the option to develop and pursue additional projects.
We are seeking strong candidates experienced in mobile software development and interested in assistive technologies to join our team. This project, funded by an NSERC grant and conducted in collaboration with Immervision, aims to leverage the benefits of head-worn, panoramic imaging systems to provide navigation assistance for the visually impaired community, 1) safely guiding users during intersection crossing to avoid veering, which can be dangerous and stressful, and 2) helping them navigate the last few meters to doorways they wish to enter. Our proposed approach combines a machine learning strategy leveraging existing image datasets, possibly augmented by crowdsourcing, with iterative design of the feedback mechanisms. This is informed by the team's experience with sensor-based intersection-crossing assistance systems, and in developing the Autour app, which provides a real-time description of street intersections, public transport data, and points of interest in the user's vicinity.
The successful candidate will lead the research and development activities, mentor students, supervise user tests with members of the visually impaired community, and contribute to publications related to the project. Preference will be given to applicants with demonstrated experience in machine learning and crowdsourcing, and a strong publication record.
For this position, we are seeking strong candidates experienced in XR and multimodal interaction design, speech processing, or biosignals analysis. The project, funded by NSERC and MEDTEQ, an accelerator for the Canadian medical technologies industry, and in collaboration with two Montreal-based medical companies, aims to design, implement, and test a mixed-reality avatar therapy platform to reduce the distress and helplessness associated with auditory hallucinations. This will allow therapists to explore the requirements for optimal delivery of such treatment by adjusting various parameters of the avatars. Ultimately, our hope is to apply the knowledge gained from this exploration to an augmented reality version of the platform suitable for use outside of the therapist's office, offering the possibility of providing therapeutic benefits to patients in their day-to-day activities.
The successful candidate will have primary responsibility for the overall system architecture design, including support for rendering multimodal stimuli, and will mentor several graduate students also working on the project. Preference will be given to applicants with a proven track record, demonstrating relevant skills in these domains through prior doctoral research, and with a strong publication record.
To apply for any of the above positions, please email the following items to Jeremy Cooperstock:
The positions are available immediately, with a reasonably flexible start date, and an initial appointment of up to one year, with an option to renew for an additional year, subject to mutual satisfaction and availability of funding. Informal inquiries are welcome.
About us: The Shared Reality Lab conducts research in audio, video, and haptic technologies, building systems that leverage their capabilities to facilitate and enrich both human-computer and computer-mediated human-human interaction. The lab is part of the Centre for Intelligent Machines and Department of Electrical and Computer Engineering of McGill University. McGill, one of Canada's most prestigious universities, is located in Montreal, a top city to live in, especially for students.
McGill University is committed to equity in employment and diversity. It welcomes applications from women, Aboriginal persons, persons with disabilities, ethnic minorities, persons of minority sexual orientation or gender identity, visible minorities, and others who may contribute to further diversification. In Quebec, "Postdoctoral Fellow" is a regulated category of trainee. Notably, a postdoctoral candidate must be within five years of graduating with a Ph.D. For more information, please consult www.mcgill.ca/gps/postdocs/fellows.