
Fostering Metaverse Innovation at Canadian Labs

By Daniel Wigdor, Director of Research Science at Reality Labs Research

Today, we’re announcing that 17 Canadian research labs are each receiving a $30,000 (CAD) unrestricted grant from Reality Labs Research, for a total of $510,000. The grants are awarded to labs whose research will advance the innovations needed to build for the metaverse. Unrestricted grants give labs the autonomy and flexibility to pursue their missions, and once published, their research becomes publicly accessible to drive further innovation across the industry.

The metaverse is the next evolution in social technologies and the successor to the mobile internet. It will be made up of interconnected digital spaces, including immersive 3D experiences, that you can easily move between, and it has the potential to unlock access to new creative, social, and economic opportunities. In the future, you’ll be able to access the metaverse from a variety of devices, including VR headsets and AR glasses.

Getting to these next-generation platforms will take multiple major technological breakthroughs, along with collaboration with researchers, industry partners, creators, and others. Canadians are already building new technologies for the metaverse and shaping how we come together in these digital spaces through industry-leading human-computer interaction (HCI) and artificial intelligence (AI) research. Supporting this research with unrestricted grants will help the entire industry develop an interconnected and interoperable metaverse, and move us all closer to building the future of the internet together.

The following Canadian researchers are receiving $30,000 unrestricted grants from Reality Labs Research:

  • Fanny Chevalier, DGP Lab, University of Toronto
  • Parmit Chilana, Interactive Experiences Lab, Simon Fraser University
  • Christopher Collins, Visualization for Information Analysis Lab, Ontario Tech University
  • Jeremy Cooperstock, Shared Reality Lab, McGill University
  • Audrey Durand, Christian Gagné, Denis Laurendeau, and Jean-François Lalonde, IID – Institute Intelligence and Data, Université Laval
  • Paul H. Dietz, University of Toronto
  • Steve Engels, University of Toronto
  • Carl Gutwin, Human-Computer Interaction Lab, University of Saskatchewan
  • Yumiko Sakamoto & Pourang Irani, University of British Columbia
  • Regan Mandryk, Interaction Lab, University of Saskatchewan
  • Joanna McGrenere, eDapt and Designing For People, University of British Columbia
  • Alexis Morris, Adaptive Context Environments (ACE) Lab, OCAD University
  • Rita Orji, Persuasive Computing Lab, Dalhousie University
  • Tony Tang, RICELab, University of Toronto
  • Khai Truong, Ubicomp, University of Toronto
  • Daniel Vogel, Human-Computer Interaction Lab, University of Waterloo
  • Jian Zhao, WaterlooHCI lab, University of Waterloo

Here’s what some recipients are saying about the grants and the work they’ll support:

“At the Persuasive Computing Lab, we research and design interactive technologies that empower people and improve lives. The unrestricted funding from Reality Labs Research will directly support our work on designing user-adaptive and personalized interactive systems that integrate into people’s daily lives and support them in achieving various self-improvement goals, with a focus on improving health and wellness objectives.”

– Dr. Rita Orji, Persuasive Computing Lab, Dalhousie University


“There is a core group of researchers at Université Laval who are very active at the intersection of artificial intelligence, computer vision, and virtual reality. The funds obtained through this donation from Meta will allow us to acquire new equipment and infrastructure that will benefit our research activities and strengthen the work of this core group.”

– Christian Gagné, IID – Institute Intelligence and Data, Université Laval


“I am thrilled that Meta’s Reality Labs Research has generously supported our Adaptive Context Environments Lab research. This is a validation of our core directions in mixed-reality based internet-of-things (IoT) systems, as we move toward building new smart-space applications and hybrid environments filled with hyper-connected virtual-physical objects, expressive and immersive IoT-avatars, and context-aware artificial intelligence assistants. This support gives us resources to further our mission to advance designs that transform, enhance, and uplift the human experience of living in the everyday metaverse ecosystem that is about to unfold before our very eyes.”

– Dr. Alexis Morris, Adaptive Context Environments (ACE) Lab, OCAD University


“There is immense potential in augmenting the real world or creating an entirely virtual world to allow people to see things that they couldn’t see otherwise, or do things that they couldn’t do otherwise, like getting immediate visual feedback and guidance on how to operate a machine or perform a yoga sequence. What information is relevant to display, and how it should be displayed, is still unknown, though; that is what our research studies. The generation of new knowledge, methods, and tools will inform how best to leverage immersive technologies for personalized and timely guidance when people perform complex physical movements, anytime and anywhere it is desired and useful.”

– Fanny Chevalier, DGP Lab, University of Toronto

