
Meta and BMW: Taking AR and VR Experiences on the Road

Takeaways

  • We’re working with BMW Group to explore how AR and VR technology can work inside a fast-moving vehicle. 
  • In the future, we think technology like this can lead to more productive and fun passenger experiences on the road.

There’s nothing quite like the call of the open road. Whether it’s quality conversation on an epic road trip or enjoying a podcast or a favorite album, just about everyone has fond memories made in cars. But what if we could make our time spent in cars more productive, social and entertaining than ever before?

That question is at the heart of Meta’s partnership with BMW. Announced in 2021, this joint research project explores how augmented and virtual reality (AR and VR) could one day be integrated into smart vehicles to enhance the passenger experience.

If we get it right, this technology could revolutionize travel in cars, trains, planes and beyond, unlocking new forms of hands-free communication, entertainment and utility — giving us far more value than the screens and instruments we’re used to seeing in vehicles today.

A GIF showing a research demo of augmented reality features inside a BMW.

Oculus Insight vs. the Open Road

VR headsets are equipped with a number of sensors, but moving vehicles pose a tricky challenge: tracking technology like Oculus Insight uses both inertial measurement units (IMUs) and cameras to precisely estimate the headset’s location and motion.

In a moving environment (more precisely, in a non-inertial reference frame), these two modalities conflict: the cameras observe motion relative to the inside of the car, while the IMUs measure acceleration and rotational velocity relative to the world. That mismatch means a VR headset like Meta Quest 2 can’t currently display stable virtual content while the vehicle it’s traveling in turns or accelerates.
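To make that mismatch concrete, here’s a minimal numerical sketch of our own (not part of the tracking system itself), ignoring gravity and rotation and using made-up numbers:

```python
import numpy as np

# Hypothetical scenario: the car accelerates forward at 3 m/s^2 while the
# passenger holds the headset perfectly still relative to the cabin.
car_accel_world = np.array([3.0, 0.0, 0.0])      # car's acceleration in the world frame
head_accel_in_cabin = np.array([0.0, 0.0, 0.0])  # headset is stationary in the cabin

# The headset's IMU senses acceleration relative to the world, so it also
# picks up the car's acceleration:
imu_reading = head_accel_in_cabin + car_accel_world

# The headset's cameras track features on the cabin interior and therefore
# report essentially no motion:
camera_estimate = head_accel_in_cabin

print(imu_reading)       # [3. 0. 0.]  -> "the headset is accelerating"
print(camera_estimate)   # [0. 0. 0.]  -> "the headset is not moving"
# A tracker fusing both signals naively receives contradictory evidence,
# which is why virtual content drifts or swims in a turning or accelerating car.
```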

At least, that is, until now.

Challenge Accepted

To solve this problem, we collaborated with BMW to incorporate IMU data from a BMW car’s sensor array in real time into the tracking system of our Project Aria research glasses. This additional information allows the system to calculate the glasses’ location relative to the car.
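As a rough sketch of the idea (the actual Project Aria pipeline isn’t public, and every name below is illustrative), subtracting the car’s measured acceleration from the headset’s lets the tracker reason in the cabin frame, where the cameras already operate:

```python
import numpy as np

def cabin_relative_accel(headset_accel_world, car_accel_world, cabin_R_world):
    """Remove the vehicle's own motion from the headset's IMU signal.

    Illustrative only: once the car streams its IMU data in real time, the
    tracker can subtract the car's acceleration and express what remains in
    the cabin frame, so the inertial and visual signals agree again.
    """
    relative_accel_world = headset_accel_world - car_accel_world
    return cabin_R_world @ relative_accel_world

# Example: the car brakes at 2 m/s^2 while the passenger's head stays still.
headset_accel = np.array([-2.0, 0.0, 0.0])
car_accel = np.array([-2.0, 0.0, 0.0])
cabin_R_world = np.eye(3)  # assume cabin axes aligned with the world for simplicity

print(cabin_relative_accel(headset_accel, car_accel, cabin_R_world))
# [0. 0. 0.] -> consistent with the cameras, which also see no cabin-relative motion
```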

That was a huge feat: after transferring the tracking system to Meta Quest Pro, we were able to accurately anchor virtual objects to a moving car using a digital twin of the vehicle. We’ve been able to demo some compelling virtual and mixed reality passenger experiences in moving cars using this new tracking system and Meta Quest Pro. The next step will be to add the car’s location relative to the world, which would enable world-locked rendering.
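Conceptually, once the tracker reports the headset’s pose in the car frame, anchoring a virtual object from the digital twin comes down to composing rigid-body transforms. The sketch below is a hypothetical illustration with made-up names and numbers, not code from the actual system:

```python
import numpy as np

def invert(T):
    """Invert a 4x4 rigid-body transform."""
    R, t = T[:3, :3], T[:3, 3]
    T_inv = np.eye(4)
    T_inv[:3, :3] = R.T
    T_inv[:3, 3] = -R.T @ t
    return T_inv

# car_T_headset: the headset's pose in the car frame, from the car-aware tracker.
# car_T_object:  where a virtual object sits in the car's digital twin,
#                e.g. a panel floating above the dashboard. Numbers are made up.
car_T_headset = np.eye(4)
car_T_headset[:3, 3] = [0.4, 0.2, 1.1]

car_T_object = np.eye(4)
car_T_object[:3, 3] = [1.0, 0.0, 1.0]

# The renderer needs the object's pose relative to the headset; because both
# poses are expressed in the car frame, the content stays glued to the cabin
# no matter how the vehicle moves through the world:
headset_T_object = invert(car_T_headset) @ car_T_object
print(headset_T_object[:3, 3])
```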

Going forward, we hope to continue working with BMW to further leverage the growing machine perception capabilities of modern cars and enable future use cases. Access to the car’s precise six-degree-of-freedom (6DOF) positioning system could allow us to render world-locked virtual content outside the vehicle, such as labels identifying landmarks and other points of interest. We expect this capability to be invaluable for future AR glasses and personalized AI assistants.
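World-locked rendering would extend that same transform chain by one link, the car’s pose in the world. Again, this is a hypothetical sketch with invented names and coordinates:

```python
import numpy as np

def invert(T):
    """Invert a 4x4 rigid-body transform."""
    R, t = T[:3, :3], T[:3, 3]
    T_inv = np.eye(4)
    T_inv[:3, :3] = R.T
    T_inv[:3, 3] = -R.T @ t
    return T_inv

def pose(x, y, z):
    """Translation-only 4x4 pose (real poses would also carry rotation)."""
    T = np.eye(4)
    T[:3, 3] = [x, y, z]
    return T

# Hypothetical inputs: the headset's pose in the car frame (from the tracker),
# the car's 6DOF pose in a map frame (from the vehicle's positioning system),
# and the known map position of a landmark outside the car.
car_T_headset = pose(0.4, 0.2, 1.1)
world_T_car = pose(5200.0, 310.0, 0.0)
world_T_landmark = pose(5260.0, 340.0, 12.0)

# Chaining the transforms places a label for the landmark in the passenger's
# view, locked to the world rather than to the cabin:
headset_T_landmark = invert(car_T_headset) @ invert(world_T_car) @ world_T_landmark
print(headset_T_landmark[:3, 3])   # where to render the label, relative to the headset
```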

It’s an exciting vision, and it’s a road we’re committed to traveling.


