Accelerating the Future: AI, Mixed Reality and the Metaverse

When we first began giving demos of Orion early this year, I was reminded of a line that you hear a lot at Meta – in fact it was even in our first letter to prospective shareholders back in 2012: Code wins arguments. We probably learned as much about this product space from a few months of real-life demos as we did from the years of work it took to make them. There is just no substitute for actually building something, putting it in people’s hands and learning from how they react to it.

Orion wasn’t our only example of that this year. Our mixed reality hardware and AI glasses have both reached a new level of quality and accessibility. The stability of those platforms allows our software developers to move much faster on everything from operating systems to new AI features. This is how I see the metaverse starting to come into greater focus and why I’m so confident that the coming year will be the most important one in the history of Reality Labs. 

2024 was the year AI glasses hit their stride. When we first started making smart glasses in 2021 we thought they could be a nice first step toward the AR glasses we eventually wanted to build. While mixed reality headsets are on track to becoming a general purpose computing platform much like today’s PCs, we saw glasses as the natural evolution of today’s mobile computing platforms. So we wanted to begin learning from the real world as soon as possible. 

The biggest thing we’ve learned is that glasses are by far the best form factor for a truly AI-native device. In fact they might be the first hardware category to be completely defined by AI from the beginning. For many people, glasses are the place where an AI assistant makes the most sense, especially when it’s a multimodal system that can truly understand the world around you. 

We’re right at the beginning of the S-curve for this entire product category, and there are endless opportunities ahead. One of the things I’m most excited about for 2025 is the evolution of AI assistants into tools that don’t just respond to a prompt when you ask for help but can become a proactive helper as you go about your day. At Connect we showed how Live AI on glasses can become more of a real-time participant when you’re getting things done. As this feature begins rolling out in early access this month we’ll see the first step toward a new kind of personalized AI assistant. 

It won’t be the last. We’re currently in the middle of an industry-wide push to make AI-native hardware. You’re seeing phone and PC makers scramble to rebuild their products to put AI assistants at their core. But I think a much bigger opportunity is to make devices that are AI-native from the start, and I’m confident that glasses are going to be the first to get there. Meta’s Chief Research Scientist Michael Abrash has been talking about the potential of a personalized, context-aware AI assistant on glasses for many years, and building the technologies that make it possible has been a huge focus of our research teams.

Mixed reality has been another place where having the right product in the hands of a lot of people has been a major accelerant for progress. We’ve seen Meta Quest 3 get better month after month as we continue to iterate on the core system: passthrough, multitasking, spatial user interfaces and more. All these gains extended to Quest 3S the moment it launched. This meant the $299 Quest 3S was in many respects a better headset on day 1 than the $499 Quest 3 was when it first launched in 2023. And the entire Quest 3 family keeps getting better with every update.

In 2025 we’ll see the next iteration of this as Quest 3S brings a lot more people into mixed reality for the first time. While Quest 3 has been a hit among people excited to own the very best device on the market, we’re seeing that Quest 3S is our most giftable headset yet. Sales were strong over the Black Friday weekend, and we’re expecting a surge of people activating their new headsets over the holiday break. 

This will continue a trend that took shape over the last year: growth in new users who are seeking out a wider range of things to do with their headsets. We want these new users to see the magic of MR and stick around for the long run, which is why we’re funding developers to build the new types of apps and games they’re looking for. 

There’s been a significant influx of younger people in particular, and they have been gravitating toward social and competitive multiplayer games, as well as freemium content. And titles like Skydance’s BEHEMOTH, Batman: Arkham Shadow (which just won Best AR/VR game at The Game Awards!), and Metro Awakening showed once again that some of the best new games our industry produces are now only possible on today’s headsets.  

As the number of people in MR grows, the quality of the social experience it can deliver is growing in tandem. This is at the heart of what Meta is trying to achieve with Reality Labs and where the greatest potential of the metaverse will be unlocked: “the chance to create the most social platform ever” is how we described it when we first began working on it. We took two steps forward on this front in 2024: first with a broad set of improvements to Horizon Worlds including its expansion to mobile, and second with the next-generation Meta Avatars system that lets people represent themselves across our apps and headsets. And as the visual quality and overall experience with these systems improve, more people are getting their first glimpse of a social metaverse. We’re seeing similar trends with new Quest 3S users spending more time in Horizon Worlds, making it a Top 3 immersive app for Quest 3S, and people continue creating new Meta Avatars across mobile and MR. 

Getting mixed reality into the mainstream has helped illuminate what will come next. One of the first trends we discovered after the launch of Quest 3 last year was people using mixed reality to watch videos while multitasking in their home – doing the dishes or vacuuming their living rooms. This was an early signal that people love having a big virtual screen they can take anywhere and place in the physical world around them. We’ve seen that trend take off with all sorts of entertainment experiences growing fast across the whole Quest 3 family. New features like YouTube Co-Watch show how much potential there is for a whole new kind of social entertainment experience in the metaverse.

That’s why James Cameron, one of the most technologically innovative storytellers of our lifetimes, is now working to help more filmmakers and creators produce great 3D content for Meta Quest. While 3D films have been produced for decades, there’s never been a way to view them that’s quite as good as an MR headset, and next year more people will own a headset than ever before. “We’re at a true, historic inflection point,” Jim said when we launched our new partnership this month.

This is happening alongside a larger shift Mark Rabkin shared at Connect this year: Our vision for Horizon OS is to build a new kind of general purpose computing platform capable of running every kind of software, supporting every kind of user, and open to every kind of creator and developer. Our recent releases for 2D/3D multitasking, panel positioning, better hand tracking, Windows Remote Desktop integration and Open Store have started to build momentum along this new path. Horizon OS is on track to be the first platform that supports the full spectrum of experiences from immersive VR to 2D screens, mobile apps and virtual desktops – and the developer community is central to that success.

The next big step toward the metaverse will be combining AI glasses with the kind of true augmented reality experience we revealed this year with Orion. It’s not often that you get a glimpse of the future and see a totally new technology that shows you where things are heading. The people who saw Orion immediately understood what it meant for the future, much like the people who saw the first personal computers taking shape at Xerox PARC in the 1970s (“within ten minutes it was obvious to me that all computers would work like this someday,” Steve Jobs later said of his 1979 demo of the Xerox Alto).

Being able to put people in a time machine and show them how the next computing platform will look was a highlight of 2024 – and of my career so far. But the real impact of Orion will be in the products we ship next and the ways it helps us better understand what people love about AR glasses and what needs to get better. We spent years working on user research, product planning exercises, and experimental studies trying to understand how AR glasses should work, and that work is what enabled us to build Orion. But the pace of progress will be much more rapid from here on out now that we have a real product to build our intuition around. 

This has been the lesson time and again over the last decade at Reality Labs. The most important thing you can do when you’re trying to invent the future is to ship things and learn from how real people use them. They won’t always be immediate smash hits, but they’ll always teach you something. And when you land on things that really hit the mark, like mixed reality on Quest 3 or AI on glasses, that’s when you put your foot on the gas. This is what will make 2025 such a special year: With the right devices on the market, people experiencing them for the first time and developers discovering all the opportunities ahead, it’s time to accelerate.