While the physical world around us is amazing, it's the combination of our physical and digital worlds that defines our reality.
Yet too often, we rely on screens to tap into virtual spaces and content — and that can pull us away from the moment and the people we're physically with.
Imagine replacing your TV with a fully customisable home theatre you can take with you anywhere. Or imagine sitting around a table with friends — some of whom are in the same physical room while others are avatars of those who live miles away — and yet still feeling fully present in the same space and experience.
Or, imagine a meeting where some of the avatars are embodied AIs that can help you accomplish a variety of tasks.
All of that could be possible when our physical and virtual worlds come together seamlessly.
Many of the foundational technologies needed to make that vision a reality are either already here or on the way. Mixed reality (MR) lets us bring digital objects into the physical world or create fully immersive experiences to explore.
Advances in AI let us create unique characters that we can interact with in different ways. Augmented reality (AR) glasses will eventually bring these technologies together in a stylish form factor.
Mark Zuckerberg shared Meta's progress on all three fronts.
Mixed reality goes mainstream with Meta Quest 3
Meta Quest 3 is the world's first mainstream headset built for mixed reality, and Meta's most powerful headset yet. With double the graphics processing power of Quest 2, Quest 3 is also the world's first device to feature the new Snapdragon XR2 Gen 2 platform, which Meta developed in collaboration with Qualcomm Technologies.
It is completely standalone: no PC, no console, no battery packs, and nothing to break the feeling of presence. It understands your physical space, so you can play with the world around you. With mixed reality, the limits of your physical space can expand and you can be part of a much larger world, like opening a portal to the 'Upside Down' from your living room in Stranger Things VR.
With Quest 3, you can immerse yourself in an expansive content library with games and experiences to suit every taste and mood. More than 100 new and upgraded titles are coming to Quest 3 in 2023, and many of them will incorporate mixed reality.
Because Quest 3 is backwards-compatible, you get access to a vast library of more than 500 VR and MR experiences on day one. And with Xbox Cloud Gaming coming to Quest in December, you'll be able to play:
- Halo Infinite
- Minecraft Legends
- Forza Horizon 5, and
- hundreds of other high-quality Xbox games.
All of these can be played on a massive 2D screen you can take with you anywhere.
Starting at US$499.99 (about R9 600), Quest 3 ships Tuesday, 10 October, and pre-orders are open now. For all the details, click here.
While it's been an amazing year for AI, most people today still haven't experienced this new technology firsthand — and Meta has an opportunity to change that by building state-of-the-art AI into apps that billions of people already use.
Meta has already talked about the Llama ecosystem of large language models, and today it unveiled its image generation model.
Emu (short for Expressive Media Universe) uses your text prompts to generate high-quality, photorealistic images in just seconds. And thanks to Emu and technology from Llama 2, you can create your own custom AI stickers in chat to liven up conversations on the fly.
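Emu itself hasn't been released publicly, so no example can show Meta's actual model. Purely to illustrate the prompt-to-image flow described above, here is a minimal sketch using the open-source Hugging Face diffusers library with a stand-in model; the model id, prompt and output filename are assumptions for illustration, not Emu.

```python
import torch
from diffusers import StableDiffusionPipeline

# Stand-in open-source model: Emu is not publicly released, so this only
# illustrates the text-prompt-to-image flow, not Meta's actual model.
pipe = StableDiffusionPipeline.from_pretrained(
    "stabilityai/stable-diffusion-2-1", torch_dtype=torch.float16
).to("cuda")

# One text prompt in, one image out -- the same basic flow as a chat sticker.
image = pipe("a cheerful corgi wearing sunglasses, sticker style").images[0]
image.save("sticker.png")
```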
Meta has also introduced restyle and backdrop — two new features coming soon to Instagram — that use the technology from Emu to let you transform your photos or even co-create AI-generated images with friends.
Restyle lets you reimagine your images by applying the visual styles you describe (you might type out 'watercolour' or 'collage from magazines and newspapers, torn edges', for example), while backdrop leverages learnings from Meta's Segment Anything Model so you can change your image's scene or background.
Prompts like 'put me in front of a sublime Aurora Borealis' or 'surrounded by puppies' will keep your subject in the foreground while creating the background you've described. Images created with restyle and backdrop will indicate the use of AI to help reduce the odds of people mistaking them for human-generated content.
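The Segment Anything Model that backdrop builds on is openly available, so here is a minimal sketch of the general idea: separate the subject from its background with SAM, then composite it onto a new backdrop. The checkpoint file, click point and image filenames are assumptions for illustration; this is not Meta's production pipeline.

```python
import cv2
import numpy as np
from segment_anything import SamPredictor, sam_model_registry

# Checkpoint file, input photo and replacement backdrop are placeholders.
sam = sam_model_registry["vit_h"](checkpoint="sam_vit_h_4b8939.pth")
predictor = SamPredictor(sam)

image = cv2.cvtColor(cv2.imread("portrait.jpg"), cv2.COLOR_BGR2RGB)
predictor.set_image(image)

# A single click roughly on the subject is enough for SAM to propose masks.
masks, scores, _ = predictor.predict(
    point_coords=np.array([[image.shape[1] // 2, image.shape[0] // 2]]),
    point_labels=np.array([1]),
    multimask_output=True,
)
subject_mask = masks[np.argmax(scores)]

# Keep the masked subject in the foreground and swap in the new scene.
backdrop = cv2.cvtColor(cv2.imread("aurora.jpg"), cv2.COLOR_BGR2RGB)
backdrop = cv2.resize(backdrop, (image.shape[1], image.shape[0]))
result = np.where(subject_mask[..., None], image, backdrop)
cv2.imwrite("backdrop_result.jpg", cv2.cvtColor(result, cv2.COLOR_RGB2BGR))
```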
Meta is also experimenting with forms of visible and invisible markers to help people distinguish AI-generated content.
Unlike most others in the industry, Meta doesn't believe there will be one single super-intelligent AI that everyone uses. Rather, Meta thinks you'll want different AIs for different things, like:
- finding information
- communicating
- being entertained
- playing games, and
- helping you get work done.
You might even want to create your own AI that's aligned with your goals, whether you're a small business, a creator or anyone else.
That's why Meta is building AI Studio, a new platform for creating AIs that can help you get things done or simply have fun. People will be able to interact with these AIs across the whole Meta universe of products. They'll have profiles on Instagram and Facebook, and you'll be able to chat with them on:
- WhatsApp
- Messenger, and
- Instagram.
Eventually, they'll be embodied as avatars in the metaverse too.
Meta has created some AIs of its own using AI Studio that it will start rolling out in the United States (US) in beta.
Meta AI is a new assistant you can interact with like a person. It uses a custom model based on Llama 2 technology and has access to real-time information through a partnership with Bing Search. And Emu is built into Meta AI, so you can generate high-quality, photorealistic images for free in seconds.
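Meta AI itself isn't something you can download, but the Llama 2 models it builds on are openly available. As a rough illustration only, here is a minimal sketch of prompting an open Llama 2 chat model via Hugging Face transformers, assuming you've been granted access to the meta-llama/Llama-2-7b-chat-hf weights; it shows the underlying model family, not Meta AI's custom model or its Bing integration.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Gated model id -- assumes you have requested and received access on Hugging Face.
model_id = "meta-llama/Llama-2-7b-chat-hf"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

# Llama 2 chat models expect the [INST] ... [/INST] instruction format.
prompt = "[INST] Suggest a quick dinner I can make with eggs, spinach and rice. [/INST]"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=200)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```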
Meta has also been creating AIs that have more personality, opinions and interests and are a bit more fun to interact with. This includes:
- Victor, a motivational coach who encourages you to hit your goals
- the Dungeon Master, who can take you on an old-school text-based adventure, and
- Max, a sous chef who can take the random assortment of ingredients in your pantry and come up with a delicious recipe on the fly.
Those are just a few of the AIs Meta has trained so far, and there are several more coming over the next few weeks across a range of interests, from gaming and philosophy to sports, fashion and beyond.
Meta wants to give people the chance to build AIs of their own, so AI Studio will ultimately let developers build third-party AIs for Meta's messaging services. Meta is building a sandbox that will let people who don't code create their own AIs.
Meta is working on a way for creators to build AIs that represent them and help them engage and grow their communities. It's also making it possible for businesses to create AIs that interact with customers and help with commerce and support.
Generative AI will bring with it new challenges, so Meta is putting in the
time and effort to make sure it gets this right. That includes:
- training and fine-tuning models to fit Meta's safety and responsibility guidelines
- red-teaming with external experts and internal teams to help ensure its models are safer and more inclusive
- programming in guardrails around inappropriate conversations, and
- sharing system cards publicly so people can better understand how these models work.
This next generation of AI will enable a wide range of experiences and interactions, which will transform how people, businesses and creators use all of Meta's products. Meta will continue innovating to ensure everyone has a chance to participate in the upside. To learn more, click here.
Introducing the Ray-Ban | Meta smart glasses collection
Built in partnership with EssilorLuxottica, the next generation of smart glasses is designed so you can stay connected and capture the moment without having to stop and take out your phone, and easily share your experiences with friends, family or the world.
Meta has upgraded them from the first generation in basically every way, and for the first time, you'll be able to live stream directly from your smart glasses to your friends and followers on Facebook and Instagram.
These are also the first smart glasses to ship with Meta AI built in. Starting in the US in beta, you'll get Meta's state-of-the-art AI hands-free, wherever you are and whatever you're doing, in real time.
In 2024, Meta will roll out a free update so your smart glasses will be able to understand what you're looking at and help you out. If you want to know what building you're standing in front of or get a translation of a sign on the fly, your Ray-Ban Meta smart glasses will have the answer.
Smart glasses will be an important platform in the future not only because they're a natural way to see digital holograms in the physical world but also because soon you'll be able to let your AI see what you see and hear what you hear — which will make your smart glasses more useful over time.
Starting at US$299 (about R5 700), the Ray-Ban Meta smart glasses collection will launch on Tuesday, 17 October, and it's available for pre-order today on meta.com and ray-ban.com. For more information, click here.
While Meta is hard at work delivering new technologies the world has never seen before, it is equally focussed on making those advances available for everyone. And innovation to bring the future to millions and eventually billions of people — affordably — is what Meta does best.
There's a long road ahead, but Meta looks forward to travelling it with the developer community and all of you.
For more information, visit www.meta.com. You can also follow Meta on Facebook, X or Instagram.