It’s Meta Connect day, where Meta shared numerous updates on its cutting-edge technological advancements, such as the new Quest VR headsets, AR glasses, an upgraded Llama AI model, next-gen AI features in its apps, and more.
There’s a lot to unpack and consider. Here’s a summary of the major announcements from Meta’s latest event.
**Orion AR Glasses**
The highlight was Meta’s debut AR glasses, named “Orion”, which have been in development for the past five years.
Meta claims that these are the "most advanced AR glasses ever created," the product of intricate work to miniaturize the necessary components into a size small enough to be worn like regular glasses.
As shown in the image above, the Orion glasses currently include a wrist controller for "subtle, socially acceptable input" and a wireless battery pack.
Meta also states that Orion offers "the largest field of view in the smallest AR glasses form to date," allowing users to enjoy a variety of immersive experiences, such as "multitasking windows, big-screen entertainment, and life-size holograms of people."
The glasses are bulkier than standard sunglasses and significantly more so than Meta's own Ray-Ban smart glasses. However, Meta had to fit a great deal of technology into a compact frame.
Wearing the glasses might make you look like a quirky fashion designer, though you probably won’t be sporting these just yet.
Meta isn’t releasing its AR glasses to the public at this time, but it is distributing them to selected developers and internal Meta staff for testing.
These prototypes are more about showcasing how far the technology has come, and they may not represent the exact form Meta will release in the coming years.
The consumer version could end up looking sleeker and more fashionable, thanks to Meta’s collaboration with EssilorLuxottica, the maker of Ray-Ban.
Ultimately, the key message of Meta’s presentation was that neither Apple, with its Vision Pro, nor Snap, with its latest AR glasses, is outpacing Meta in this space.
Meta’s AR glasses are set to be more advanced than Snap’s and likely more practical and affordable than Apple’s version.
We’re not there yet, but Meta’s AR journey is progressing rapidly.
**New Features for Ray-Ban Meta Glasses**
Speaking of Meta’s Ray-Ban smart glasses, they’re also receiving upgrades. These include improved voice commands, allowing you to chat with Meta AI without having to repeatedly say “Hey Meta,” as well as the ability to record and send voice messages while on the move.
Also:
“We’re adding a feature that helps your glasses remember things for you. The next time you fly, you won’t have to worry about forgetting where you parked at the airport—your glasses will save your spot in long-term parking.”
This could be especially useful, and Meta is also introducing a new translation feature that listens to the language being spoken around you and translates it into English through the open-ear speakers.
Now, you’ll be able to figure out if people are talking behind your back in another language, or if you’re just imagining it.
**Quest 3S**
Meta has also revealed its newest Quest VR headset, the Quest 3S, which offers the same features as the Quest 3 but at a more affordable price.
“Starting at just $299.99 USD, Quest 3S is the best headset for newcomers to mixed reality and immersive experiences, or for those who’ve been waiting for an affordable upgrade from the Quest or Quest 2.”
Meta has announced that it has rebuilt its Horizon OS, now offering better support for key 2D apps like YouTube, Facebook, and Instagram. It has also enhanced the spatial audio and Passthrough features.
Essentially, this puts the capabilities of Meta’s top VR headset at a lower price point, which is crucial for boosting adoption.
In fact, Meta is also cutting the price of the 512GB Meta Quest 3 by $150, bringing it down to $499.99 USD, with the hope of attracting more users to its VR hardware and growing its user base.
The more VR adoption increases, the more momentum it builds. While VR is still far from being an essential technology, these advanced systems and fresh experiences are laying the groundwork for Meta’s vision of VR and the metaverse.
**Celebrity Voices for Meta AI**
In an unexpected move, Meta has also introduced celebrity voices to its AI chatbot. While it’s unclear why Meta sees this as a key step in broadening AI adoption, it seems to be part of its strategy.
The main new feature is that you can now use your voice to interact with Meta AI across Messenger, Facebook, WhatsApp, and Instagram DMs, and it will respond to you out loud.
Additionally, you can choose a celebrity voice for your Meta AI chatbot, with options like Awkwafina, Dame Judi Dench, John Cena, Keegan-Michael Key, and Kristen Bell. So when the AI replies, it’ll sound like a famous star. Pretty cool, right?
While this could be an interesting novelty, it’s unclear why Meta views this as a valuable addition. They’ve tried similar ideas before with celebrity-style AI chatbots, which were eventually shut down due to lack of interest. They’re also allowing influencers to create AI versions of themselves to respond to fans, but it’s not clear that this concept resonates. You’re not actually interacting with these celebrities, just AI versions of them. Is that really what people want? So far, it seems not—but Meta is pressing on with the idea.
What’s more, reports suggest Meta paid millions to secure the rights to use these celebrity voices.
I’m skeptical this will become a major trend, but maybe some die-hard fans of Dame Judi Dench will enjoy having a robot version of her answer their Meta AI questions.
**Meta AI Image Context**
Meta is also enhancing its AI capabilities: Meta AI’s chatbot has improved its ability to comprehend visual elements, allowing it to generate answers based on what it sees in an image.
Additionally, the system can now edit existing images, so you can request modifications, such as adding new elements to a picture.
You can also request it to remove or alter specific elements in an image, and it will be more capable of handling those requests.
“And if you want to share a photo from your feed to your Instagram Story, Meta AI’s new backgrounds feature can analyze your photo, understand its contents, and generate a fun background for your story.”
Meta is also introducing AI translations for Reels, allowing creators to connect with a wider audience through their content.
The audio translations will mimic the speakers’ voices in another language and sync their lips to create a more authentic translation experience.
Meta reports that it is initially testing this feature with selected creators on Instagram and Facebook.
**More AI Features**
Meta is also enhancing its “Imagine” AI feature, offering additional ways for users to generate imaginative AI representations of themselves within its post composer options.
Meta is also introducing AI-generated chat themes in Messenger and Instagram DMs, along with tailored AI content recommendations based on your interests.
**AI for Business**
Meta will enable more businesses to develop their own AI chatbots, powered by its advanced AI models, which will be accessible through click-to-message ads on WhatsApp and Messenger.
“From addressing common customer inquiries to discussing products and completing purchases, these business AIs can assist companies in engaging with more customers and boosting sales.”
This could be an effective way to enhance customer engagement, providing immediate responses and service around the clock.
Meta also reports that more advertisers are utilizing its generative AI ad tools, with over 15 million ads created using these features in the past month.
“On average, ad campaigns leveraging Meta’s generative AI ad features achieved an 11% higher click-through rate and a 7.6% higher conversion rate compared to those that did not utilize these tools.”
Additionally, more Meta advertisers have seen success with Meta’s Advantage+ campaigns, and as its AI systems continue to evolve, they appear to be delivering better results.
There’s a lot to consider, and Meta has also launched a new version of its Llama language model, which will open up expanded development opportunities.
Exciting developments are taking place at Meta HQ, each with varying degrees of interest and potential impact.