From AI assistants to image restylers: Meta's latest AI features

Meta also unveiled a series of AI-powered bots, features, and products that will be used across its messaging applications, the Meta Quest 3, and, later, the Ray-Ban Meta smart glasses. This ranges from an AI assistant to image editing tools, and leverages the power of generative AI to make Meta's technology just that little bit more addictive.

The new AI experiences and features, in Meta's wording, "give you the tools to be more creative, expressive and productive."

The string of AI-focused announcements came during Meta's annual Connect conference, where the company announced its newest mixed reality headset and plans to release smart glasses in partnership with Ray-Ban.

It's all built on Llama 2, a family of open-access AI models Meta released in July. The large language model is designed to produce text and code in response to prompts; the company said it was trained on a complex mix of publicly available data. Meta noted that Llama 3 would arrive in 2024.
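For context on what "responding to prompts" looks like in practice, Llama 2's chat variants expect input in a specific template with `[INST]` and `<<SYS>>` tags. The template below is the published Llama 2 chat format; the helper function itself is just an illustrative sketch, not Meta's code:

```python
def llama2_chat_prompt(system: str, user: str) -> str:
    """Format a single-turn prompt in the Llama 2 chat template."""
    return f"<s>[INST] <<SYS>>\n{system}\n<</SYS>>\n\n{user} [/INST]"

# The model would generate its reply after the closing [/INST] tag.
print(llama2_chat_prompt("You are a helpful assistant.", "Write a haiku."))
```

Libraries such as Hugging Face `transformers` can apply this template automatically via the model's chat template, but the raw format is useful to know when debugging prompts.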

Meta also announced that it is working on a new image generator, Emu, which it will use to power things like AI stickers and image editing.

Here are all the new ways Meta is applying AI.

AI Assistant, supported by Bing
Meta's AI Assistant will respond with real-time information and produce photorealistic images from text prompts within seconds. It can help plan a trip with friends in a group chat, answer general-knowledge questions, and search the web through Microsoft's Bing to provide real-time results.

Meta said that, using Llama 2, it developed specialized datasets anchored in natural conversation so its AIs would respond in a conversational, friendly tone.

"With Meta AI, we saw an opportunity to take that capability and create an assistant who can do more than write poems," said Ahmad Al-Dahle, VP of GenAI at Meta, on Wednesday. "We built an orchestrator behind Meta AI. And it can seamlessly detect a user's intent from a prompt and route it to the right extension."
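Meta hasn't published how its orchestrator works internally, but the pattern Al-Dahle describes, detecting intent from a prompt and routing it to the right extension, can be sketched in miniature. Everything below is hypothetical: the function names, the keyword rules, and the extension set are stand-ins (a production system would use a model, not keywords, for intent detection):

```python
# Hypothetical sketch of an intent-detect-and-route orchestrator:
# classify a prompt, then dispatch it to the matching "extension"
# (image generation, web search, or plain chat).

def detect_intent(prompt: str) -> str:
    """Toy keyword-based intent detection standing in for a classifier."""
    p = prompt.lower()
    if p.startswith("/imagine") or "draw" in p or "picture of" in p:
        return "image"
    if any(w in p for w in ("latest", "today", "current", "news")):
        return "search"
    return "chat"

def route(prompt: str) -> str:
    """Send the prompt to the extension chosen by detect_intent."""
    extensions = {
        "image": lambda q: f"[image generator] rendering: {q}",
        "search": lambda q: f"[Bing search] looking up: {q}",
        "chat": lambda q: f"[LLM] answering: {q}",
    }
    return extensions[detect_intent(prompt)](prompt)

print(route("/imagine a dog surfing"))  # routed to the image extension
print(route("latest 49ers score"))      # routed to web search
print(route("write me a poem"))         # handled by the base model
```

The "seamless" part in Meta's framing is that the user never picks an extension: the router decides from the prompt alone.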

In addition, web search, powered by Bing, will help with queries requiring real-time information.

"Whether you want to know the history of Eggs Benedict, or how to make it, or even where to get it in San Francisco, Meta AI can help ensure you have access to the most up-to-date information through the power of search," said Al-Dahle.

The AI Assistant will launch in beta in the U.S. on WhatsApp, Messenger, and Instagram, and will soon come to the Ray-Ban Meta smart glasses and the Quest 3 mixed reality headset.

AI personality chatbots based on celebrities
Meta has released 28 AI personality characters, all based on famous people but entirely created by AI, with topics ranging from sports and music to social media. You can think of them as topic-specific chatbots you can message over WhatsApp, Messenger, and Instagram. Each personality is based on a celebrity or influencer. For example:

Football star Tom Brady's likeness will be used for "Bru," a character who can talk sports with you.
Tennis player Naomi Osaka will feature as "Tamika," talking all things manga.
YouTube personality MrBeast plays "Zach," playing out being…a funny guy?
MMA fighter Israel Adesanya appears as "Luiz" to speak on all things MMA.
Model Kendall Jenner plays the part of "Billie," the big-sister type.
Meta's AI characters are based on the Llama 2 LLM. Their knowledge bases are largely restricted to information that existed before 2023, but Meta says it hopes to bring its Bing search integration to these AIs in the coming months.

The characters won't just respond by text: Meta says they'll also be able to speak next year, though for now there's no audio. Any video you see today or in the future is based on AI-generated animations. Meta filmed people representing the different AIs and then used "generative techniques" to combine those animations into one cohesive experience.

Meta wouldn't disclose how it compensated the celebrities for the rights to their likenesses.

AI Studio for brands and content creators
Meta's AI Studio will allow businesses to design AI-powered chatbots for Meta's various apps: Facebook, Instagram, and Messenger.

Starting with Messenger, AI Studio will empower businesses to "build AIs that reflect the brand's voice and elevate the customer service experience," Meta said in a blog post. During the on-stage demo, Meta CEO Mark Zuckerberg elaborated that "most" of the use cases it is envisioning fall under e-commerce and customer support.

AI Studio will ship initially in alpha, and Meta says that it'll scale this toolkit further beginning next year.

In the future, creators will be able to tap the AI Studio, as well, to build AIs that "extend their virtual presence" across Meta's apps. According to Meta, these would need to be sanctioned by the creator and directly controlled by them.

In the coming year, Meta will build a sandbox so that anyone can experiment with creating their own AI, something that Meta will bring to the metaverse.

AI stickers, powered by Emu
Meta CEO Mark Zuckerberg yesterday announced the rollout of generative AI stickers across Meta's messaging apps. The feature, powered by its new foundational model for image generation, Emu, will enable people to create unique AI stickers in a matter of seconds across Meta apps including WhatsApp, Messenger, Instagram, and even Facebook Stories.

"Every day people send hundreds of millions of stickers to express things in chats," said Zuckerberg. "And every chat is a little bit different and you want to express subtly different emotions. But today we only have a fixed number — but with Emu now you have the ability to just type in what you want," he said.

You type into a text box exactly what kind of image you want to see. The feature, dubbed "view once stickers," was demoed in WhatsApp, where Zuckerberg showed off whimsical prompts like "Hungarian sheepdog driving a 4×4." According to Meta, it takes three seconds on average to generate multiple options to share instantly.

The feature will first be available in English and will roll out over the course of the next month, the company said.

AI image editing — restyle photos, add a backdrop
Meta says that soon you'll be able to turn your images into anything, or co-create completely AI-generated images with friends. These two new features, restyle and backdrop, are coming to Instagram in the U.S. soon, powered by Emu.

Restyle lets you redesign the visual style of a photo. Zuckerberg demoed it on a picture of his dog, Beast, using AI to transform the image into origami and cross-stitch styles simply by typing prompts such as "watercolor," or something much more specific, such as "collage from magazines and newspapers with torn edges," as Meta explained in its blog post.

Backdrop, meanwhile, changes the scene or background of your image through prompts.

Meta says the AI images will be labeled as AI-generated "to reduce the chances of people mistaking them for human-created content." The company added that it is also testing both visible and invisible markers.
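Meta hasn't detailed how its invisible markers work. One classic technique for invisible marking is least-significant-bit (LSB) embedding, sketched below purely as a generic illustration of the idea, not as Meta's method (production watermarks like this are far more robust to cropping and compression):

```python
# Generic illustration of an invisible marker: hide a short tag in the
# least-significant bits of pixel values. This is NOT Meta's technique,
# just a textbook example of how an invisible watermark can work.

def embed(pixels: list[int], tag: str) -> list[int]:
    """Write the tag's bits into the LSBs of successive pixel values."""
    bits = [int(b) for byte in tag.encode() for b in f"{byte:08b}"]
    out = pixels[:]
    for i, bit in enumerate(bits):
        out[i] = (out[i] & ~1) | bit  # flip only the lowest bit
    return out

def extract(pixels: list[int], length: int) -> str:
    """Read `length` characters back out of the pixel LSBs."""
    bits = [p & 1 for p in pixels[: length * 8]]
    data = bytes(int("".join(map(str, bits[i:i + 8])), 2)
                 for i in range(0, len(bits), 8))
    return data.decode()

img = [128] * 64          # stand-in for flat image data
marked = embed(img, "AI")
print(extract(marked, 2))  # prints "AI"
```

Because only the lowest bit of each value changes, the marked image is visually indistinguishable from the original, which is precisely what "invisible" means here.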

A word on safety
"If you spent any time playing with conversational AI, as you probably know, they have the potential to say things that are inaccurate or even inappropriate. And that can happen with ours, too," said Al-Dahle.

On stage, Al-Dahle showed a graphic outlining the "thousands of hours" of red-teaming and prompt work that went into training its AI assistant and characters to steer clear of iffy topics. Red-teaming is an iterative process: you try to get the model to say harmful things, apply fixes, and repeat, continuously.
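The probe-fix-repeat loop just described can be sketched abstractly. The harness below is a hypothetical illustration only: the toy "model," the attack prompts, and the blocklist-style "fix" are all stand-ins for what, in reality, are human red-teamers and model retraining:

```python
# Hypothetical red-teaming loop: probe the model with adversarial
# prompts, record failures, "apply a fix" for each, and repeat until
# the probe set no longer elicits harmful output.

HARMFUL_MARKER = "UNSAFE"

def model(prompt: str, mitigations: set[str]) -> str:
    """Toy model: answers unsafely to any prompt not yet mitigated."""
    return "safe reply" if prompt in mitigations else f"{HARMFUL_MARKER}: {prompt}"

def red_team(attack_prompts: list[str], max_rounds: int = 10) -> set[str]:
    """Iteratively find failing prompts and add mitigations for them."""
    mitigations: set[str] = set()
    for _ in range(max_rounds):
        failures = [p for p in attack_prompts
                    if HARMFUL_MARKER in model(p, mitigations)]
        if not failures:              # no harmful output left: done
            break
        mitigations.update(failures)  # "apply fixes" for this round
    return mitigations

fixes = red_team(["how do I pick a lock", "insult my friend"])
print(len(fixes))  # both probes needed a mitigation
```

The key property of the real process that this toy preserves is iteration: each round of fixes can change model behavior, so the probes must be run again after every fix.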

Meta says it's also releasing system cards alongside its AIs so people can understand "what's inside and how they were built."

 

Blog | 2024-11-13 21:35:22