Meta announced it's rolling out its first generative AI features for advertisers, allowing them to use AI to create backgrounds, expand images, and generate multiple versions of ad text based on their original copy. The new tools follow last week's Meta Connect event, where the social media giant debuted its Quest 3 mixed-reality headset and a host of other generative AI products, including stickers, editing tools, and AI-powered smart glasses.
While the new AI tools for the ad industry aren't as wild as Meta's celebrity AIs, which let you chat with virtual versions of people like MrBeast or Paris Hilton, they are a demonstration of how Meta thinks generative AI can help the brands and businesses that drive the majority of its revenue.
The first of the three new features lets advertisers change the look of their product images by generating multiple backgrounds. It's similar to the technology behind Backdrop, Meta's consumer-facing tool that lets users change an image's scene or background using prompts. In the ad toolkit, however, the backgrounds are generated from the advertiser's original product images and tend to be "simple backgrounds with colors and patterns," Meta explains. The feature is available to advertisers using the company's Advantage+ catalog to create their sales ads.
Another feature, image expansion, adjusts advertisers' creative assets to fit the different aspect ratios required across surfaces, such as Feed or Reels. The capability will also be available through Advantage+ creative in Meta's Ads Manager, where Meta claims it will let advertisers spend less time repositioning their creative assets, including images and video, for various surfaces.
The third feature, text variation, also lives in Meta Ads Manager: the AI generates six variations of the advertiser's original copy, which can emphasize certain keywords or phrases the advertiser wants to hit. Once the AI produces its output, advertisers can edit it or pick the version, or combination of versions, that best fits their goals. Meta notes it won't provide performance specifics for each individual text variation, however, because reporting is currently ad-based. Still, the more variations an advertiser chooses to run, the more chances they have to optimize ad performance, the company says.
Meta says it has already tested the features with a small but representative group of advertisers this year, and early results suggest generative AI is saving them five or more hours per week, or the equivalent of a month per year. The company admits, however, that work remains to better tailor the AI's output to each advertiser's style.
In addition, Meta says more AI features are in the pipeline, noting that it's working on new ways to generate ad copy that highlights selling points and backgrounds with tailored themes. And, as announced at Meta Connect, businesses will be able to use AI for messaging on WhatsApp and Messenger to chat with customers for e-commerce, engagement, and support.