Adobe, as the engine powering the world's digital artists, bears a large share of the responsibility for mitigating such circumstances. In the first quarter of 2025, the company is launching its Content Authenticity web app in beta, which lets creators attach content credentials to their work, attesting that the work is their own.
This goes beyond simply adding metadata to an image, a level of protection easily circumvented by a screenshot. Content credentials push provenance a step further: Adobe's system combines digital fingerprinting, invisible watermarking, and cryptographically signed metadata to protect artwork, including images, video, and audio files.
With invisible watermarking, pixels are altered only to a degree imperceptible to the human eye. Digital fingerprinting works similarly, embedding an ID into the file itself, so stripping the content credentials would not help a bad actor: the file could still be traced back to its owner.
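To make the idea concrete, here is a toy sketch of least-significant-bit watermarking, a classic textbook technique for hiding data in pixels without visible change. This is an illustrative assumption, not Adobe's actual (proprietary and far more robust) method, and the `owner_id` string and pixel values are invented for the example:

```python
def embed_id(pixels, owner_id):
    """Hide owner_id, bit by bit, in the least significant bit of each pixel."""
    bits = [int(b) for byte in owner_id.encode() for b in f"{byte:08b}"]
    marked = list(pixels)
    for i, bit in enumerate(bits):
        marked[i] = (marked[i] & ~1) | bit  # each pixel changes by at most 1
    return marked

def extract_id(pixels, length):
    """Recover `length` bytes of the embedded ID from pixel LSBs."""
    bits = [p & 1 for p in pixels[:length * 8]]
    return bytes(
        int("".join(map(str, bits[i:i + 8])), 2)
        for i in range(0, len(bits), 8)
    ).decode()

pixels = [200, 113, 54, 90, 17, 255, 3, 64] * 16  # stand-in grayscale image
marked = embed_id(pixels, "artist42")
assert extract_id(marked, len("artist42")) == "artist42"
# No pixel moved by more than 1 brightness level, so the change is invisible.
assert all(abs(a - b) <= 1 for a, b in zip(pixels, marked))
```

Real-world watermarks must additionally survive screenshots, compression, and cropping, which is exactly what a naive scheme like this cannot do.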
"We can truly state that wherever an image, a video, or an audio file goes, anywhere on the web or on a mobile device, the content credential will always be attached to it," Adobe's Senior Director of Content Authenticity Andy Parsons told TechCrunch.
Opt-in initiatives like this are only as powerful as their uptake. But if any firm can muster a quorum of digital artists and creators, it's Adobe, which counts 33 million subscribers who pay for its software. And even artists who aren't Adobe users can use the web app to apply content credentials.
Finally, there's the question of how to get content credentials onto the internet more broadly. The company has co-founded two industry groups focused on preserving content authenticity and promoting trust and transparency online. Members include camera manufacturers representing 90% of the market, content-creation tools from Microsoft and OpenAI, and platforms such as TikTok, LinkedIn, Google, Instagram, and Facebook. Membership doesn't mean these companies will build Adobe's content credentials into their products, but it does mean Adobe has their ear.
Not all platforms and websites, however, visibly display provenance information.
"To bridge that gap in the meantime, we'll release the Content Authenticity browser extension for Chrome as part of this software package, and also something we call the Inspect tool within the Adobe Content Authenticity website," Parsons said. "These will help you find and surface content credentials wherever they are associated with content anywhere on the web, and that can again show you who made the content, who gets credit for it."
There's some irony here, given that AI itself is notoriously poor at telling whether something is AI-generated. With real images so hard to distinguish from synthetic ones, these tools could provide a more concrete way of determining where an image came from (as long as the system earns credibility).
Adobe does not oppose AI in all its forms. In fact, the company appears keen to make it clearly evident when work was created with the help of AI, and to ensure that artists' work is not surreptitiously pumped into training datasets. Adobe even has its own generative AI, Firefly, trained on hundreds of thousands of images from Adobe Stock.
"Firefly is commercially safe, and we only train it on content that Adobe explicitly has permission to use, and of course, never on customer content," Parsons said.
While artists have been quite resistant to the AI tool, Parsons says the Firefly integrations across most of Adobe's applications, including Photoshop and Lightroom, have received positive feedback. Photoshop's generative fill feature, which extends images through prompting, saw 10x the adoption rate of a typical Photoshop feature.
Adobe has also been working with Spawning, a tool that helps artists keep control over how their artwork is used online. Through its website "Have I Been Trained?", Spawning lets artists search to see whether their works appear in the most widely used training datasets. Artists can add their works to a Do Not Train registry, signaling to AI companies that the work should not be included in training datasets.
This only works if AI firms respect the list, though HuggingFace and Stability have already signed on. Next Tuesday, Adobe is announcing the beta of its Content Authenticity Chrome extension. Creators can also sign up to be notified when the beta for the full web app goes live next year.