Instagram recently launched its “Made with AI” tag, which is intended to flag AI-generated images on the platform, but it has angered photographers after it labeled photos that weren’t AI-generated.
Instagram has revealed very little about how it detects AI content, saying only that it uses “industry standard indicators.” With this in mind, PetaPixel has tried to find all the concrete ways that the “Made with AI” sticker gets attached to Instagram posts.
Which Photoshop tools will trigger the “Made With AI” label on Instagram?
Generative Fill
Perhaps the biggest bugbear for photographers is that even a small tweak made with Photoshop’s generative AI tools results in a post being labeled “Made with AI.”
In PetaPixel’s tests, when a speck of dust is removed in Photoshop using Adobe’s Generative Fill tool, Instagram tags the photo as “Made with AI” after upload. This is despite the fact that the exact same result could be achieved with the Spot Healing Brush tool or Content-Aware Fill, neither of which triggers the AI label.
Of course, you can also make major adjustments to a photo with Generative Fill, such as using the Reference Image feature, which lets you upload a reference photo to guide the fill. This also triggers the tag.
Generative Expand
Generative Expand, the tool that appears as an option when cropping an image, is powered by the same AI technology as Generative Fill, so it is perhaps no surprise that using it also triggers the “Made with AI” label.
Notable exceptions in Adobe Lightroom and Photoshop
Generative Remove
There’s a new Generative Remove tool in Adobe Lightroom and Adobe Camera Raw (ACR), and while it uses generative AI technology, it does not trigger Instagram’s “Made with AI” sticker.
Generative Remove is one of the latest and most impressive AI-powered features in Adobe Lightroom and Photoshop (via Adobe Camera Raw). Currently in early access (beta), Generative Remove uses Adobe Firefly technology to remove a brushed-over object from an image and replace it with new pixels that match the rest of the scene.
While the results aren’t always dramatically different from something like Spot Healing, Generative Remove is much more reliable and sophisticated based on initial testing.
Despite using generative AI technology, an image edited with Generative Remove, unlike images edited with other Firefly tools, is not marked “Made with AI” on Instagram. Why?
PetaPixel has contacted Adobe for comment, but browsing through the images’ metadata reveals a likely explanation. Unlike an image edited with Generative Fill, a photo edited with Generative Remove is not tagged with any C2PA information. C2PA data is critical to Adobe’s Content Authenticity Initiative, a growing group from which Meta is notably absent, because it allows software tools to know when an image has been modified.
While a file edited in Lightroom using Generative Remove has a “Firefly” tag in the “Fill Area Retouch Method” field, a photo with Generative Fill applied carries far more embedded data. There are C2PA flags in the JUMD Type and JUMD Label fields, plus an explicit mention of “Adobe Firefly.” Also in the metadata are the C2PA claim generator flags and the C2PA signature.
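If Meta’s detection really does key on these metadata fields, a naive version of such a check is easy to sketch. The byte-string signatures below are assumptions for illustration (real C2PA manifests live in JUMBF boxes, and a production detector would parse them properly rather than grep raw bytes):

```python
def has_c2pa_markers(data: bytes) -> bool:
    """Naive metadata check: scan raw file bytes for telltale
    C2PA/JUMBF signatures that a simple scraper might key on.

    This is a hypothetical sketch, not Meta's actual method; a real
    implementation would parse the JUMBF box structure instead.
    """
    signatures = (b"c2pa", b"jumb", b"Adobe Firefly")
    return any(sig in data for sig in signatures)


# Example usage (path is illustrative):
# with open("photo.jpg", "rb") as f:
#     print(has_c2pa_markers(f.read()))
```

A check this shallow would explain the observed behavior: files where Adobe writes a C2PA manifest (Generative Fill) get flagged, while files where it doesn’t (Generative Remove) sail through.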
Regardless of why Generative Remove isn’t prompting C2PA labeling, at least not yet, the inconsistencies underscore part of the problem with Meta’s “Made with AI” label. Its implementation is completely lax, and it does nothing fundamental to reduce the potential harm caused by AI-generated and AI-edited photos.
More tools in Photoshop
In PetaPixel’s tests, the following tools also do not trigger Instagram’s AI label: Neural Filters, Sky Replacement, AI-powered Noise Reduction, and Super Resolution. It’s worth noting that these features use Adobe Sensei AI technology, not Firefly. But Meta’s label isn’t “Made with Firefly,” it’s “Made with AI.”
Are AI-Generated Images Triggering Instagram’s “Made With AI” Label?
DALL-E
OpenAI’s DALL-E image generator is a popular choice, and its images trigger Meta’s “Made with AI” label.
Adobe Firefly
Considering how some Firefly-powered features lead to Meta’s “Made with AI” tag appearing on Instagram, it should come as no surprise that an image generated directly in Firefly comes with an AI content warning.
Stable Diffusion
Not all AI image generators cause the “Made with AI” tag to appear, however. Users can create images with Stable Diffusion, export them from the platform, and upload them to Instagram without worrying about being labeled as AI.
Meta’s AI Image Generator
Instagram’s parent company Meta has its own AI image generator, which we cheekily used to see whether Meta flags its own AI images.
And even when an image does get tagged, a simple screenshot can bypass the label, even though the Meta AI watermark remains visible on the image itself.
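The screenshot trick works because a screenshot re-encodes the pixels without carrying over the original file’s metadata. The same effect can be achieved more surgically: in JPEG files, C2PA manifests are carried in APP11 (JUMBF) segments, and anything that drops those segments defeats a metadata-only detector. The sketch below is a minimal illustration of this fragility, not a recommendation:

```python
def strip_app11(jpeg: bytes) -> bytes:
    """Remove APP11 (0xFFEB) segments from a JPEG byte stream.

    JPEG files carry C2PA/JUMBF manifests in APP11 segments; dropping
    them (as a screenshot or re-encode effectively does) leaves the
    pixels intact but erases the provenance data. Minimal sketch: it
    walks marker segments up to SOS and copies everything but APP11.
    """
    assert jpeg[:2] == b"\xff\xd8", "not a JPEG"
    out = bytearray(jpeg[:2])
    i = 2
    while i < len(jpeg):
        if jpeg[i] != 0xFF:
            out += jpeg[i:]            # unexpected data: copy and stop
            break
        marker = jpeg[i + 1]
        if marker == 0xDA:             # SOS: copy the rest of the file
            out += jpeg[i:]
            break
        length = int.from_bytes(jpeg[i + 2:i + 4], "big")
        segment = jpeg[i:i + 2 + length]
        if marker != 0xEB:             # keep every segment except APP11
            out += segment
        i += 2 + length
    return bytes(out)
```

The point is not that stripping is clever; it is that any label resting solely on embedded metadata is only as durable as the metadata itself.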
What does Instagram’s “Made with AI” tag actually accomplish?
Given how easy it is to fool Meta’s AI detection system (if you know how to edit photos, you can bypass the current checks), it raises the question: what’s the point? So far, Instagram’s “Made with AI” label has managed to mislabel real, decidedly non-AI photos, tarnishing the reputations of reputable photographers, while completely missing photos that were genuinely AI-edited, even ones where no effort was made to disguise the use of generative AI.
While the need for some kind of content authenticity is real and growing in importance every day, Meta’s approach is a dead end. All indications point to a simple metadata scraper that is not only easily fooled, but also unreliable depending on the software and individual tools one uses to create the image.
A C2PA-based approach probably makes sense, but Meta has it backwards. Instead of looking for evidence that an image has been edited with generative AI, perhaps Instagram should look for evidence that an image hasn’t been, and those images could receive a label verifying that they are authentic. This type of technology already exists, and the Content Authenticity Initiative is hard at work developing ways to make it available. If only Meta wanted to play ball.
Instead, photographers have to worry about their legitimate images being mislabeled. There’s no doubt that in the age of artificial intelligence, it’s hard to believe what we see online. Even perfectly legitimate images have come under undue scrutiny long before Instagram’s “Made with AI” tags started popping up.
It’s a big problem that needs a thoughtful solution, and the current iteration of Instagram’s “Made with AI” tags certainly isn’t one.
A constantly moving target
We’ve tried to find every photo editing tool that triggers the “Made with AI” label, but there may be more, and we’ll update this article if we find any. There’s also a significant chance that Meta’s seemingly rudimentary AI detection technology will undergo constant changes, meaning that some features that don’t currently trigger the “Made with AI” label may do so in the future.
More news from Jeremy Gray.