In the last week, Instagram has been automatically labeling any photo touched by artificial intelligence tools as "Made with AI." On the surface, this sounds like a win for photographers decrying the use of AI as intellectual property theft. But in practice, the way Meta has decided to call out AI leaves little room for nuance and debate.
Buckle up, because this is probably a hot take, especially coming from a photojournalist.
Earlier this week, I used my phone to upload to Instagram a photo of famed New York City photographer Louis Mendes at Photoville. I made fairly ordinary Photoshop edits that are typical of my non-journalistic work, and in the process, I lassoed a small section at the edge of the frame and told generative AI to remove the highlight. Here's a before and after of the results:
Imagine my surprise when upon upload, prominently at the top of the post, Instagram labeled it "Made with AI."
Well, no. I lassoed a small section of the photo and had AI make an edit. It's the same edit that I could have made with the clone tool and a little extra time. The end result would have been almost exactly the same. But one photo would get flagged and one would not.
It's in this distinction that the broad brush of "Made with AI," though noble in its pursuit of the truth, fails. The label seems to imply that this image was made with AI out of whole cloth when it simply wasn't. AI was used as a retouching tool, much the way the dodge and burn tools, the clone tool, or the healing brush would be used. To single out the generative AI tool for this label is to misunderstand how AI was used in this case.
Sure, if DALL-E or Midjourney generated this image of Louis Mendes out of thin air, the "Made with AI" label applies. But I don't believe it should in this case, as Mr. Mendes was standing there, as sure as I'm typing this sentence (the snake and the dinosaur were not, as I'll get to in a moment).
This could have a chilling effect on retouching in general. Here's another example where the "Made with AI" label would make no sense:
The only AI "crime," so to speak, in this photo is using it to remove the front license plate of this car. It's something I would file under the category of retouching work and not generating an image completely with AI. If an edit like this is demonized, why even have AI tools in the first place?There are other problems here. One of the commenters on my post asked if using an AI Denoise function in Photoshop would trigger this label. It's something that would look very bad for event shooters using this tool for clients. I tried this out, and it appears that using that tool doesn't apply the label in Instagram. Instagram's help page about the label was cryptic about it, saying that it looks at "industry-standard signals" to make its determination. I have seen AI noise reduction introduce some gnarly artifacts and made-up faces to photos, so it's not immune to fabrication either.
Further, the photo at the top of this post, very clearly made with AI and labeled as such when posted from my phone, receives no such designation when uploaded from a desktop web browser. That's a pretty big loophole and quite an uneven application of Instagram's AI policy.
Creatives have long adapted new tools to make better art, whether for client work or personal satisfaction. But for a huge company such as Meta to appoint itself the arbiter of how the "Made with AI" label is applied is a huge misstep. It pins a scarlet letter on folks who are using AI responsibly.
Yes, journalists and other purveyors of the truth should never use such tools to edit their work, but should a wedding photographer have to deal with a lawsuit from a bride who sees the "Made with AI" label when her wedding photos are uploaded to social media? Should a company tweaking a photo to hide a wardrobe malfunction face backlash when the photo is labeled as AI on social media?
These are questions that Meta doesn't seem to have fully contemplated.
But it's something that photographers will certainly be contemplating.
Do you have thoughts on the new "Made with AI" label on Instagram? Leave your thoughts in the comments below.
The "Made with AI" label is definitely important and needed. But I think you need to question "why" those 2 photos got the label? Nothing in those edited photos visually show it was altered by AI, it's no different than someone would do in Photoshop. So what flagged it? Is there metadata in the exported photo that Photoshop or whatever program you used added to the photo that says it's made with AI? or is it a blip in their AI checking algorithm?
According to the Adobe forums, metadata is added anytime you use any of the generative AI features - so is this really Instagram's fault or Adobe's fault or both?
https://community.adobe.com/t5/photoshop-ecosystem-discussions/generativ...
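If you want to check a file yourself, here's a rough sketch in Python that drives ExifTool (the filename is a placeholder, and "TrainedAlgorithmicMedia" is the IPTC digital source type value Adobe reportedly writes after a generative fill):

```python
import subprocess

# Sketch: dump every tag with its group name and look for the IPTC
# digital source type that signals generative AI was involved.
# "photo.jpg" is a placeholder filename.
out = subprocess.run(
    ["exiftool", "-a", "-G1", "photo.jpg"],
    capture_output=True, text=True, check=True,
).stdout

for line in out.splitlines():
    if "Digital Source Type" in line or "TrainedAlgorithmicMedia" in line:
        print(line)
```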
If it's metadata, it should be easy to remove. I will do some testing. Thanks for the link.
It's in the metadata (Content Credentials) and can be removed easily. The fun part is that the label Adobe writes is a bit more descriptive and makes way more sense.
You can try out an image on the Content Credentials website:
https://contentcredentials.org/verify
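As for removing it, something like this should work as a starting point (a sketch assuming ExifTool is installed; depending on the build, the C2PA data may also sit in JUMBF segments, so verify the result on the site above):

```python
import subprocess

# Sketch: write a copy of the file with every deletable metadata tag
# removed. Verify the output on contentcredentials.org/verify, since
# some ExifTool versions may leave C2PA/JUMBF segments behind.
subprocess.run(
    ["exiftool", "-all=", "-o", "clean.jpg", "photo.jpg"],
    check=True,
)
```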
--- "so is this really Instagram's fault or Adobe's fault or both?"
Both. They are coordinating with each other. And, from what I read, they are expanding their collaboration with others.
I don't really get the frustration. Did you use "AI" tools? Yes!
If you don't like it and you "could have made with the clone tool and a little extra time" then you can avoid this problem. It is your choice to use those tools.
Time is money, right?
I can make the same image with film and a darkroom, so why not go do that? The problem is being shamed for a choice that seems, to me, to be a legitimate one under certain circumstances.
Why don't we start labeling all photos that have been cropped, since they're removing key context? Or what about photos that have used the healing brush, which is kinda like a pre-AI AI-type tool?
See what I'm getting at?
I can understand that, but it is a choice: one can save time using "AI" tools and accept the "AI" tag or decide not to use it and spend more time to have no tag. Simple.
Different retouching tools are also an important topic and I believe that any picture that had some parts replaced, cloned or moved should be labelled. Some countries have laws enforcing that.
Cropping is a different story, because it doesn't replace anything or create things that were not there.
The Mazda picture above is a great example. The actual car looked different from this picture. There is no way to remove the license plate and show the whole front by simply cropping. If you removed the plate before taking the picture, it would also look different, because there would be mounting points left. This picture is simply not real.
The picture is real. It has only been retouched. There is no point in being purist about this; otherwise photography would be simple point and shoot.
You talk of how removing the license plate can't be done with cropping, so it makes the image unreal. But what about removing other things that are at the edge of the frame and so can be cropped out? It is all the same: removal...
If we're to be purist about things, then use the standard preset and only shoot JPEGs, or process and print your film with a fixed set of times (no dodging or burning).
But every time we choose to frame something we are directing the eye to a story WE want to tell, and that often isn't the whole story - so we'd better get the extreme wide lenses out.
All of photography is about directing a narrative that we want to push. Whether that be tweaking pixels, dodging and burning to cheat the light and direct the eye, or cropping to only show what we want.
To say that retouching a licence plate away makes a picture "unreal" is to say the same of any other adjustment, even if it is just framing something to suit our personal narrative. We can't pick and choose what we consider an acceptable change just to suit our own personal bias.
You didn't understand what I wrote about cropping or you are pretending to not understand it.
I couldn't disagree more about the photo of the car being real.
If you took a picture of a car for sale and removed scratches on the paint, would it still be real? Claiming that it's a real picture of the car would be a lie. Not showing the side with scratches could be called misinformation or manipulation, but not a lie (as long as you don't claim that there are no scratches). Showing that part with the scratches removed is a lie, and that's the main difference.
This picture is the same. Claiming that the car looks like this would be a lie, because in real life there would be a license plate or mounting points, so the picture isn't real. If you cropped out the license plate it would be a real picture showing a part of the car.
If you're a photojournalist, you can crop out key information from photos and hide the truth that way as well ...
I agree with you: the picture with the license plate removed is not a real photo, because the mounts under the plate are also removed. The car has never looked, and can never look, like that without some body work being done and then a spot paint job where the body work was done. The photo shows something that does not exist in real life.
There's a big difference between "Made with AI" vs "Used AI". The former implies the whole image was generated.
Using an AI tool to do subtle edits like removing a highlight, and then claiming the image was "Made with AI," is simply a lie. When a new house is built, they generally use drywall to make the walls, and the people installing that drywall use a razor blade to do the job. It's like saying the house was "Made with a razor blade."
Context matters. Meta doesn't seem to understand that.
There is a pretty easy workaround to this dishonesty, though; probably several of them:
In Photoshop create a new document > copy the original > paste it onto the new document > flatten > save as usual.
Bob's your uncle.
So, I gave that method a try. It's not just in the metadata of the file; it's something at the pixel level, because copying and pasting into a new document, then uploading to Instagram, still triggers the "Made with AI" label.
It seems currently the only way to avoid that is the janky Instagram uploader from a web browser on an actual desktop or laptop computer.
Interesting, it works for me. The exact steps that I take are: size my image to the size I want to export > create a new blank document that is the same size > select the image and copy it > paste it onto the new document > export as .png. Something I didn't mention previously, because I didn't think it would matter (but maybe it does), is that I filled the background contents with black.
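If anyone wants to script that instead of clicking through Photoshop, here's a rough Python/Pillow equivalent of the same idea (filenames are placeholders): paste only the pixels into a brand-new image so nothing else rides along, then save as PNG.

```python
from PIL import Image

# Sketch of the manual workaround: copy only the pixel data into a
# brand-new image, so no metadata comes along, then save as PNG (the
# format that avoided the label in the tests above).
src = Image.open("edited.jpg").convert("RGB")
clean = Image.new("RGB", src.size)  # background defaults to black
clean.paste(src)
clean.save("clean.png")
```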
I had an issue on a recent shoot I did in which I had a dirty sensor causing dust spots. I used generative fill to quickly address them. Instagram was hitting them with the "Made with AI" lie.
Ah, looks like you've discovered another workaround. I was exporting to JPG files and that was triggering the label. When I did the same exact steps but with a PNG file, then it didn't add the label. Intriguing.
It is intriguing. I always export to .png. If I export to .png without doing the paste onto a black empty new document, it flags with the label. Doing the workaround, it does not flag it.
The only time I've used the generative AI tools in PS is to address dust spots, or if I crop and straighten the horizon and use the generative fill to fill in the sliver of blank space.
One thing that I will say is that if an image is actually made with AI (in other words, created primarily with AI), I personally think it's a good idea to indicate it as such.
They will certainly plug that PNG leak once they figure it out, though. Something more generic and broad is needed.
That sort of reminds me of something from about a decade ago: this one photographer would upload her images to Facebook as PNGs. Her images would never get compressed and converted to JPGs. They remained PNGs at about 1 MB versus the typical 150-400 KB JPGs. They were sharp and crisp compared to other images at the time. She told us her settings, but we could never replicate it. Fast forward to today, and I see her images are now JPGs.
With that said, I'll probably start uploading PNGs to IG and FB. I'll take any advantage I can get.
--- "It seems currently the only way to avoid that is the janky Instagram uploader from a web browser on an actual desktop or laptop computer."
That's how I upload all my images. It only makes sense since everything is on my main computer. No need to transfer to (or use) any other device.
I also much prefer to upload to Instagram from my desktop computer. It is not a "janky" or awkward process at all. In fact, making posts from my real computer is much smoother and easier than using my phone.
Generally speaking, using phones really sucks for almost everything, and using a real desktop with a nice big monitor and full size keyboard is really great and easy for almost everything. Phones are what you use only if you're in a real jam and won't be able to get access to a real computer for quite some time.
Why not just take a screenshot instead, and post that?
That ensures that NO data is attached to the file. Works equally well, and easily, from a phone, tablet, desktop, or laptop.
There's a huge resolution penalty with that, though.
Not necessarily. With my nice 27" 5k monitor, screenshots are actually pretty good resolution.
Also, the context of this article and discussion was posting images to Instagram. How much resolution do you want for Instagram posts? Most of the photographers I know who post to Instagram as part of their professional endeavors are downrezzing to 1200 or 1600 pixels anyway. Given that, I do not see how the screenshot is penalizing resolution, as I don't think there are any modern phones or devices with screens less than 1200 pixels.
There's no need to worry about this at all.
Whenever you don't want the details of the photo or the processing to be known, just upload a screenshot of the image instead of the edited / exported version of the file. That's always kinda been a general practice for such instances when you want to keep others from knowing certain details behind the photo. If you want people to know the exposure, but not the processing, then just upload a screenshot, but write the exposure settings in your comment/caption. With just a few seconds of effort, you can present whatever info you want and keep other info to yourself. Literally just a few extra seconds.
It may not be as easy as removing the metadata or taking screen captures. It appears they are, or will be, adding invisible watermarks. I understand some folks were able to defeat the labeling, but that could be because the invisible watermarks haven't been fully implemented, aren't supported by all image writers, or simply aren't working yet.
(attached image)
https://about.fb.com/news/2024/02/labeling-ai-generated-images-on-facebo...
Also, if you were curious where in the metadata it's stored, here's a few lines (attached image) using the article's image with the headlight removal. You need a tool like the latest ExifTool to see it.
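For anyone who can't view the attachment, here's roughly how to pull those lines yourself (a sketch; "photo.jpg" stands in for the article's image, and the group names assume a recent ExifTool build that understands Content Credentials):

```python
import subprocess

# Sketch: list all tags with family-1 group names; the Content
# Credentials entries show up under groups like JUMBF and XMP-iptcExt.
out = subprocess.run(
    ["exiftool", "-a", "-G1", "-s", "photo.jpg"],
    capture_output=True, text=True, check=True,
).stdout

print("\n".join(
    line for line in out.splitlines()
    if "JUMBF" in line or "XMP-iptcExt" in line
))
```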
Yet Apple still can't put a serial number in the metadata for iPhones taking photos?
There's a big difference between "can't" and "decide not to". My thought is that Apple has decided not to put a serial number in because it will not improve their profits, all things considered.
Excellent info, Eddie! Thank you for sharing that.
Do jpg files created in PS include the metadata that would get an image flagged?
Yes.
Shoot. I guess a "scrubbing" program is in order.
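Along the lines of what was shared above, the "scrubbing" can be a one-liner driving ExifTool (a sketch; "flagged.jpg" is a placeholder, and whether it catches every Content Credentials block may depend on the ExifTool version):

```python
import subprocess

# Sketch of a minimal "scrubber": strip all deletable metadata in place.
# -overwrite_original skips ExifTool's backup copy; drop it to keep one.
subprocess.run(
    ["exiftool", "-all=", "-overwrite_original", "flagged.jpg"],
    check=True,
)
```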
The first thing to understand is that Meta has chosen this policy to limit their liability from users posting nefarious and malicious content. It's simply a form of censorship. It's really the same as when someone posts an opinion on Facebook that Meta doesn't agree with and they slap a warning on it. Meta doesn't care about photographers posting Photoshopped images but they do care about censoring people with alternate social and political views. Just remember social media is worth exactly what you pay for it, and when something is free you are what's being sold.