The ‘AI Info’ Controversy: Is This Label Undermining Classical Photography? Industry Experts Weigh In.

“Ridiculous,” “oversimplistic,” and “immensely frustrating”: photographers worldwide are weighing in on social media's broad “AI Info” label. While some industry leaders see value in the designation, others criticize the blanket label as oversimplistic, and the rollout has left some photographers feeling punished for using basic Photoshop tools, or avoiding those tools altogether just to dodge the label. I talked with some of the biggest names in commercial photography and retouching to get a pulse from industry experts.

Some Commercial Photographers Fear the AI Labels Will Punish Their Clients

My dive into this topic began while I was discussing my frustration with a colleague. I had posted a recent image taken for a skincare client of mine, and when I uploaded it to Instagram, it was given the “AI Info” label.

This image of mine was given the “AI info” attribution. The only tool I used that triggered the label was Generative Fill, for a minute cleanup of the hair.

For my own social media account, I don’t mind the attribution. It shows that I can use the latest tools. However, my concern came when I thought about my client posting the image. Such a classification reads to many as marking a “fake” image. For my client, this could create distrust between her and her consumer. Even though the photograph shows real skin with the real product applied, and AI was used only to clean up flyaway hairs, it received the same label as if I had typed a prompt to generate an image of a model with serum applied to her cheek, no camera involved.

Here, I used only a text prompt along with a reference image. The reference image is mine, but potentially any image could have been used.

This image was generated using nothing more than the above prompt. I fixed one nail, and although it is not perfect, it is a great example of how these two images receive the same label regardless of how they were created.

Resulting image from the above prompt

My colleague Friedman, a commercial stop motion artist, expressed the same concerns:

I’m trying to dodge any tools that make the AI label. Frankly, it’s ridiculous to have to try to avoid tools in Photoshop; it has been very disruptive to my workflow.

In Inc.’s article discussing the social media AI labels, the changes are described this way: “Meta and Google are taking steps to tell us when the content we’re seeing isn’t real.”

Isn’t real? Or is it completely real, just retouched using Photoshop’s current tools?

Did Meta Collaborate With Adobe on the AI Labeling?

A big question on everyone’s mind is, “Was there a collaboration between Adobe and Meta before rolling this out?” I turned to one of the most respected Photoshop instructors to weigh in.

Known on stage as one of Photoshop’s leading educators, Kristina Sherk (Shark Pixel) contributed this thought:

As a professional underwater photographer and composite artist whose work is often assumed to be AI generated, I appreciate the labels at times. But I do agree that using generative fill to remove small objects like fly-away hairs or trash cans should — in no way — warrant the same labels as other AI generated art. A simple collaborative exploratory call between the social media companies and Adobe representatives would have quickly and easily made this issue arise prior to rolling out the labeling, and the social media companies wouldn't be in the position they're in today. Preparation and a thorough examination of Adobe's tech before launch would have foreseen this issue.

Another Photoshop giant, Aaron Nace, owner of PHLEARN, understood the intent of the label but perfectly articulated its failure to differentiate between a real photograph and an image created from scratch by AI.

It is an incredible tool for enhancing imagery, but a blanket label for all AI assisted photos oversimplifies its application. There's a clear distinction between subtle refinements and entirely AI-generated content. It's essential to maintain transparency while also recognizing the artistic integrity of images that have undergone minimal AI intervention. We need a system that accurately reflects the level of AI involvement to preserve trust between creators and audiences.

https://www.instagram.com/reel/C69iMB4SS1n/?utm_source=ig_web_copy_link&igsh=MzRlODBiNWFlZA==

Retouchers and photographers worldwide condemn the lack of differentiation between photographs with minor retouching and images created from scratch in AI with nothing more than a text prompt. Whether a camera was involved or five words were strung together on a keyboard, both receive the same label.

When I spoke with two leading commercial photographers, Karl Taylor and Steven Hansen, more great perspectives arose.

If images retouched with Photoshop tools are receiving “AI Info” labels in an effort to identify photo-realistic computer-generated imagery, why are CGI images getting a pass? Hansen is one of those phenomenally multi-talented artists who make it hard to tell how an image was created. His photographs are so immaculately flawless they barely look real, while his CGI work is crafted with such expertise that it looks real.

In this article, you can see these two art forms side by side. Hansen contributed this thought:

I'm fortunate that in my specialization of liquids and food packaging, AI tools are rarely useful. However, a majority of the creative briefs my clients provide do have some AI elements which can be a very efficient way to generate an initial composite for us to work from. When creating images, there's really no use for something that doesn't provide the exact result I'm looking for. I completely understand social media outlets needing to label potential AI images but it must be immensely frustrating for creatives when improperly applied.

Though he uses AI tools minimally in his own imagery, in our interview last year he showed us his skillful use of Houdini and other software to create stunning photo-realistic work. It got me thinking: if the stated goal is “taking steps to tell us when the content we’re seeing isn’t real,” why are we labeling photographs taken with real cameras while applying the label only to certain tools that create photo-realistic results?

Commercial photographer Karl Taylor was more favorable toward the labeling, adding the perspective that in France, even more invasive labeling of photography is already required.

With regards to labeling shots to say they are 'AI Info,' I think this is more of an awareness message so that the public can differentiate between what is real and what is not. For example, many shots in Europe have to carry a message to say whether they have been retouched. In France they introduced a law so that beauty images for the likes of L'Oreal etc. have to state on them if the model’s skin has been retouched. This was in part to ensure that young girls were aware that models' skin didn't look this flawless without the help of retouching.

In addition to the beautiful bespoke images he creates for his clients, he also makes use of CGI.

In the commercial world everything always comes down to money, how much will it cost, how quickly can it be done and how good will it look. CGI is a good example: it's much cheaper to create a CGI render of a car advert than it is to shoot it. Most car photography adverts are CGI models of the cars mapped into a backplate image of a location but now with AI even the location could be completely computer generated.

If a photographer captures a car against a real background and uses Photoshop AI tools to retouch it, the image is labeled “AI Info.” However, if the car and background were photo-realistically rendered using CGI, it would not be.

Is this consistent with the intent to identify images that "aren't completely real"?

Should Photoshop Identify Which Tools Will Lead to the AI Info Label?

In a blog post on the topic, Meta itself acknowledged that “Generative AI is becoming a mainstream tool for creative expression.” Photographers worldwide use Photoshop to clean up images, but with the new changes, some are fearful of not knowing which tools to avoid in order to sidestep the label. Friedman expressed the sentiment:

I’m trying to dodge any tool that might label things like this. I feel like Adobe owes it to us to clearly label anything that will result in the AI label.

Conclusion

If you have read my work, you'll know that I'm generally supportive of AI usage in photography. I see it as another tool in my toolbox. A colleague who specializes in landscape photography expressed frustration over how some photographers now add the northern lights to their photos with nothing more than a few keystrokes, while he travels to capture them with great dedication. Between scenarios like those, lawsuits from celebrities over deepfakes, misleading political imagery, and deceptive beauty practices, the intention behind the AI labeling seems fair.

The question is, do we need more nuance in the labeling? Should a photograph with minute retouching in Photoshop carry the same label as a digital image created from a simple sentence typed on a keyboard? For many photographers, myself included, the answer to the first question is a resounding yes: there should be different labeling for images taken with a camera than for images created with a keyboard. Both are valid, but they are very different, and giving them the same attribution discredits the artist. Do you echo the sentiments that opened this article? What are your thoughts on the matter? Let's not punish hard-working photographers who still use cameras; there must be a better way.

Michelle creates scroll-stopping images for amazing brands and amazing people. She works with businesses, public figures, sports & products. Titled “Top Sports Photographers in Miami” in 2019 (#5) and 2020 (#4), she was the only female on the list both years. Follow the fun on IG @michellevantinephotography @sportsphotographermiami

6 Comments

This is so strange to me; I don't think I've ever seen this AI Info tag. I do a lot of special effects work that involves heavy use of Photoshop's AI tools, and none of my images have this tag. I don't even see it in the post you gave as an example on your feed. It makes me wonder if Instagram is still "testing" this feature and only a small number of people can even see it at this point.

I also use Generative Fill in Photoshop but never get the AI info tag on Instagram. I use Photoshop as a plugin in Capture One and export from Capture One; maybe Capture One doesn't export the AI flag.

That's an interesting observation. I know that if you truly want to avoid the label, you can screenshot the image before posting it on Instagram; a screenshot has no metadata to read from. Of course, this reduces the image quality notably, but it is an option some people are using.

I think this is the second article I've seen here where, given the example posts of the AI-tagged photos, I don't see the AI tag when viewing on a computer or the phone app. It almost seems like only the owners can see them. Just to get a little nerdy, I looked at the IG code and didn't see any reference to "AI" anything.

With that said, I think the AI tags should be reserved for talentless hacks that post crap like this: https://www.instagram.com/b.natural.photography/

I'm not overly critical of adding AI tags for sky replacements, adding a moon, adding a dreamy floral background, etc., because people have been doing this for decades. It was called composites. Faking the subject's surroundings is nothing new. You don't see a "Composite Info" tag on those images.

That's quite some coding/hacking skill you have! Oddly, the label doesn't show on desktop computers, only when you use the Instagram app on your phone.