If an editor sent a photographer out to get some "weather art" to show a hot summer day, the photo above of a child playing in a sprinkler park would probably make a good image for the newspaper, right?
The photo above was manipulated using Adobe's generative AI feature in the latest version of Photoshop. All it took to change the photo was a little bit of lasso tool and a prompt that said "remove person in background." In actuality, the photo looked like this, with another child behind the stream of water:

The ease with which these kinds of edits can be made poses a real danger to photojournalism, whose stock-in-trade is truthful images. Already, prominent photo contests are falling victim to staged photography, and when elements of photos can be removed or added in seconds with generative AI, it does not bode well for the future of truth in imaging. In the past, photojournalistic frauds such as the Toledo Blade's Allan Detrich would composite balls into frames and remove feet and other items from photos, but it took real effort. That's no longer the case, and there's little way to track such changes.
What Are Content Credentials?
There's a new camera on the block that supports a new standard called "Content Credentials." That camera is the Leica M11-P, an almost $10K rangefinder-style camera. There are a lot of technical bits behind the standard, but the short version is this: using a framework from the Content Authenticity Initiative (CAI) and a standard from the Coalition for Content Provenance and Authenticity (C2PA), the camera itself can embed secure metadata directly into the file at the point of creation. Canon, Nikon, and Sony are also all involved with the CAI, but they have yet to release a camera with these credentials built in.
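The core idea of that "secure metadata" is cryptographic: the camera hashes the image data, bundles the hash with claims about the capture, and signs the whole package so that any later change to the pixels or the claims can be detected. The sketch below illustrates that concept only; the real C2PA standard uses X.509 certificates and COSE signatures in a JUMBF container, not the simplified HMAC scheme shown here, and all names (`sign_capture`, `CAMERA_SECRET_KEY`, the claim fields) are hypothetical.

```python
import hashlib
import hmac
import json

# Hypothetical per-device key. A real C2PA camera signs with a
# certificate-backed private key, not a shared HMAC secret.
CAMERA_SECRET_KEY = b"per-device-signing-key"

def sign_capture(image_bytes: bytes, claims: dict) -> dict:
    """Build a simplified provenance manifest for an image.

    Binds the claims (device, timestamp, etc.) to a hash of the image
    data, then signs both so tampering with either is detectable.
    """
    manifest = {
        "image_sha256": hashlib.sha256(image_bytes).hexdigest(),
        "claims": claims,
    }
    payload = json.dumps(manifest, sort_keys=True).encode()
    manifest["signature"] = hmac.new(
        CAMERA_SECRET_KEY, payload, hashlib.sha256
    ).hexdigest()
    return manifest

def verify_capture(image_bytes: bytes, manifest: dict) -> bool:
    """Check the signature, then check the image hash still matches."""
    unsigned = {k: v for k, v in manifest.items() if k != "signature"}
    payload = json.dumps(unsigned, sort_keys=True).encode()
    expected = hmac.new(CAMERA_SECRET_KEY, payload, hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, manifest["signature"]):
        return False  # claims were altered after signing
    return manifest["image_sha256"] == hashlib.sha256(image_bytes).hexdigest()

photo = b"\xff\xd8...stand-in for JPEG bytes..."
m = sign_capture(photo, {"device": "example-camera", "captured": "2023-10-26"})
print(verify_capture(photo, m))              # True: untouched image
print(verify_capture(photo + b"edit", m))    # False: pixels changed
```

An edit such as "remove person in background" changes the pixel data, so the stored hash no longer matches and verification fails; under the real standard, an editing tool that supports Content Credentials would instead append a new signed manifest recording the edit, preserving the chain of provenance.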
This would pair with social media sites adding a content credentials icon alongside images so that users can track the image's provenance and edits. On the surface, all of this sounds great.
The problem is two-fold. First, while Leica has embraced open-ish standards in the past, such as Adobe's DNG raw format, other manufacturers have generally gone their own way. It's a good PR move for camera companies to join such initiatives, but actually doing the work to implement a feature that only a very limited pool of photographers will use is another thing entirely.
Second, and this is the bigger issue: the world at large doesn't seem to care about authenticity in imaging. Photos are mixed, remixed, and remixed again as they're posted and reposted to TikTok, Instagram, Facebook, and elsewhere. I doubt most teenagers or twenty-somethings on these platforms care about the authenticity of a photo. If an editor at the New York Times needs a cell phone photo from someone on the scene of an event, they're probably going to have to look the other way when the submitted photo doesn't have content credentials, because the ordinary person taking it wasn't using an app that supports them. Retrofitting that authentication afterward also seems to add a lot of work to the workflow.
All of that is a shame, because the inclusion of this feature on an actual production camera should be bigger news than it has been. Truly, for this to work, all of the major manufacturers and software developers need to be on board and not just pay lip service to the tech by joining a coalition.
If you're curious to learn more about how the technology works, there's a great explainer video here.
What do you think about content credentials? Is it an important feature to you to have in a camera?