New Technology Can Detect Photoshopped Fakes, Even Reveal the Original Images

Photoshop is arguably the most powerful tool at a photographer's disposal. And like many powerful tools, it can be used for good or for harm. The good news is that technology has recently been developed to counter sneaky, nefarious image manipulations.

Adobe and the University of California, Berkeley have teamed up to develop new software that detects facial manipulations made with Photoshop's Face-Aware Liquify tool.

In a study, both humans and AI were tested on their ability to detect altered facial features. Not surprisingly, the AI was the overwhelming winner. But what is surprising is that the AI not only detected the facial "fakes" with 99% accuracy, it could even revert those images to their original state.
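The reversion is possible because the system estimates the warp itself and then resamples the image back toward its original geometry. As a toy sketch only, not the actual model: if a warp is represented as a simple pixel-index mapping, undoing it amounts to applying the inverse mapping (the function names and the tiny five-pixel "image" here are purely illustrative):

```python
def apply_warp(pixels, mapping):
    """Warp a flat list of pixels: output position i takes its value
    from source index mapping[i] (a crude stand-in for a dense warp field)."""
    return [pixels[mapping[i]] for i in range(len(pixels))]

def invert_mapping(mapping):
    """Invert a one-to-one pixel mapping: if mapping sends src -> dst,
    the inverse sends dst -> src."""
    inverse = [0] * len(mapping)
    for dst, src in enumerate(mapping):
        inverse[src] = dst
    return inverse

original = [10, 20, 30, 40, 50]
warp = [2, 0, 1, 4, 3]  # shuffle pixels around, a crude "liquify"
warped = apply_warp(original, warp)
recovered = apply_warp(warped, invert_mapping(warp))
# recovered == original: knowing (or estimating) the warp lets you undo it
```

The hard part in the real research is estimating that warp field from a single edited image; once estimated, the undo step is conceptually this simple.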

In the age of Photoshop and of facial manipulation apps, the line between reality and fantasy has blurred. Photographers have been caught and ousted from contests for cheating with composites, while photojournalists have been reprimanded for manipulating seemingly accurate images. And rightfully so.

Long before we had AI-powered facial editing tools, the ability to detect image edits such as cloning and splicing already existed. "Fake detection" technology in photography continues to grow more sophisticated, but it will have to keep improving to keep pace with the newer image editing tools that Adobe and others put out.
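Clone detection, for instance, can start from a very simple observation: copy-pasted regions leave behind identical pixel blocks. Here is a minimal sketch of that idea (real forensic tools use robust features that survive compression and blur; the image, block size, and function name here are illustrative):

```python
import hashlib
import random

def find_duplicate_blocks(img, block=8):
    """Find pairs of identical block x block regions in a grayscale image
    (a list of rows of 0-255 ints). Exact matching is the crudest form of
    copy-move detection; flat, constant-color areas would trigger false
    positives, which is why real tools use more robust features."""
    h, w = len(img), len(img[0])
    seen = {}
    pairs = []
    for y in range(h - block + 1):
        for x in range(w - block + 1):
            patch = bytes(img[y + dy][x + dx]
                          for dy in range(block) for dx in range(block))
            key = hashlib.sha256(patch).hexdigest()
            if key in seen:
                pairs.append((seen[key], (y, x)))
            else:
                seen[key] = (y, x)
    return pairs

# Build a noisy test image, then "clone stamp" an 8x8 patch from (2, 2) to (20, 20).
random.seed(0)
img = [[random.randrange(256) for _ in range(32)] for _ in range(32)]
for dy in range(8):
    for dx in range(8):
        img[20 + dy][20 + dx] = img[2 + dy][2 + dx]

matches = find_duplicate_blocks(img)
# matches == [((2, 2), (20, 20))] — the cloned region is found
```

Splicing (pasting content from a different photo) needs different cues, such as inconsistent noise or compression fingerprints, but the brute-force block comparison above captures the core intuition behind clone detection.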

We've all looked at selfies and said to ourselves, "Well, that's not how he/she really looks." Have you ever detected anything fishy with a photograph you've seen, beyond "that was an obvious filter"? Share your experiences in the comments section below.

Photo by Moose Photos from Pexels

8 Comments

Tinder's about to get a lot more interesting...

Ansel Spear:

And Grindr for that matter.

Linda Trimm:

I just googled grindr. lol

Michael Jin:

Seems rather limited in usefulness given that it only works on manipulation performed by a very specific tool.

Scott Mason:

Agreed, but we can only assume that more detection ability is to come. That's the nature of technology, after all.

Linda Trimm:

DARPA's MediFor (Media Forensics) program funded this project, but the tool was not "hacked" or reverse engineered. The researchers partnered with Adobe, the creators of the tool that warps faces in images, so of course they can write scripts to detect algorithms that were provided by the creators. How is that research? Whatever, just an observation. And it's not that limited, either: Adobe produced this tool, but it's reasonable to assume that facial manipulation software created later won't be drastically different from Adobe's.

Laughing Cow:

It would be simpler if we could have a technology to detect non-Photoshopped images…

John Sammonds:

Who cares if an image is manipulated? Most images are there for only seconds, then forgotten. Just enjoy the moment.