No, that's not a dog flying a helicopter to save people from the raging hurricane floodwaters in Florida. It's an AI-generated image made via ChatGPT. But while I've clearly labeled it as such in the lead of this story, there are folks out there creating and spreading arguably more realistic AI images that purport to be of the current hurricanes.
Take, for instance, this tweet with a picture supposedly showing a young girl holding a puppy during what appears to be Hurricane Helene. It was shared by a Republican commentator (and also by Utah Senator Mike Lee) to stir the pot with the MAGA fanbase about where hurricane funding was going:
Or, trending into the even more ridiculous, there's this one of Donald Trump saving a baby from floodwaters:
Is this what readers have to contend with? Sifting through the work of hardworking photojournalists putting their lives on the line to bring people news from disaster-stricken areas, while couch culture warriors cook up AI-generated images to capitalize on others' misery?
As a former resident of the Fort Myers area, I'm desperate to know what's happening to my friends and colleagues and the places I frequented. There are precious few photojournalists left in the local newspapers all over the state, making next to nothing as they risk death or injury to take photos. It's a disgraceful and dangerous act to put AI-generated images out there in an attempt to fool people just trying to find information about their loved ones.
How Can You Pick Out a Fake Photo?
With Hurricane Milton tearing across Florida, there's bound to be a deluge not only of water but of fake AI-generated photos. Fortunately, there are a few easy tells that an image isn't real.
- **Hands and fingers.** The two biggest flaws in AI images usually involve fingers and faces. Often you'll see the wrong number of fingers on a hand, or the hand will be poorly defined. The same goes for feet.
- **Faces.** Faces are often blurry or malformed, without enough detail to make anything out.
- **Impossible anatomy.** Appendages or whole figures often appear out of nowhere, as the dog emerging from the top of the helicopter shows.
- **Skin texture.** To me, a lot of AI-generated skin looks too smooth, as the little girl holding the dog demonstrates.

Be aware that folks knowingly sharing fake images will often scale them down or pixelate them so the quality looks rough and the flaws are harder to spot.
Above all, now is the time to remain an informed reader. We're about to see many more fake AI images trying to capitalize on disasters such as these.
I think a lot of the "common tells" of AI photos, like distorted hands, fingers, and faces, have mostly been fixed. AI has taken a big leap over the last few years, and while there are still image generators with those issues, many of them can produce very convincing results now. Models like Flux can generate nearly photorealistic images, while others often retain a bit of that "stylized editing" look that was so popular in photography years ago.

I think the average person would probably have trouble telling a (not-so-over-the-top) AI photo from a real one, while photographers can probably make a much better guess at which is which. With the right AI generator, though, a lot of the telltale signs are gone these days.