This Remarkable Algorithm 'Removes the Water' From Underwater Images

An engineer and oceanographer has created a remarkable algorithm that can "remove the water" from underwater images, restoring their color and clarity as if the water were never there at all.

Derya Akkaynak, an engineer and oceanographer, has developed an algorithm that can "remove the water" from underwater images. One of the primary issues with underwater photography is that different wavelengths of light penetrate to different depths, creating color shifts that worsen with depth. The other issue is haze introduced by backscattering, which reduces clarity more and more as the distance between subject and photographer grows. The method, called "Sea-Thru," does an impressive job of restoring proper color and clarity, essentially making the image look as if the water wasn't there in the first place.
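The two problems the article describes, wavelength-dependent attenuation and additive backscatter, are often written as a per-channel image formation model: the observed color is the true color dimmed exponentially with distance, plus a veiling-light term that grows with distance. Below is a minimal sketch of inverting such a model in Python. Note this is an illustration of the general idea only, not Sea-Thru itself (which estimates its coefficients from the images rather than taking them as inputs); the function name and all coefficient values are hypothetical.

```python
import numpy as np

def restore_underwater(image, depth, beta_d, beta_b, backscatter):
    """Invert a simplified underwater image formation model.

    image       -- H x W x 3 float array in [0, 1], the observed colors
    depth       -- H x W array of camera-to-scene distances (meters)
    beta_d      -- per-channel direct-signal attenuation coefficients (1/m)
    beta_b      -- per-channel backscatter coefficients (1/m)
    backscatter -- per-channel veiling-light color at infinite distance

    Model assumed: observed = true * exp(-beta_d * z)
                            + backscatter * (1 - exp(-beta_b * z))
    """
    z = depth[..., None]  # broadcast depth over the three color channels
    # Subtract the additive backscatter (haze) term first...
    direct = image - backscatter * (1.0 - np.exp(-beta_b * z))
    # ...then undo the exponential attenuation of the direct signal.
    restored = direct * np.exp(beta_d * z)
    return np.clip(restored, 0.0, 1.0)
```

Because red light is absorbed first, the red-channel coefficients are the largest, which is why the correction brightens reds most aggressively at depth.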

Akkaynak has clarified that the method does not require the use of a color chart. The algorithm only needs multiple images of the scene lit by natural light. 

There's no word yet on whether the algorithm will make its way into commercial use or appear in any sort of software, but I certainly hope to see it get there eventually! The results are quite amazing, as you can see in the video above.

If you would really like to dive into the math behind the algorithm, you can read the full paper here.


Alex Cooke is a Cleveland-based portrait, events, and landscape photographer. He holds an M.S. in Applied Mathematics and a doctorate in Music Composition. He is also an avid equestrian.

15 Comments

Wow. Sounds so interesting. :)

This sounds like white balance + dehaze tool to me, but I may be wrong.

It's not a flat color shift.

"It's a physically accurate correction rather than a visually pleasing modification"
Why do you think some photographers shoot with charts? Why is white balance measured in kelvins? For physically accurate corrections.
I'm not saying their tool is lame, I'm just saying they didn't invent anything.

JPEG from raw straight out of a GoPro
vs
corrected image: white balance and a radial filter with dehaze.

Yeah, these examples show just how different her algorithm is from trying to adjust a raw file. The turtle doesn't have accurate color in either, and the sun looks way better in the top image. I'm not sure how she is doing it, but the color loss underwater definitely degrades images in a way that makes it hard to manually fix them.

You've never seen a turtle, have you? That one was covered in moss. And the sun was blown, so yes, the haze filter screwed it. Look at the reef far away.
On a GoPro, you put a red filter (which I hadn't put on there) to give the picture the right color deep underwater, because blue passes through way more. That is called white balance shift and correction, which you can work on in Photoshop.

I think it's more like an algorithm using multiple photos to create a custom LUT for the image.

The main advantage seems to be the 3D mapping, which tells it where and how much correction to apply.

In my day photographers knew how to color correct. This is an easy fix.

Way back in 2010?

If it were that easy, why haven't Photoshop, GIMP, etc. added this option to their filters? Congrats dude.

Please don't call her 'dude'.

Sorry, I did not know the developer was a lady. Derya is a common name for both females and males. Apologies for any inconvenience caused.

This has pretty huge potential, and not just for underwater stuff, but also for removing atmospheric haze, perhaps? In a better way than the dehaze slider does it. Anyone who's shot underwater images knows dehaze won't make your shots look like this.

The downside is that this does require some pre-shoot setup and planning ahead; it's not just something you can slap onto your photos in post.