NASA Develop New 'Fluid Cam' to Better Photograph the Ocean

"We know more about the surface of Mars and the Moon combined than we do of our own ocean floor." A startling truth, but Ved Chirayath is looking to change that.

Chirayath, a research scientist at NASA Ames Research Center in Silicon Valley, California, received a grant to develop his software and hardware, known as Fluid Cam. One difficulty of photographing what lies beneath the surface of our oceans is that top-down imagery from drones, or even satellites, cannot see past the movement of the waves. Not even a CPL filter is going to fix that, I'm afraid.

The software technique Chirayath has developed is known as fluid lensing, and it can see clearly through the movement of the water to photograph reefs, for example. The camera overcomes the distortion the water creates and allows us to map out large swaths of the ocean, wholesale. It's currently being paired with drones in its testing phase, with the end goal being to have it attached to UAVs or satellites for more complete imagery. The next step will be to feed the data into a supercomputer running machine learning to form a comprehensive database of Earth's reefs, giving researchers unprecedented understanding and insight.
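To give a sense of the kind of problem Fluid Cam is solving, here is a minimal sketch of the simplest possible approach to "seeing through" a moving water surface: combining many short-exposure frames so that the wave-induced distortion averages out. To be clear, this is not NASA's fluid lensing algorithm (which actively exploits wave crests as magnifying lenses rather than merely averaging them away); it is a toy illustration using a hypothetical synthetic scene and a made-up distortion model.

```python
# Toy illustration: suppressing wave-induced distortion by temporal statistics.
# NOT the fluid lensing technique itself; a hedged sketch with synthetic data.

import numpy as np


def simulate_wave_distortion(scene, amplitude=3.0, seed=None):
    """Warp a 2D 'seafloor' image with a smooth random displacement field,
    standing in for refraction through a wavy surface."""
    rng = np.random.default_rng(seed)
    h, w = scene.shape
    # Low-frequency sinusoidal displacements with a random phase per frame.
    dy = amplitude * np.sin(2 * np.pi * rng.random() + np.linspace(0, 4 * np.pi, h))[:, None]
    dx = amplitude * np.sin(2 * np.pi * rng.random() + np.linspace(0, 4 * np.pi, w))[None, :]
    ys = np.clip(np.arange(h)[:, None] + dy, 0, h - 1).astype(int)
    xs = np.clip(np.arange(w)[None, :] + dx, 0, w - 1).astype(int)
    return scene[ys, xs]


def temporal_median(frames):
    """Per-pixel median over a stack of frames; transient displacements
    from the waves largely cancel out."""
    return np.median(np.stack(frames, axis=0), axis=0)


if __name__ == "__main__":
    # Hypothetical 'reef' scene: a simple striped test pattern.
    scene = (np.indices((128, 128)).sum(axis=0) % 16 < 8).astype(float)
    frames = [simulate_wave_distortion(scene, seed=i) for i in range(64)]
    recovered = temporal_median(frames)
    print(f"MSE, single distorted frame: {np.mean((frames[0] - scene) ** 2):.4f}")
    print(f"MSE, after temporal median:  {np.mean((recovered - scene) ** 2):.4f}")
```

Even this crude averaging noticeably cleans up the view, which hints at why a technique that models the water surface properly, as fluid lensing does, can recover so much more detail.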

I'm eager to learn more technical specifications of this hardware, particularly how its sensor works. Also, is that a Leica lens I spy on the front?


Robert K Baggs is a professional portrait and commercial photographer, educator, and consultant from England. Robert has a First-Class degree in Philosophy and a Master's by Research. In 2015 Robert's work on plagiarism in photography was published as part of several universities' photography degree syllabuses.

2 Comments

Fast-forward this technology and it could be useful for locating submarines. DoD... did you secretly fund this?
Cool project!

Curious now as to what the flat-earth crowd will have to say about this.... lol