"We know more about the surface of Mars and the Moon combined than we do of our own ocean floor." A startling truth, but Ved Chirayath is looking to change that.
Chirayath, a research scientist at NASA Ames Research Center in Silicon Valley, California, received a grant to develop his software and hardware, known as Fluid Cam. One difficulty of photographing what lies beneath the surface of our oceans is that top-down cameras on drones, or even satellites, cannot see past the movement of the waves. Not even a CPL filter is going to fix that, I'm afraid.
The software technique Chirayath has developed is known as fluid lensing, and it can see clearly through the movement of the water to photograph reefs, for example. The camera overcomes the distortion the water creates and allows us to map out large swaths of the ocean floor. It's currently being tested on drones, with the end goal of attaching it to UAVs or satellites for more complete coverage. The next step will be to feed the data into a supercomputer running machine learning to form a comprehensive database of Earth's reefs, giving researchers unprecedented insight.
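NASA hasn't published Fluid Cam's internals in the piece above, but the general flavor of undoing a time-varying water distortion can be sketched with a toy example: model each video frame as the true scene warped by the waves, estimate each frame's distortion, undo it, and stack the aligned frames. Everything here (the 1-D scene, the shift-based distortion model, the function names) is my own illustrative assumption, not the actual fluid lensing algorithm.

```python
# Toy sketch, NOT NASA's fluid lensing algorithm: treat wave refraction as a
# per-frame circular pixel shift, estimate it by cross-correlation, undo it,
# and average the aligned stack.
import numpy as np

rng = np.random.default_rng(42)

def distort(scene, shift):
    # Crudely model wave refraction as a circular pixel shift.
    return np.roll(scene, shift)

def estimate_shift(frame, reference):
    # Circular cross-correlation via FFT; its peak locates the shift.
    corr = np.fft.ifft(np.fft.fft(frame) * np.conj(np.fft.fft(reference))).real
    return int(np.argmax(corr))

def recover(frames, reference):
    # Undo each frame's estimated shift, then average the aligned stack.
    aligned = [np.roll(f, -estimate_shift(f, reference)) for f in frames]
    return np.mean(aligned, axis=0)

# A 1-D "reef": a bright patch on a dark seafloor.
scene = np.zeros(64)
scene[25:35] = 1.0

# Simulate a short video of the reef seen through rippling water.
frames = [distort(scene, rng.integers(-5, 6)) for _ in range(100)]
recovered = recover(frames, reference=frames[0])
```

Because the alignment reference is itself a distorted frame, the scene is recovered only up to a global shift; real systems deal with far messier, spatially varying refraction, which is exactly what makes fluid lensing impressive.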
I'm eager to learn more about the technical specifications of this hardware, particularly how its sensor works. Also, is that a Leica lens I spy on the front?