Engineer and oceanographer Derya Akkaynak is the creator of Sea-Thru, a new algorithm that ‘removes’ the water from underwater photos by undoing the color distortion it causes. Long story short, this means we can now see the true colors of a coral reef as if the water weren’t there at all.
I know this isn’t the usual type of content you might expect to find here on BroBible, but I thought this was pretty damn fascinating and wanted to put it on your radar.
As most people who have ever tried to take an underwater photograph have realized, those photos tend to come out ‘bland and blue.’ According to Scientific American, this is the case even in shallow water or a pool, because water ‘selectively absorbs and scatters light at different wavelengths.’ For a photographer this is a massive pain in the ass, but from a practical standpoint it’s a real bitch for field biologists who use software to classify underwater species, because the distorted images throw those tools off.
Enter Derya Akkaynak and her creation of Sea-Thru. This Scientific American YouTube clip shows the drastic before-and-after difference once Sea-Thru is applied to underwater photographs.
If you’d rather read than watch, here’s the SA explanation of how this all works:
Sea-thru’s image analysis factors in the physics of light absorption and scattering in the atmosphere, compared with that in the ocean, where the particles that light interacts with are much larger. Then the program effectively reverses image distortion from water pixel by pixel, restoring lost colors.
One caveat is that the process requires distance information to work. Akkaynak takes numerous photographs of the same scene from various angles, which Sea-thru uses to estimate the distance between the camera and objects in the scene—and, in turn, the water’s light-attenuating impact. Luckily, many scientists already capture distance information in image data sets by using a process called photogrammetry, and Akkaynak says the program will readily work on those photographs. (via Scientific American)
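For the code-inclined readers, here’s the rough idea in plainer terms: an underwater photo can be modeled as the true scene colors dimmed by distance-dependent attenuation, plus a bluish haze (‘backscatter’) the water adds on top. If you know how far away every pixel is, you can run that model in reverse. The Python sketch below is just a simplified illustration of that idea, not the actual Sea-Thru code; the real algorithm estimates its coefficients from the image itself, whereas every coefficient here (beta_d, b_inf, beta_b) is a made-up placeholder.

```python
import numpy as np

def restore_underwater_colors(image, depth, beta_d, b_inf, beta_b):
    """Toy per-pixel color restoration using a simplified underwater
    image formation model (NOT the real Sea-Thru implementation).

    image  : HxWx3 float array in [0, 1], the raw underwater photo
    depth  : HxW float array, camera-to-object distance in meters
             (the distance info from photogrammetry the article mentions)
    beta_d : per-channel attenuation coefficients (1/m) -- assumed values
    b_inf  : per-channel veiling light at infinity -- assumed values
    beta_b : per-channel backscatter coefficients (1/m) -- assumed values
    """
    z = depth[..., np.newaxis]  # broadcast depth across the 3 color channels

    # Step 1: subtract the backscatter -- the haze the water itself adds
    backscatter = b_inf * (1.0 - np.exp(-beta_b * z))
    direct = image - backscatter

    # Step 2: undo wavelength-dependent attenuation. Red dies off fastest
    # with distance, which is why raw underwater photos look blue-green.
    restored = direct * np.exp(beta_d * z)

    return np.clip(restored, 0.0, 1.0)

# Example with made-up numbers: red attenuated far more than blue.
rng = np.random.default_rng(0)
photo = rng.random((480, 640, 3))        # stand-in for a raw underwater photo
depth = np.full((480, 640), 5.0)         # pretend everything is 5 m away
beta_d = np.array([0.40, 0.12, 0.07])    # R, G, B attenuation (assumed)
b_inf  = np.array([0.05, 0.20, 0.30])    # bluish veiling light (assumed)
beta_b = np.array([0.30, 0.25, 0.20])    # backscatter coefficients (assumed)

result = restore_underwater_colors(photo, depth, beta_d, b_inf, beta_b)
```

Notice how the depth map does all the heavy lifting: every pixel gets its own correction based on how much water the light traveled through, which is exactly why Sea-Thru needs that distance information to work.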
Unlike Photoshop, where users can artificially tweak an image’s color balance to mimic real life, Sea-Thru is actually recovering the true colors.
It’s not perfect yet, since it needs that distance information to function properly, but this is still a massive step forward for researchers, biologists, and underwater photographers in general. I wish I could say I understood all of the science here, but it’s pretty far above my head. What I do glean is that once the technology created for scientists is perfected, it’s eventually going to be awesome for hobbyists too.
If you’re into this, you can click here to head on over to Scientific American and read their full article.