Photos courtesy Scientific American
Scientific American says: "Sea-thru's image analysis factors in the physics of light absorption and scattering in the atmosphere, compared with that in the ocean, where the particles that light interacts with are much larger. Then the program effectively reverses image distortion from water pixel by pixel, restoring lost colors. One caveat is that the process requires distance information to work. Akkaynak takes numerous photographs of the same scene from various angles, which Sea-thru uses to estimate the distance between the camera and objects in the scene—and, in turn, the water's light-attenuating impact."
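The reversal described above can be pictured with the classic underwater image-formation model, in which light from an object is attenuated exponentially with distance while scattered "veiling" light is added. The sketch below is a minimal illustration of that textbook model, not the actual Sea-thru algorithm (which estimates wavelength- and range-dependent coefficients from the image itself); the function name, the per-channel `beta` and `backscatter` values, and the assumption that a depth map is already available are all illustrative.

```python
import numpy as np

def restore_color(image, depth, beta, backscatter):
    """Reverse range-dependent attenuation in an underwater image.

    A minimal sketch of the simple model I = J*exp(-beta*z) + B*(1 - exp(-beta*z)),
    where J is the true color, z the camera-to-object distance, beta the
    per-channel attenuation, and B the backscatter (veiling light).

    image:       H x W x 3 float array in [0, 1]
    depth:       H x W array of distances in meters (as Sea-thru estimates
                 from multiple photos of the same scene)
    beta:        per-channel attenuation coefficients, shape (3,)
    backscatter: per-channel veiling-light color, shape (3,)
    """
    z = depth[..., None]                     # H x W x 1, broadcasts over RGB
    transmission = np.exp(-beta * z)         # fraction of light that survives
    # Remove the added backscatter, then undo the exponential attenuation
    direct = image - backscatter * (1.0 - transmission)
    restored = direct / np.clip(transmission, 1e-6, None)
    return np.clip(restored, 0.0, 1.0)
```

Note how red (typically the largest `beta` underwater) is boosted most at a given distance, which matches the way reds vanish first in uncorrected underwater photos.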
----
Sea-thru Brings Clarity to Underwater Photos
Link to YouTube
3 comments:
Very interesting technology; I think I saw some examples on Instagram as well. I wonder if it also takes into account the depth, not just the distance to the photographed object, and the location, as in which sea or lake the photo was taken in.
Thom, might this help?
http://gurneyjourney.blogspot.com/2010/01/color-underwater.html
From the graphic in the video, I would think it does take it into consideration. On the other hand, since she already knows the color values in open air, based on the color chart she puts at the same depth, it may not necessarily be a part of the algorithm. All I know for sure is that it makes me want to go diving again. 🤗