Why Are Some Cameras Rubbish at Taking Photos of Rainbows?

Camera sensors are incredibly complex pieces of engineering, an attempt to replicate how the human eye perceives light, but they still have many limitations. Cameras are rarely good at capturing decent photographs of rainbows, and some cameras are significantly worse than others, thanks to a strange quirk of science.

Coming to you from self-confessed science geek Steve Mould, this fascinating video gives you a quick overview of how the eye perceives color, how screens and monitors work, and a little-known fact about how violet is created. As you might be aware, a computer screen doesn't contain violet pixels, so how can it display violet, given that on the spectrum, violet lies beyond blue rather than between the red, green, and blue primaries?
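As a rough sketch of the trick described above (the channel values here are illustrative, not colorimetric measurements), a display fakes violet by mixing its blue primary with a dose of red, producing a non-spectral stimulus that the eye reads as violet:

```python
# Sketch: a display has no violet primary, so it approximates violet
# by mixing its red and blue primaries. The values below are
# illustrative examples, not measured colors.

def mix_rgb(red, green, blue):
    """Clamp each channel to the 0-255 range and return an (R, G, B) tuple."""
    clamp = lambda v: max(0, min(255, int(round(v))))
    return (clamp(red), clamp(green), clamp(blue))

# The blue primary on its own:
blue_only = mix_rgb(0, 0, 255)

# "Violet" as shown on screen: blue plus a modest amount of red.
screen_violet = mix_rgb(130, 0, 255)

print(blue_only)      # (0, 0, 255)
print(screen_violet)  # (130, 0, 255)
```

The point of the sketch is that `screen_violet` contains no short-wavelength "violet" light at all; it is pure red and blue primaries, yet it is perceived as violet.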

This quirk is part of the reason why many cameras with less sophisticated sensors are completely incapable of perceiving certain parts of the color spectrum, and if you’ve ever felt that your photos of a rainbow are not only a bit disappointing but also seem to be missing a fair amount of color, this could be why. For example, if you encountered a double rainbow all the way across the sky, you might be gutted to learn that you were trying to document it using something that simply doesn't perceive one of its seven colors.

If you just learned that you own a camera that doesn't see violet, be sure to post your most disappointing rainbow pictures in the comments below.


Andy Day is a British photographer and writer living in France. He began photographing parkour in 2003 and has been doing weird things in the city and elsewhere ever since. He's addicted to climbing and owns a fairly useless dog. He has an MA in Sociology & Photography which often makes him ponder what all of this really means.

4 Comments

Ain't nobody got time for that!

Thanks, reading your comment saved 14 min of my life!

Calibrate your camera's color. I've had a few cameras that reproduced purple as blue, but after calibration, that was solved. This was what actually got me into color calibration in the first place. Video is a bigger issue, as the colors are generally baked in unless you're shooting raw; there, you have the manufacturer to blame for a poor profiling job. Even GoPro sensors, which are smartphone-sized, are capable of capturing purple.

The cone spectral responses shown in the video are misleading. They do not take into account the filtering due to the eye's lens and the macula itself. For practical purposes, there is no bump for the red cones in the blue area. See, for example, Wikipedia for the correct curves: https://en.wikipedia.org/wiki/Spectral_sensitivity The "circular" structure of color perception mostly comes from later processing in the retina, where color gets reorganized along blue-vs.-yellow and red-vs.-green axes, with desaturated colors in the center.
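The retinal reorganization the commenter describes can be sketched as a simple recombination of cone signals into opponent axes. The weights below are illustrative only, not physiological measurements:

```python
# Sketch of opponent-process recombination: cone responses (L, M, S)
# are reorganized into red-vs.-green and blue-vs.-yellow axes.
# Weights are illustrative, not physiological measurements.

def opponent_channels(L, M, S):
    """Recombine cone responses into simple opponent axes."""
    red_green = L - M                # positive = reddish, negative = greenish
    blue_yellow = S - (L + M) / 2    # positive = bluish, negative = yellowish
    luminance = L + M                # achromatic channel
    return red_green, blue_yellow, luminance

# A short-wavelength ("violet-ish") stimulus: strong S, weak L and M.
rg, by, lum = opponent_channels(L=0.15, M=0.05, S=0.90)
print(rg, by, lum)  # slightly reddish, strongly bluish, low luminance
```

Under this toy model, a violet stimulus lands on the bluish side of one axis with a slight reddish tinge on the other, which is consistent with violet being perceived as blue-plus-red even though it is a single spectral wavelength.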