Some photographers obsess over color, spending thousands on calibration equipment and high-end monitors. A recent pop-culture controversy showed why that obsession might not be worth it.
A few weeks ago, Game of Thrones aired “The Long Night,” an episode set entirely at night, featuring high-quality CGI and budget-busting battle scenes. The day after, however, all anyone could talk about was how they couldn’t see any of it. The episode was very dimly lit, but surely a multimillion-dollar production with award-winning DPs knew how to shoot a night scene, right?
I’d argue they executed it perfectly. On a well-calibrated TV, in a light-controlled room, the episode looked fantastic. The shadows were inky black, but I never missed any of the action. The complaints only made sense once I asked friends who said they couldn’t see anything where they had watched it: in a brightly lit room, with whatever garish preset their TV defaulted to, not knowing that this episode would be a stress test of their displays’ black levels and dynamic range.
When questioned about the episode, its cinematographer, Fabian Wagner, cited viewers’ lighting conditions, streaming video compression, and a variety of other reasons for the poor experience many had. Essentially, his statement argued that because he had an expensive, calibrated monitor, his way of viewing the episode was the only right one. I agree he was right, and the episode was shot and graded correctly, but when your audience can’t see it, that doesn’t matter.
This controversy goes beyond any particular aspect of one episode; it reflects a broader problem facing creative professionals. At the same time that creatives are adopting wider color spaces, HDR video, and ever higher resolutions, consumers are viewing media in worse conditions. Some problems are even the intended behavior of the device: Apple’s True Tone can change the white point mid-video, altering the carefully measured color grading, while some TVs will ramp brightness up or down in a misguided eco mode.
While I still calibrate my monitors, I don’t rely on them as absolute truth. With many quality monitors now achieving a Delta E below 2 right out of the box, there’s less need for calibration. Instead of chasing a perfect calibration, make sure you understand the settings with a bigger impact on color: monitor color space, color temperature, gamma, and image color space. An error in any of those will matter far more than calibrated vs. uncalibrated. Furthermore, an imperfect calibration is worse than none at all, and the professional tools to do it right are expensive.
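To put those numbers in perspective, here is a minimal sketch in Python (the sample Lab values and code values are hypothetical, not measurements from any real display) comparing a sub-2 Delta E calibration error against a simple gamma mismatch:

```python
import math

def delta_e_cie76(lab1, lab2):
    """CIE76 color difference: Euclidean distance in CIELAB space."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(lab1, lab2)))

# Hypothetical calibration error: an intended neutral gray vs. a slightly
# warm, slightly bright rendering of it. A Delta E near 1 is roughly a
# just-noticeable difference; below 2 is very good for a display.
target = (50.0, 0.0, 0.0)  # L*, a*, b*
measured = (50.5, 1.0, -0.8)
print(f"Calibration error, Delta E: {delta_e_cie76(target, measured):.2f}")  # ~1.37

# A gamma mismatch, by contrast, shifts every pixel at once. A dark pixel
# at code value 30/255 displays noticeably brighter on a gamma-2.2 screen
# than on the gamma-2.4 screen a dark scene may have been graded for.
signal = 30 / 255
print(f"Relative luminance at gamma 2.2: {signal ** 2.2:.4f}")  # ~0.0090
print(f"Relative luminance at gamma 2.4: {signal ** 2.4:.4f}")  # ~0.0059
```

That gamma mismatch alone brightens the shadows by over half a stop, dwarfing the per-color error of a decent uncalibrated panel.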
I’ve made sure to test important images across a variety of display media, and I’ve learned to accept that once an image leaves my hands, so does control over how it is viewed. Calibration is still important, especially when working across multiple monitors or devices, for print and professional work, or if a device is clearly out of spec (although if a device is drastically off, it probably shouldn’t be used at all). For many photographers, I’d argue a gray card or ColorChecker Passport would be a better first purchase than a cheap calibrator.
Audio engineers have known this for years: mix on good studio monitors, but also test on a cheap pair of speakers. If the reaction to Game of Thrones’ recent episode is anything to go by, videographers and photographers need to adopt the same practice.
Lead image courtesy of Victoria Heath