Game Of Thrones Shows Why You Shouldn’t Calibrate Your Monitor

Some photographers obsess over color, spending thousands on calibration equipment and high-end monitors. A recent, widely discussed news story shows why that might not be worth it.

A few weeks ago, Game of Thrones aired “The Long Night,” an episode set entirely at night, featuring high-quality CGI and budget-busting battle scenes. The day after, however, all anyone could talk about was how they couldn’t see any of it. The episode was very dimly lit, but surely a multimillion-dollar production with award-winning DPs knew how to shoot a night scene, right?

I’d argue they executed it perfectly. On a well-calibrated TV, in a light-controlled room, the episode looked fantastic. The shadows were inky black, but I never missed any of the action. The complaints only started to make sense when I asked the friends who said they couldn’t see anything how they had watched it. They were watching in a brightly lit room, on whatever garish preset their TV defaulted to, not realizing that this episode would be a stress test of their displays’ black levels and dynamic range.

When questioned about the episode, its cinematographer, Fabian Wagner, cited viewers’ lighting conditions, streaming compression, and a variety of other reasons for the unfortunate experience many had. In essence, his statement read that because he had an expensive, calibrated monitor, his way of looking at the episode was the only right one. I agree he was right, and the episode was shot and graded correctly, but when your customers can’t view it, that doesn’t matter.

I think this entire issue goes beyond any particular aspect of this episode and instead reflects a broader problem facing creative professionals. At the same time that creatives are adopting wider color spaces, HDR video, and ever-higher resolutions, consumers are viewing media in worse conditions. Some problems even stem from a device’s intended behavior: Apple’s True Tone can change the white point mid-video, altering a carefully measured color grade, while some TVs will ramp brightness up or down in a misguided eco mode.

While I still calibrate my monitors, I don’t rely on them as absolute truth. With many quality monitors now achieving a Delta E below 2 right out of the box, there’s less need for calibration. Instead of trying to perfect calibration, make sure you understand what has a bigger impact on color: monitor color space, color temperature, gamma, and the image’s color space. An error in any of those will matter far more than calibrated versus uncalibrated. Furthermore, an imperfect calibration is worse than none at all, and the professional calibration tools to do it right can be expensive.

Out-of-the-box color accuracy is easy to get, with many monitors now advertising it as a feature.
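To put rough numbers on that, here's a minimal Python sketch using hypothetical measurement values (not from the article): it computes a CIE76 Delta E for a single color patch and shows how a simple gamma mismatch shifts a deep shadow, the kind of error that dwarfs a small calibration deviation.

```python
# Minimal sketch with hypothetical values: comparing two of the factors above.
# 1. Delta E (CIE76): how far a monitor's measured color sits from the target.
# 2. Gamma: how a mismatched tone curve shifts dark shadow detail.
import math

def delta_e_cie76(lab1, lab2):
    """Euclidean distance between two (L*, a*, b*) triples."""
    return math.sqrt(sum((c1 - c2) ** 2 for c1, c2 in zip(lab1, lab2)))

# A decent factory-calibrated panel: small error, barely visible.
reference = (53.2, 80.1, 67.2)   # target Lab values for a red patch (hypothetical)
measured  = (52.8, 79.0, 66.5)   # values read back from the screen (hypothetical)
print(f"Delta E: {delta_e_cie76(reference, measured):.2f}")  # ~1.36, under the ~2 visibility threshold

# A gamma mismatch on the same panel: a deep shadow at 10% signal level.
signal = 0.10
print(f"Gamma 2.2 luminance: {signal ** 2.2:.4f}")  # ~0.0063
print(f"Gamma 2.4 luminance: {signal ** 2.4:.4f}")  # ~0.0040, roughly a third darker
```

Even a panel with a Delta E under 2 can crush shadow detail by a third if its gamma doesn’t match what the grade assumed, which is exactly the kind of scene “The Long Night” was full of.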

I’ve made sure to test important images across a variety of display media, and I’ve learned to accept that once an image leaves my hands, so does control over how it is viewed. Calibration is still important, especially when working with multiple monitors or devices, for print and professional work, or if a device is clearly out of spec (although if a device is drastically off, it probably shouldn’t be used at all). For many photographers, I'd argue a gray card or ColorChecker Passport would be a better first purchase than a cheap calibrator.

Audio engineers have known this for years: mix on good speakers, but also test on a cheap pair. If the reaction to Game of Thrones’ recent episode is anything to go by, videographers and photographers need to adopt the same practice.

Lead image courtesy of Victoria Heath

Comments

This reminds me of how my dad, who used to design websites, would have three monitors: one was really nice, one was decent, and one was super crappy and ancient. He made sure that the websites looked good on all three before publishing anything.

Lee Christiansen

I'm writing from the point of view of a time-served DoP, and with some grading experience (we graded some of the last Sex Pistols DVD with success...).

There is a difference between a dark scene, and a scene which is dark. Whilst maintaining the feel and vibe of the drama, DoPs need to retain a sense of highlights and specular lighting which defines subject material. Just making it all dark (because "it is dark") doesn't really cut it. Seems to me that the episode in question failed in that it lacked those contrasts and edges to the lighting which maintain a sense of darkness but allow the viewer to see what is happening - so we've got a failure at the shooting and conceptual stage.

As to the grade... well, it is all fine to monitor on a £40K super screen - I'd expect no less. But this only gives a perfect view, an optimised version for delivering premium quality.

Where the skill lies is to deliver this quality, but to still allow lesser monitoring options to deliver an acceptable (if lesser) experience.

In part this inevitably means a compromise - but this compromise is reduced if the source material has been shot to allow for a variety of viewing experiences.

I'd expect people shouldn't need more than to keep sunlight off their screens for this sort of viewing, but it seems to me the episode demanded something closer to perfect conditions. (I have my trusty plasma TV set at moderate contrast levels, but I tend to view in a darkened room anyway with no visual distractions.)

When I deliver photographic images for a commercial project, I always enjoy playing dangerously with dark tones - but I remember that pages will not always have great lighting, or screens will be set up badly, or maybe e-delivery will hurt my delicate details... so I shoot to ensure things don't get lost and I'm wary of being too "arty" if it will hurt 50% of my viewing client base.

30 years ago I ran a 24-track recording studio and freelanced in others. In some of the bigger studios we would monitor on £30K monitors which were sonically perfect - this was to "track" the recording and be sure that we had everything perfect. But after that we would monitor on a pair of near-field speakers. These are designed to have excellent spatial performance and neutral delivery, but were by no means a perfect representation of what was on tape. But even then, we would check in mono, and check on a pair of Auratones (a crazily priced, very basic 5" single-driver speaker limited in high- and low-frequency reproduction). It would never sound great on the Auratones, but if the bass disappeared completely we'd know something was wrong. (Andy McClusky from OMD would often put my two Auratones together and monitor at very low levels to check things.)

By the way, the subject of the Yamaha NS10M's was brought up on this thread. People may find this link of interest: https://www.soundonsound.com/reviews/yamaha-ns10-story

Game of Thrones is about as commercial as it gets. Art is great and we strive for visual excellence, but there is a paying base that allows the programme to be made, and thought should be given to how things will be received without a specific set of viewing instructions being needed.

Bryce Milton

We deal with this in the software world as well - network latency varies wildly, so an experience you develop on a fast connection needs to be tested on the networking equivalent of tin cans and a string, because that's how many users in the world will experience it.

Eric Crudup

I just viewed the episode on my phone and it's wayyyy more visible now than the night it aired. I viewed the dragon scenes on my phone, my TV, and my calibrated photography monitor that night and couldn't see anything during some scenes, most notably the dragon fight scenes. Now it is dark but visible, even on my phone.

They obviously changed something. I'm guessing the encode was messed up or they throttled the bitrate severely to account for server load. I don't think the issue was non-calibrated monitors at all.

The cinematographer made a mistake, he blew it.... There's no way I won't calibrate my screen. This is what ensures my prints, albums, and other wall art will look the way I intended them to look.

Nick Rains

Clickbaity and misleading title. The article is quite interesting but the author states that he does in fact colour-manage his monitors. There are loads of solid reasons to do this - watching GoT is just not one of them.

Mark Myers

I've been in the industry long enough to remember that NTSC was frequently translated as "Never The Same Color." Yes, you should always strive for the best possible quality, but if you produce a product that only 80% of your client's viewers can see, you're wasting your client's money and doing them a disservice. Doing art is great, but the bottom line is that someone is supposed to be making money. On some projects it may be OK to shoot for the top 10% of viewers, but for most you want to engage as many eyeballs as possible.