Game Of Thrones Shows Why You Shouldn’t Calibrate Your Monitor

Some photographers obsess over color, spending thousands on calibration equipment and high-end monitors. A recent popular news story showed why this might not be worth it.

A few weeks ago, Game of Thrones aired “The Long Night,” an episode set entirely at night, featuring high-quality CGI and budget-busting battle scenes. The day after, however, all anyone could talk about was how they couldn’t see any of it. The episode was very dimly lit, but surely a multimillion-dollar production with award-winning DPs knew how to shoot a night scene, right?

I’d argue they executed it perfectly. On a well-calibrated TV, in a light-controlled room, the episode looked fantastic. The shadows were inky black, but I never missed any of the action. It was only when I asked the friends who said they couldn’t see anything where they had watched the episode that the complaints made more sense. They were watching in a brightly lit room, with whatever garish preset their TV defaulted to, not knowing that this episode would be a stress test of their displays’ black levels and dynamic range.

When questioned about the episode, cinematographer Fabian Wagner cited viewers’ lighting conditions, video compression for streaming, and a variety of other reasons for the unfortunate experience many had. Essentially, his statement amounted to saying that because he had an expensive, calibrated monitor, his way of viewing the episode was the only right way. I agree that the episode was shot and graded correctly, but when your audience can’t see it, that doesn’t matter.

I think this goes beyond any particular aspect of this episode and instead reflects a broader problem facing creative professionals. At the same time that creatives are adopting wider color spaces, HDR video, and ever-higher resolutions, consumers are viewing media in worse conditions. Some problems are even due to the intended behavior of the device: Apple’s True Tone can shift the white point mid-video, altering the carefully measured color grading, while some TVs will ramp brightness up or down in a misguided eco mode.

While I still calibrate my monitors, I don’t rely on them as absolute truth. With many quality monitors coming in at a Delta E below 2 right out of the box, there’s less of a need for calibration. Instead of trying to perfect calibration, make sure you understand what has a bigger impact on color: monitor color space, color temperature, gamma, and image color space. An error in any of those will have a much more significant impact than calibrated vs. uncalibrated. Furthermore, an imperfect calibration is worse than none at all, and the professional calibration tools needed to do it right can be expensive.
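
For context on what that Delta E figure actually measures, here’s a minimal sketch in Python, assuming CIELAB color values and the simple 1976 formula (calibration software typically uses more refined variants such as CIEDE2000); the color values below are purely illustrative:

```python
import math

# Minimal sketch: Delta E (CIE 1976) is just the Euclidean distance between
# two colors in CIELAB space. Lower numbers mean the displayed color is
# closer to the reference; values around 2 or below are hard to spot by eye.

def delta_e_76(lab1, lab2):
    L1, a1, b1 = lab1
    L2, a2, b2 = lab2
    return math.sqrt((L1 - L2) ** 2 + (a1 - a2) ** 2 + (b1 - b2) ** 2)

# Illustrative values: a reference patch vs. what a hypothetical,
# slightly off monitor actually displays.
reference = (50.0, 10.0, -20.0)
measured = (51.2, 9.1, -21.5)

print(f"Delta E (1976): {delta_e_76(reference, measured):.2f}")
```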

Out-of-the-box color accuracy is easy to get, with many monitors now advertising it as a feature.

I’ve made sure to test important images across a variety of display media, and I’ve learned to accept that when an image has left my hands, so too has control over how it is viewed. Calibration is still important, especially when working with multiple monitors or devices, for print and professional work, or if a device is clearly out of spec (although if the device is drastically off, it probably shouldn’t be used at all). For many photographers, I'd argue a gray card or ColorChecker Passport would be a better first purchase than a cheap calibrator.

Audio engineers have known this for years: mix on good speakers, but also test on a cheap pair. If the reaction to Game of Thrones’ recent episode is anything to go by, videographers and photographers need to adopt the same practice.

Lead image courtesy of Victoria Heath

Alex Coleman is a travel and landscape photographer. He teaches workshops in the American Southwest, with an emphasis on blending the artistic and technical sides of photography.

37 Comments

I’m still baffled how that episode made it to air. You always preview your images on a “control” display. Check it on a $500 TV. For still images I put on the web, I check them on my uncalibrated monitor and my phone. If they look good on both, I’m satisfied they're going to work on most displays. The only real reason to calibrate for stills is to print.

You don't create art for the lowest denominator. Why punish viewers who spend the time and money to have a top notch viewing experience?
You can't control the end user's experience outside of certain theaters.

First, nothing about these last seasons was art. Second, you can produce TV episodes for whatever small portion of the viewing audience you want, but I'm pretty sure HBO wants to make money, and they hold the purse strings.

So you can produce images that look perfectly dark on a calibrated monitor in perfect viewing conditions, but if 98% of the people who look at them can’t see anything, it’s still your fault.

I agree with your point about the 98%... punishing your audience is rarely smart. But cinematically, the last two seasons had plenty of amazing moments... just equally bad writing :/

If you don't think HBO reviewed the episode before it aired and approved it the way it was, think again. The colorist, director, producers, and editors all saw it beforehand, and there is no way it wasn't approved by management. My money is on poorly executed transcoding for delivery.

"You don't create art for the lowest denominator."

When producing a commercial product, yes you do. This is why audio engineers always test their mixes on crap computer speakers sitting in their bedrooms, their car audio systems, and even mono speakers right alongside the multi-thousand-dollar monitoring setups that surround their board.

"Why punish viewers who spend the time and money to have a top notch viewing experience?"

Those people would have still been able to see the action if it was visible on a low-end, factory-preset TV. The only viewers who were "punished" were fans of the show who weren't AV geeks or weren't privileged enough to have high-end, dedicated viewing environments to watch their TV shows in. It's not like they were shooting a Lord of the Rings movie, where you could even argue that the theater is responsible. When you create a TV show, you KNOW that most people are just watching it in their living rooms or bedrooms.

"You can't control the end user's experience outside of certain theatre's."
This is EXACTLY why you produce for "the lowest common denominator" There's a world of difference between a slight loss of nuances (what you might experience in hearing an audio mix on a low-end vs. high-end system) vs. just not being able to make out large blocks of what the hell is going on at all.

"When producing a commercial product, yes you do."
I don't know who your clients are, but mine absolutely DO NOT want their project dumbed down.
I live in Nashville and work with many recording engineers, and all of the ones I work with use very expensive studio monitors to create their work.
I can see I touched a third rail here tied to GOT.
My apologies to any fans who feel they were cheated in their viewing experience for that episode.
I stand by my previous remarks about production values.
I've spent over 30 years championing high production values and I'm not likely to change.

"Dumbing down" your mix is completely different from ensuring that it translates to multiple platforms. Perhaps your clients are not the type who care about whether their songs get played inside car stereo systems or inside malls with their monaural output speakers, but I can assure you that best practice is to test your mix to make sure it doesn't collapse in an unintended way when you translate to a less-than-optimal listening environment.

Perhaps you are a recording engineer who doesn't worry about these things, or a mixing engineer who happens not to take it into account (heaven forbid you're a mastering engineer who doesn't think about this stuff), but the point of commercial music is to be heard, and if your drums or guitar suddenly collapse on a mall speaker because you didn't do your due diligence and make sure that your mix works with a mono output, then you've done your client a severe disservice, no matter how wonderful it might sound in front of a pair of Genelecs.

The widespread use of Yamaha NS-10s as one of the de facto standard speakers for monitoring audio mixes is precisely because they're crap, and "anything that sounds good on an NS-10 will sound good anywhere," as the saying goes.

There will always be different philosophies regarding this, and like you, there will always be purists. However, if you are producing for a mainstream audience, you're bound to have to make some compromises in your mixing to take sub-optimal listening environments into account. The point, however, is that you're aware of it, so when someone asks why the lead guitar sounds so thin on the radio versus the massive sound it had in your studio, due to phasing issues or something involved in the conversion to an FM radio signal, you have an answer beyond "it's your fault for listening to it on the radio."

When you account for sub-optimal settings, things will get lost. That's unavoidable. Mostly, we're talking tertiary instruments and effects. You expect the meat of the mix to be there in all cases regardless of how or where the song is being played. In this particular GoT episode, whoever was in charge did not even manage that. Audiences were largely staring at blackness. That's like putting a song on the radio where the melody can't be heard because you need high end audio equipment to make it out. It's stupid.

There are thousands of different end users all using different TVs. It's impossible to anticipate and master something that will work best for all of them.
That is why you always master for the best image quality possible.
When I made websites, I could anticipate users using one of several major OSes and browsers.
TVs are much less consistent.
I am really sorry GOT did this to you fans, but that doesn't change the way we master for Broadcast.
I've been doing this for decades and the standards for mastering have only gotten higher.
With the increase of high end cameras and workstation software for producing programs, it's only going to get even higher.
I'm only repeating what industry standards are.
Engineers set those standards, not viewers or anybody else.

Nobody expects you to master for every single device and situation. You ARE, however, expected to use a variety of different devices and environments (within reason) that constitute a representative sample of likely venues.

In audio, that means testing your mix in mono, listening to it in a car, listening in a home stereo setup, etc. For a TV show, at bare minimum, it means checking it on a normal TV in a normal bedroom or living room.

Nobody is saying that you're supposed to be OPTIMIZING your master for these settings. You're just supposed to make sure that your master isn't going to fall apart in sub-optimal use and that the listener or viewer still gets all of the critical details, even if some nuance might be lost.

When you have a massive outcry from a large portion of viewers because they couldn't see what was going on, someone fucked up. I don't care how beautiful and nuanced it is in an optimal setting. Detail should never be lost to THAT extent if they were doing their jobs properly.

Actually, no. That is not the standard for film or television.
You calibrate the equipment to industry standards.
You use a waveform monitor, vectorscope, and histogram.
You set all your levels to SMPTE standards and print it.
It really is the only standard for broadcast.
Unlike the internet, where there are virtually no standards.
Also, that is not how any professional recording engineer I've ever worked with does it.
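
For readers wondering what "setting levels to broadcast standards" looks like in practice, here's a minimal sketch, assuming 8-bit "video range" encoding where nominal black and reference white sit at code values 16 and 235; the frame data is random, standing in for a decoded image, and a real suite would check this on a waveform monitor rather than in code:

```python
import numpy as np

# Minimal sketch: measure how much of an 8-bit luma plane falls outside the
# nominal broadcast ("legal") range, where black is code value 16 and
# reference white is 235.

LEGAL_BLACK, LEGAL_WHITE = 16, 235

# Random data standing in for one decoded 1080p frame's luma channel.
luma = np.random.randint(0, 256, size=(1080, 1920), dtype=np.uint8)

below = np.count_nonzero(luma < LEGAL_BLACK)
above = np.count_nonzero(luma > LEGAL_WHITE)
total = luma.size

print(f"{below / total:.2%} of pixels below legal black, "
      f"{above / total:.2%} above legal white")
```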

Recording engineers wouldn't. As a recording engineer, your only job is to capture the source. It's the mixing and mastering engineers' jobs to put those sources into place in a manner that works for the listener.

Hey Michael, other than professional troll, what kind of experience do you have related to mixing music or grading video?
Nice profile bio.
You'll understand if I ignore you from here on out.

No experience grading video, but I have worked professionally as an audio engineer. I was mostly a recording engineer and I got out of the industry because I didn't have the ear for mixing and it's pretty much a zombie industry compared to what it once was. There's also a massive saturation of talent for the number of stable jobs available in recording studios. I knew quickly enough that I wouldn't make the cut.

That being said, I know my theory, and I've worked with mixing engineers as well as had conversations with mastering engineers. You don't need extensive experience to have enough common sense to know that details need to be preserved for sub-optimal conditions. Everything I described, from listening to a mix in mono to burning a CD to see how it sounds in a car or listening through earbuds on an iPod, was all stuff that we regularly did to make sure the important details were all there regardless of where someone would hear the music.

Good job with the ad hominem attack to deflect from the point while simultaneously outing yourself as a pretentious asshole, though. Not entirely sure where that suddenly came from since I don't recall getting personal with you, but no matter. Something about this all clearly ruffled your feathers and I honestly don't care enough to know what it is. You could have left well enough alone without the personal insult, though. Just sayin'

I'd just like you to note that you're the one basically sitting there defending the practices and mindset that led to millions of viewers staring at a black screen wondering WTF was going on when they were supposed to be watching a television show. Likewise, you'll understand if I ignore you from here on out. 👍

I read your bio, that set me off.

Oh, well. My apologies for that. The bio is a bit of a self-defense mechanism. You see, I actually suffer from depression, anxiety, and low self-esteem, but at the same time I have a tendency to vent online which puts me in bad spots. I figure that by putting into place something where people won't take me seriously anyway, it helps soften the blow (for me). It's difficult to explain even to my own therapist, really...

Anyway, take care.

You always mix with highest quality available, but you master for your audience. Recording engineers are not the audience.

My audience includes people with 4K TVs. Should I master instead for people who still have 4:3 SD televisions, as they are still out there in the audience?
My clients would not agree with you and they are the ones paying the bills.
I feel sorry for your clients.

That’s essentially what GOT did. There are probably as many people with CRT screens as HDR screens. They mastered for the minority.

Personally, I don't think it was the grade. I believe someone screwed up the encoding for delivery.

Regardless, someone dropped the ball hard and it wasn't the viewers with sub-optimal viewing conditions.

What everyone else said.

I'd say it's much simpler than a story about calibration: when you shoot a long, dark scene, you should factor in where people will watch it a bit more than usual. And then make it even brighter just to be sure; that's it. It's like making a print of a night skyline but not considering that it'll hang in daylight and won't have LED lights under it.

Never watched it.
Chuckled reading this, though, as just a few posts ago there were pixel peepers wet dreaming over the new and slightly pricey monitor from the famous fruit company.

They should have put a *for optimal viewing experience, please turn lights off* warning at the beginning.

That would have helped, but the episode would still be challenging for lower-quality TVs, and that's before you consider how badly HBO's compression hurts the video quality.

Yes. I’d love to see the 4K HDR Blu-ray version on a higher end TV some day, but that’s not how HBO delivered it.

I am actually looking forward to doing this myself.

That episode was just a boneheaded display of multiple technical mistakes.

The fact that they defended it the way they did shows they don't care at all about their consumers.

Even with a good setup (I have a good TV, set appropriately, in a dim room with controllable bias lighting), the episode was still severely impacted by HBO's outdated compression. I'm confident that Netflix, for instance, could have delivered a much better visual result. Everything that was meant to be visible was, but too much of the episode was awash in severe banding.

I still believe in calibration and having a good monitor at your workstation, but I also know the pitfall of producing challenging content that may only look good on my display.

Use a calibrated monitor, but keep in mind consumers often use one with slightly more brightness and saturation. Also, go to an electronics store with computers, laptops, tablets, and mobile phones, make note of how your website looks on those units, and adjust accordingly.

This reminds me of how my dad, who used to design websites, would have three monitors, one was a really nice one, one was decent and one was super crappy and ancient. He made sure that the websites looked good on all three before publishing anything.

We deal with this in the software world as well: network latency varies wildly, so an experience you develop on a fast connection needs to be tested on the networking equivalent of tin cans and a string, because that's how many users in the world will experience it.
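
A minimal sketch of that testing idea in Python; the function names and the 400 ms figure are assumptions, standing in for a real HTTP client and a poor mobile link:

```python
import time

# Minimal sketch: wrap a page fetch with an artificial round-trip delay so a
# feature built on a fast office connection also gets exercised the way a
# user on a slow link would experience it.

SIMULATED_RTT_S = 0.4  # assumed ~400 ms round trip, i.e. a poor mobile link

def fetch_page(url: str) -> str:
    # Stand-in for a real HTTP call; swap in your actual client here.
    return f"<html>contents of {url}</html>"

def fetch_page_slowly(url: str) -> str:
    time.sleep(SIMULATED_RTT_S)  # inject the extra latency
    return fetch_page(url)

if __name__ == "__main__":
    start = time.monotonic()
    fetch_page_slowly("https://example.com")
    elapsed = time.monotonic() - start
    print(f"Loaded in {elapsed:.2f}s under simulated latency")
    # Fail loudly if the experience blows its budget on a slow connection.
    assert elapsed < 1.0, "too slow for users on a poor connection"
```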

I just viewed the episode on my phone, and it's wayyyy more visible now than the night it aired. I viewed the dragon scenes on my phone, my TV, and my calibrated photography monitor that night and couldn't see anything during some scenes, most notably the dragon fight scenes. Now it is dark but visible, even on my phone.

They obviously changed something. I'm guessing the encode was messed up or they throttled the bitrate severely to account for server load. I don't think the issue was non-calibrated monitors at all.

The cinematographer made a mistake; he blew it. There's no way I won't calibrate my screen. This is what ensures my prints, albums, and other wall art will look the way I intended them to look.

Clickbaity and misleading title. The article is quite interesting, but the author states that he does in fact colour-manage his monitors. There are loads of solid reasons to do this; watching GoT is just not one of them.

I've been in the industry long enough to remember that NTSC was frequently translated as "Never The Same Color." Yes, you should always strive for the best possible quality, but if you produce a product that 80% of your client's viewers can't see, you're wasting your client's money and doing them a disservice. Doing art is great, but the bottom line is that someone is supposed to be making money. On some projects it may be okay to shoot for the top 10% of viewers, but for most you want to engage as many eyeballs as possible.