One Driver Option Could Be Ruining Your Monitor's Accuracy

One important setting can drastically affect your computer monitor’s accuracy. It’s important to check, but is almost hidden in the menus.

On Windows, both Nvidia and AMD graphics cards can cause diminished color and luminance accuracy if configured incorrectly. The setting in question has a different name in each vendor's utility, but both versions control how the driver treats the signal sent over HDMI. The problem isn't limited to HDMI, either; it can also affect DisplayPort users. Nvidia calls the setting output dynamic range, while on AMD it is controlled by the pixel format option. In both cases, the card outputs a degraded or less compatible signal to the monitor because it defaults to a format better suited to HDTVs.

On Nvidia cards, HDMI and DisplayPort connections can both default to limited dynamic range mode. This truncated set of RGB values (16-235 instead of 0-255) washes out color and luminance. Since the driver considers the monitor to be an HDTV, even if it isn't, it attempts to send a signal appropriate for that device. This behavior is not only the default for many installations, but can also revert after driver and software updates.
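To see why the limited range hurts, here is a minimal sketch of the mapping between the two ranges, using the conventional 16-235 "video" and 0-255 "PC" levels. The function names are my own for illustration; the driver's actual conversion pipeline is not public.

```python
# Hypothetical helpers illustrating the 0-255 (full) vs. 16-235
# (limited) RGB range convention mentioned in the article.

def full_to_limited(v: int) -> int:
    """Squeeze a full-range (0-255) value into limited range (16-235)."""
    return round(16 + v * (235 - 16) / 255)

def limited_to_full(v: int) -> int:
    """Expand a limited-range value back to full range, with clipping."""
    return min(255, max(0, round((v - 16) * 255 / (235 - 16))))

# 256 input levels collapse onto only 220 output codes, so distinct
# shades merge (banding). Worse, if the monitor expects full range,
# black arrives as 16 (dark gray) and white as 235 (dim white).
print(full_to_limited(0), full_to_limited(255))  # 16 235
print(len({full_to_limited(v) for v in range(256)}))  # 220
```

The washed-out look comes from the second failure mode: the monitor interprets the limited-range signal literally, so nothing ever reaches true black or full white.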

The problem is easy to correct, however. With Nvidia drivers installed, right-click on the desktop and select NVIDIA Control Panel. With the panel open, select Change resolution on the left side, and scroll down until you see Output dynamic range (changing to Nvidia color settings may be necessary). In this box, you can use the dropdown to change from limited to full dynamic range.

The same problem can also affect AMD cards, but to a lesser extent. On AMD, the card may default to YCbCr instead of RGB pixel formats. This has a more subtle impact on the colors displayed by the monitor, as compared to the Nvidia issue.
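The AMD case is subtler because a 4:4:4 YCbCr signal keeps full chroma resolution; the damage comes mainly from integer rounding in the RGB-to-YCbCr conversion and back. A rough sketch, assuming the standard BT.601 full-range matrices (the driver's actual math may differ):

```python
# Sketch of an RGB <-> YCbCr round trip using assumed BT.601
# full-range coefficients; illustrates rounding loss, not AMD's
# actual driver implementation.

def rgb_to_ycbcr(r, g, b):
    y  = 0.299 * r + 0.587 * g + 0.114 * b
    cb = 128 - 0.168736 * r - 0.331264 * g + 0.5 * b
    cr = 128 + 0.5 * r - 0.418688 * g - 0.081312 * b
    return round(y), round(cb), round(cr)

def ycbcr_to_rgb(y, cb, cr):
    clip = lambda v: min(255, max(0, round(v)))
    r = y + 1.402 * (cr - 128)
    g = y - 0.344136 * (cb - 128) - 0.714136 * (cr - 128)
    b = y + 1.772 * (cb - 128)
    return clip(r), clip(g), clip(b)

# Count sampled colors that do not survive the round trip exactly.
errs = sum(ycbcr_to_rgb(*rgb_to_ycbcr(r, g, b)) != (r, g, b)
           for r in range(0, 256, 17)
           for g in range(0, 256, 17)
           for b in range(0, 256, 17))
print(errs, "of 4096 sampled colors shifted")
```

Many colors shift by a level or two after quantization, which matches the article's description of a subtle, rather than dramatic, degradation.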

In newer drivers, open AMD Radeon Settings, and then change pixel format under the Display setting to “RGB 4:4:4… Full RGB.”
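After changing either setting, you can check the result by eye with a near-black step wedge: if the output range is mismatched, the darkest bars merge into a single gray block instead of distinct steps. This hypothetical helper (not from the article) writes such a wedge as a plain PPM image using only the standard library:

```python
# Hypothetical test-pattern generator: writes a 16-step near-black
# wedge as a plain-text PPM file. Levels 0-15 are exactly the range
# crushed when a full-range monitor receives a limited-range signal.

def write_black_level_wedge(path: str = "wedge.ppm",
                            steps: int = 16,
                            bar_w: int = 40,
                            h: int = 120) -> None:
    w = steps * bar_w
    rows = []
    for _ in range(h):
        row = []
        for level in range(steps):  # one bar per gray level 0..15
            row.extend(f"{level} {level} {level}" for _ in range(bar_w))
        rows.append(" ".join(row))
    with open(path, "w") as f:
        f.write(f"P3\n{w} {h}\n255\n" + "\n".join(rows) + "\n")

write_black_level_wedge()
```

Open the file in any image viewer at full screen: on a correctly configured full-range output you should be able to distinguish most of the bars; if the left half is a uniform block, the range setting is likely still wrong.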

I’ve seen this setting revert after driver updates and Windows updates, so make sure to check it periodically. It’s a quick problem to fix, but I find its very existence frustrating. Most users have their PCs hooked up to monitors, so having the software default to a TV-oriented setting seems foolish. Have you been affected by this bug?

Lead image by rawkkim via Unsplash.


21 Comments

John de la Bastide:

Thanks...Mine was set to limited!

Alex Coleman:

Happy to hear it helped! If you've calibrated your monitor, you may want to run the calibration again now.

Johnny Rico:

So mine is set to "use default color settings," and everything seems to profile fine in SpectraView. This is the first time I've ever heard of this, and I don't have time at the moment to see if there is a difference. Can you explain it further in that regard?

Alex Coleman:

Are you on Windows 10? If so, you should be OK. It seems that Windows handles it better than the driver defaults.

You can confirm this by switching from default to Nvidia color settings and checking that the output is set to full. Once it is set, if you don't notice a difference (it's most apparent in the dark areas of an image, which will look gray or blocky when the setting is wrong), you should be good.

Ed Sanford:

Mine was set correctly... I am using SpectraView software with my NEC 4K monitor. I believe that SpectraView is so good that it probably changed the setting to the best value.

Alex Coleman:

Hey Ed, if you're at the point that you're calibrating your monitor, I wouldn't expect this to be an issue. It may be that NEC's monitor passes the correct information to the card to prevent this; my higher-end monitor configures correctly, but some cheaper Dell displays have to be set every time.

Dave Nunez-Delgado:

Both my Asus ProArt editing display and my Asus ROG Swift gaming monitor were set correctly by default, thankfully.

William Faucher:

I noticed this way back in the day, when I got my first monitor that had an HDMI cable. I noticed all my photos looked... off? Not wrong, or terrible, just... slightly off. Blacks didn't look black, whites didn't look white, and it drove me up the wall. Then I found this hidden in the menus, and it made my day.

While not news to me, thanks for putting it out there for others to see!

Alex Coleman:

It's quite the problem, especially if you don't know where to fix it. Thanks for sharing your experience.

Bill Peppas:

I've never seen it default to the Limited Range (16-235) with computer monitors.
It is, however, very reasonable to default to 16-235 when you are using a TV or a projector with your PC, as these devices were built for those levels; the movie industry standard uses that range for its productions.

William Faucher:

Can confirm it defaults to limited range even with PC monitors, not TVs. Seen this happen multiple times with Dell Ultrasharps.

Bill Peppas:

I'd love to see that happen.
I'm using Dell UltraSharps and UltraSharp Premium myself, and never had it happen.
Also never seen it happen on any of the 200+ client computer builds I've put together in the past few years.

greg tennyson:

Yeah... I'll stick to Apple and endure the fanboy comments.

Holy cow! Mine was set to limited, and this has changed everything! I was disappointed in my monitor's dynamic range (I bought it before getting into photography) and just thought it was outdated. This has improved things so much, thank you!

Thank you Alex.
Does anyone know if that is a problem on Mac computers too?

Alex Coleman:

Very different situation with the drivers. Apple's graphics drivers are core to OS X, and I've not heard of an issue like this affecting them. An edge case may be Quadro cards.

John Verner:

In my NVIDIA Control Panel there is a setting for Color Depth, but nothing for Output Color Format, Output Color Depth, or Output Dynamic Range! WTF? I'm on Win10 with an NEC monitor (SpectraView). Thx.

🤔

I don't see the Change Resolution option for my laptops. Do I need to have an external monitor hooked up to HDMI before I see it?

Mine was set to limited! I did set it to Full, then I recalibrated!