Can You See the Difference Between 8K and 1080p Video?

Many are excited about shooting in 8K with Canon's new mirrorless camera, but how noticeable is it in real terms? Can you even tell the difference between footage shot in 8K and in 1080p?

First and foremost, before you watch this video, it's worth changing the video quality to 8K, but even then, there are a lot of factors to note. First, YouTube uses compression. Second, most devices don't have an 8K display. The problem is, that's an argument in and of itself: if most people can't view 8K natively, how will 1080p look when exported in 8K? The use of the video you're creating will typically dictate what's needed from your resolution. 8K will unquestionably capture more detail than 1080p and even 4K, but for the viewer to tell, it'll have to be under specific circumstances.

Video is more complicated than it might first appear. It wasn't until this year that I even realized many YouTubers export 1080p videos in 4K for the platform, and almost no one can tell. By extension, 8K will likely be even more overkill. That isn't to say that shooting in 8K doesn't have its perks. For example, being able to crop in on footage and still retain fantastic resolution is valuable. There are, of course, other technical benefits, but for almost every modern use of video, can you tell the difference between video filmed in 8K and exported in 8K, and video filmed in 1080p and exported in 8K? On most online video platforms like YouTube and Vimeo, the answer is almost certainly no, aside from some very trained eyes and careful pixel peeping. With the right screen for viewing 8K, it'll clearly win, but that setup just isn't very common yet.

What do you make of the difference? Can you spot it? Do you not care as the other perks of 8K are the selling point for you? Is it about futureproofing? Share your thoughts in the comments.


Robert K Baggs is a professional portrait and commercial photographer, educator, and consultant from England. Robert has a First-Class degree in Philosophy and a Master's by Research. In 2015 Robert's work on plagiarism in photography was published as part of several universities' photography degree syllabuses.


1080 and 8k, I can definitely see a difference. 4k and 8k, not at all, but I'm only on 4k screens, so it's not like I'm seeing 4k boosted to 8k; rather, I'm looking at 8k reduced to 4k. I think on a big 8k screen the differences will start to become more pronounced.

I do notice quite a difference between 1080 and 4k on my 4k screens. I often find myself watching a YouTube video and thinking, "wow, that looks really good," and then realizing it was uploaded in 4k. I usually don't consciously notice, "oh hey, that's 4k," but my brain does recognize that something about the video is making it look better than the 1080p that I am used to watching.

My question is what do clients want? My experience has been that most of my clients are happy with 1080.

Fully agree. Pretty much at the most 1080. I frequently get requests for lower resolution or am forced to scale and compress to meet file size restrictions of host platforms. Can you even get someone to host streaming 8k? In my experience, most commercial clients look for free or inexpensive hosting services.

Shame. This guy is such a Peter McKinnon wannabe.

He's literally best friends with Peter. They shared an office together for a good long while. They vlog together. He's not a wannabe, he's up there with him at his side. And nearing 1 million subscribers, I'd say he's doing rather well for himself.

They each do their own thing, and Matti doesn't work for PM. They shared the same office, but that's about it. By your logic, anyone who's friends with Pete is a wannabe.

As most netizens use handheld devices to view content, and the number of 8K-capable handheld devices can currently be counted on the fingers of one hand, a more realistic question would be:
Do you view YouTube on a 4k desktop monitor, and do you notice any differences in resolution of the content?
Every Google search I do comes up with a caveat that handheld devices are unlikely to sport 8K displays due to the power drain, even using OLED displays.
So the answer to your question is that 90%+ of the viewing audience can't possibly see any difference.

Honestly I can see the difference between Youtube 1080p and 4k even when viewing on a 1080p screen. In the case of Youtube and most streaming platforms, the difference is more in the reduced compression artifacts that come with their higher resolution streams and less in the raw resolution/detail. Their 1080p stream isn't anywhere near the visual quality it could be at that resolution.

With high quality codecs, you can actually see a difference between 4k and 4k downsampled to 1080p (when viewed on a 4k display). It's obvious for stationary high contrast things like text, very subtle for real-world stationary subjects, and even tougher for things in motion. Not really convinced 8k is useful (or visually distinguishable from 4k) as a delivery resolution, though resolutions larger than 4k are useful for capture.

Two things I consider when choosing resolution prior to pushing the red button:
1. Delivery spec. What does the contract say is the resolution specified for delivery? Nobody that I've worked with has asked for 4k or 8k yet.
2. Is this a green screen shoot? If you've ever done one, you already know the answer: shoot in the highest resolution available to you.
Can you tell the difference? Maybe; lots of variables can affect detail and color reproduction. The fact that the camera has 8k doesn't necessarily mean you're really getting that resolution. And if you don't have a monitor with that high a resolution to use when editing, why would you shoot in it?

I understand shooting at a higher res and downsizing afterward will result in sharper, cleaner images, like shooting 4k even if the delivery specs are 1080p. But 8k files are becoming such a ridiculous size, not to mention demanding insanely high I/O, that I personally wouldn't even want to work with them. Maybe I can tell a difference, but would it make the video better? Probably not.
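The size and I/O concern above can be put in rough numbers. Here is a back-of-envelope sketch, assuming uncompressed 10-bit 4:2:2 at 30 fps; these are purely illustrative ceiling figures, since real codecs compress these rates down enormously:

```python
# Rough uncompressed data-rate comparison for common resolutions.
# Assumes 10-bit 4:2:2 sampling (20 bits/pixel average) at 30 fps.
# Real codecs compress heavily, so these are ceiling figures only.

RESOLUTIONS = {
    "1080p": (1920, 1080),
    "4K UHD": (3840, 2160),
    "8K UHD": (7680, 4320),
}

BITS_PER_PIXEL = 20   # 10-bit 4:2:2: 10 (luma) + 2 * 5 (subsampled chroma)
FPS = 30

def raw_rate_mbps(width, height):
    """Uncompressed video bandwidth in megabits per second."""
    return width * height * BITS_PER_PIXEL * FPS / 1e6

for name, (w, h) in RESOLUTIONS.items():
    print(f"{name}: {w * h / 1e6:.1f} MP/frame, ~{raw_rate_mbps(w, h):,.0f} Mb/s raw")
```

Each doubling of linear resolution quadruples the data rate, so uncompressed 8K carries sixteen times the data of 1080p before any codec touches it.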

Most people don't even have 4k TVs, and most TV shows aren't in 4k either. 8k is nice, but that's quite a few years ahead of the curve.

Scaling is a good point. Still, there are lots of variables that will affect those results. How much of the frame is going to be the point of focus? How much detail do you want there? It depends. There's also the amount of computing power required. There are workarounds for that too: proxies. That opens another can of worms: even more storage space required, more time to transcode and distribute media, and educating clients about resolution differences between online and offline editing. Then there are equipment costs; I just looked on B&H for 8K monitors, and something professional is in the $10k range.
Maybe someday, but not anytime soon.

Bayer sensors often resolve substantially less than their advertised resolution. If you're delivering in 1080, you'll probably want to shoot on a sensor that's at least 3k or so if you really want an image that makes full use of that delivery resolution. So yeah, shooting 4k and downsampling to 1080p is likely to give a more detailed image than a 1080p sensor would, unless your camera doesn't have a Bayer filter.
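The oversampling argument can be illustrated with a toy box downsample. This is a minimal sketch, not a real scaler kernel: averaging non-overlapping 2x2 blocks halves the resolution while averaging away some per-pixel noise, which is part of why a 4k-to-1080p downscale looks cleaner than a native 1080p capture:

```python
# Toy 2x box downsample: average each 2x2 block of a grayscale frame.
# A stand-in for real scaler kernels; shows noise averaging in action.

import random

def downsample_2x(frame):
    """Average non-overlapping 2x2 blocks of a grayscale frame (list of rows)."""
    h, w = len(frame), len(frame[0])
    return [
        [(frame[y][x] + frame[y][x + 1] +
          frame[y + 1][x] + frame[y + 1][x + 1]) / 4
         for x in range(0, w, 2)]
        for y in range(0, h, 2)
    ]

random.seed(0)
# Flat gray frame with simulated sensor noise; averaging roughly halves
# the noise standard deviation while preserving the mean level.
noisy = [[128 + random.gauss(0, 8) for _ in range(8)] for _ in range(8)]
small = downsample_2x(noisy)
print(len(small), len(small[0]))  # 4 4
```

The mean brightness is preserved exactly, while per-pixel deviation shrinks, which is the "cleaner" look commenters describe.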

And I work in VFX. We absolutely shoot 4k and deliver 1080, because the results ARE better. A comparison in photoshop takes 2 seconds. That interpolation you speak of results in a crisper image, and reduces noise. I agree sharpness isn't necessarily what you want, but it's easier to soften an image that's too sharp, than the other way around.

This is the case for compositing, visual effects/CGI. But perhaps things are different if you're doing little to no post work.

Maybe it's just me, but it seems odd that the Fstoppers author takes credit for his post but doesn't mention or credit Matti Haapoja's name once in the four-paragraph story.

One is a great advance in TV picture quality and the other is a great advance in marketing scams.

Sure, 8k looks sharper with still pictures, but shooting at 24/25 or 30 fps is an imbalance between static and temporal resolution. Anything moving in the scene will have motion blur, and the 8k resolution will faithfully reproduce that blur, so anything moving goes soft. This is contrary to how we perceive the world: the effective resolution varies with the motion. That's why UHDTV uses frame rates up to 120 fps for 8K. 24 fps and 8k just don't go together; it's a waste of data and bandwidth.
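The static-versus-temporal trade-off can be made concrete with some hypothetical numbers: a subject crossing the full frame width in two seconds, shot with a 180-degree shutter (shutter time equal to half the frame interval), smears across this many pixels per frame:

```python
# Motion blur per frame, in pixels, for a subject crossing the full
# frame width in 2 seconds at a 180-degree shutter. Purely illustrative.

def blur_pixels(frame_width, fps, crossing_time_s=2.0, shutter_fraction=0.5):
    """Pixels of motion blur per frame for a subject crossing the frame."""
    speed_px_per_s = frame_width / crossing_time_s   # horizontal speed
    shutter_time = shutter_fraction / fps            # 180-degree shutter rule
    return speed_px_per_s * shutter_time

for name, width, fps in [("1080p @ 24", 1920, 24),
                         ("8K @ 24", 7680, 24),
                         ("8K @ 120", 7680, 120)]:
    print(f"{name}: ~{blur_pixels(width, fps):.0f} px of blur")
```

At 24 fps the blur scales up with the pixel count (the extra 8K resolution just renders the smear more finely), while raising the frame rate to 120 fps shrinks the blur below even the 1080p/24 figure, which is the commenter's point about matching high spatial resolution with high frame rates.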
8k looks great in the marketing collateral, but does it really create better looking ‘cinematic’ images?

Both videos and pictures should be uploaded in multiple resolutions (if you are self-hosting your work) anyway. Since all browsers and web tech can detect screen size, there is no reason to be serving anything larger than you have to, or you are just slowing down your site, which has negative SEO effects.
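One sketch of that idea on the server side (the rendition widths and function name are hypothetical; in practice you would more likely use HTML `srcset` or an adaptive-streaming manifest): pick the smallest pre-encoded rendition that still covers the client's reported viewport width.

```python
# Hypothetical server-side variant selection: given a client's viewport
# width, return the smallest pre-encoded rendition that covers it, so
# no one downloads more pixels than their screen can show.

RENDITIONS = [640, 1280, 1920, 3840]   # widths we pre-encoded, ascending

def pick_rendition(viewport_width):
    """Smallest rendition >= viewport width; largest if none covers it."""
    for w in RENDITIONS:
        if w >= viewport_width:
            return w
    return RENDITIONS[-1]

print(pick_rendition(1080))   # 1280
print(pick_rendition(2560))   # 3840
```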

On an FHD screen (1920×1080), I can't really see any difference if I'm not told what to look for.

Yes I can.
However, to me, 1080p means 2 MILLION pixels per frame.
No YouTube "1080p" video has that much data; it's so compressed it would be more appropriate to call it 360-blocks.
YouTube even changed its compression algorithm years ago to be much more aggressive.
I'm only surprised that Hollywood let that happen to their uploads, because most trailers look worse now than they did 10 years ago. 🤨

PS: No one needs an 8K selfie cam, that includes youtubers.

There are some youtubers who tell us how great they are, and how much better they are than we are. When in reality, they are not.

Can I notice a difference between 4K and 1080p - Yes!

Does it make impactful difference while viewing the video - Absolutely Not!

Content is King and 1080p is good enough. If the content grabs you by the collar and pulls you in, it really doesn't matter.

I'd put more focus on getting better audio than increasing video resolution.

I couldn't tell the difference. I was wondering when the comparisons would start, but realized much of the video was the comparison. lol