What's the Frame Rate of the Human Eye?

Have you ever wondered what the frame rate of the human eye is and, just as importantly, why it matters? It turns out the answer is far more complex than a single numerical value, and that has tremendous consequences for the design of cameras and the way we work. This neat and informative video takes you behind the scenes of the science of how the eye sees and discusses the implications for filmmakers.

Coming to you from Filmmaker IQ, this great video discusses the frame rate of the human eye. The short answer is that the human eye does not really have a frame rate, but for practical intents and purposes, it is about 10 fps. This is an important value to keep in mind, as videos are nothing more than a sequence of still pictures, and as such, those pictures need to change quickly enough to outpace the eye and thus appear as continuous movement. This is why the practical minimum is about 24 fps and why, below that frame rate, you will start to notice a bit of choppiness in the video. There's a lot more to the topic, though, so check out the video above for the full rundown!
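
To make that arithmetic concrete, here's a quick illustrative sketch (our own, not from the video) showing how long each frame lingers at common frame rates, compared against the rough 100 ms window implied by a 10 fps figure:

```python
# Rough illustration: how long each frame lasts at common frame rates,
# compared against the ~100 ms "integration window" implied by the
# eye's approximate 10 fps figure discussed in the video.

EYE_WINDOW_MS = 100.0  # roughly 1 / (10 fps), expressed in milliseconds

for fps in (10, 18, 24, 30, 60):
    frame_ms = 1000.0 / fps
    frames_per_window = EYE_WINDOW_MS / frame_ms
    print(f"{fps:>3} fps -> {frame_ms:5.1f} ms per frame "
          f"({frames_per_window:.1f} frames in a ~100 ms window)")
```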


Alex Cooke is a Cleveland-based portrait, events, and landscape photographer. He holds an M.S. in Applied Mathematics and a doctorate in Music Composition. He is also an avid equestrian.

21 Comments

This magic number turns out to be important for still photography, as well!

If you want something to look about as blurry as it would look "to the eye," you should set your shutter speed to about 1/10th of a second — the inverse of ten frames per second.

This also turns out to be pretty boring, as that's the way things look to us most of the time. A waterfall will have blurry blobs of water, rather than either the "frozen" look of a much higher speed, or the "silky" look of a much slower speed.

But sometimes, it's what you need in order to depict something properly for a specific purpose.

10 frames per second is exactly the same as a shutter speed of 1/10 of a second; one frame is 1/10 of a second when you're at 10 frames per second. The inverse would be a 10-second shutter, which would probably be enough to get that nice silky look of the classic long-exposure water photograph.
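
To keep the units straight, here is the conversion spelled out in a quick Python sketch (purely illustrative):

```python
# Keeping the units straight: frames per second vs. seconds per frame.

fps = 10.0                     # frames per second
seconds_per_frame = 1.0 / fps  # the reciprocal: seconds PER frame

print(f"{fps:.0f} fps -> each frame lasts {seconds_per_frame} s (i.e., 1/10 s).")

# Going the other way, a 10-second shutter would correspond to 0.1 fps:
print(f"A 10 s shutter -> {1.0 / 10.0} fps.")
```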

Your reference to blur as it looks "to the eye" is remarkably meaningless. That's why I'm remarking on it. It's bizarre to me that you took the time to leave a comment that doesn't actually reference reality.

I guess I could be wrong in my assumption that your eyes work like mine do. To my eyes, at least, the world is a pretty crisp place. Things get blurry if I put them too close to my nose. Otherwise, not so much.

It's true, though, that sometimes, a shutter speed of 1/10 of a second is correct. It's nice of you to validate the inclusion of the 1/10-second setting on cameras worldwide.

Your mind is tricking you into thinking everything is crisp.
But if you were to actually pay attention, you would be amazed at how blurry your life really is.

I started shooting 8 mm movies when I was about 13 years old, in 1966. One of the most disappointing things to me was the frame rate of 16 frames per second, which later became 18 frames per second. I just skipped all that and went right to 24 frames per second. Problem solved.

That is not correct.
First of all, the author has not provided any evidence for the theory. Neither would I...
Secondly, the human eye is only a small part of a complex system, and even the eye has more than just a single value for "frame rate".
The 👀 is the result of a long evolution. We were prey for predators, so we had to avoid them, and our eyes can see movement at the edges of our sight. I have heard that this peripheral vision can detect images as fast as 1/400 of a second, which would lead to a rate of "400 frames per second" being noticed (not seen). But we were also hunters, so straight ahead of us we gained the ability to see a high-resolution picture at a lower rate. The lens is not perfect, so the picture is not perfect either. Internally, the eye has a nerve (like a cable) that impacts vision, which is why we have a blind spot. The brain also learned how to remove things like our nose and correct some image artifacts; this is done during post-processing, and the more processing, the lower the "rate". There is no single value for it, as we are still evolving: in the past, this value was assumed to be 18 frames per second, but for many people it might be higher. Lighting conditions also affect how we see.
Overall, the human eye has no "frame rate", but it can register pictures with variable resolution and at speeds as quick as 2.5 ms (though usually much slower).
In a similar way, you might ask how thick a book page is. The cover is thicker than a page, and every book can have a different page thickness.

I agree with everything but your first sentence!

Particularly, I certainly agree that human vision is a tremendously complex thing that has a whole bunch of "yeabuts" that we could get wrapped up in forever. The eye has very little in common with a camera sensor!

And yet, it's useful to be able to abstract things in a way that is easily used. We can say, "A car goes about 88 feet per second," but then someone will chip in, "Yeabut, what if that car is on a neutron star that is orbiting another neutron star at 99% of the speed of light?"

The fovea centralis corresponds to our "cone of inspection" field-of-view, and consists primarily of cones, which are colour-sensitive, slow-response light receptors. This is the angle-of-view that corresponds to where our visual acuity is greatest, with which we read, inspect small objects, and look at movies. So I think it is perfectly fair to approximate an abstract "ten frames a second" value for the purpose of creating and controlling photographic and motion picture effects.

It may certainly be true that other areas of the eye that are rod-intensive may have a much faster response time, but we typically don't take photos out of the corners of our eye, do we? Nor do we look at movies out of the corners of our eyes!

Yeah, I don't have a clue where the notion that the human eye has a frame rate of 10 frames per second came from.

If I had a frame rate of 10 frames per second in my eyes, I don't think I'd be able to tell almost immediately when I'm seeing high frame rate video. It takes less than 1 second to recognize 60 frames per second versus the normal 24 or 30.

Also, we should consider the fact that computer monitors, especially gaming monitors, have refresh rates of 144 Hz or more. Again, that would be utterly pointless if we couldn't even tell the difference beyond 10, or 24, frames per second.

Take a photo of a computer monitor at a 1/10th-of-a-second shutter speed, and compare different refresh rates.

A 10 fps camera can distinguish between refresh rates. Being able to distinguish them in no way tells you the frame rate of the eye.

How do people continually make the error of thinking that being able to detect an image at 1/400th of a second implies a frame rate of 400 fps?

This is a photography site, right? Do photographers think setting your strobe light to 1/400th of a second means you need to record it with a 400 fps camera in order to see it on screen?

The eye doesn't have a frame rate. It captures a light stream at up to 790 THz, so in a way, it sees at 790 trillion fps; that's one way to look at the eye's "frame rate". The eye doesn't scan the retina the way cameras scan a CMOS sensor; it constantly streams raw data to your brain. Your brain doesn't have a frame rate either. It constantly tries to use the least amount of information required to understand its surroundings, but it doesn't cut that into images per second; it analyzes motion, color changes, and so on, and prioritizes events that trigger something interesting. We would have to know how fast your neurons fire the data from your eyes to your brain. The Pennsylvania School of Medicine estimates that the eyes send 10 million bits of data per second to the brain, but that doesn't translate to a frame rate either. The US Army has conducted tests that are in some way relatable to frame rate: they flashed images for durations from 1/100th of a second down to mere thousandths of a second, and some of the tested people were able to recognize the shown objects even from a 1/1,200th-of-a-second flash. Take this information how you like; it's just not possible to think of the human capability to see as a frame rate.

"it's just not possible to think human capability to see as framerate."

I sure am glad Philo Farnsworth didn't "see it" that way!

I agree that the eye-brain combination is not at all like a raster-scan. Yet, there certainly exists an "effective" or "equivalent" frame-rate for human vision.

Otherwise, things like movies, television, and computer monitors simply wouldn't work properly.

True.
If you look at a person standing still and they wave their hand, it will be a blur if you actually pay attention.
But as noted, the brain does amazing interpretation, and you are simply not conscious of the blur.
You can also avoid being seen by someone walking right by you if you hold still in a distorted, non-human shape, even in plain sight.
Training and paying attention are critical aspects of what we "see".

The key detail is that there is no clock, no sync signal like you have in all the technologies for which "frame rate" is a meaningful term.

The photoreceptors in the eyes have a fire/reset/fire cycle time on the order of 5 to 20 cycles per second, with 100 ms being a pretty reasonable representation. And indeed, neurons themselves have a cycle time of around 100 ms, giving your entire brain a "10 fps" frame rate; so your brain is, in theory, a sort of 10 Hz analog computer. Except... except...

These pulses aren't synchronized. After a photoreceptor fires, the one right next door could fire 1 ms later and represent a slightly changed scene. And your brain is wired to make sense of this; different colors fire at slightly different rates or thresholds, and color and brightness information arrives at different times. In fact, your peripheral vision is delivered a full 20th of a second sooner than your focused area.

Altogether, this gives us the flicker-fusion rate we know and love, but it also gives us the ability to discern much higher frame rates by using the timing offsets between different nerves to see that the scene has changed "in between frames" of the primary vision cells we're using to decode the scene.

And the more of your focus, the more of your vision, you dedicate to the task, the more you can discern. That's why, if you try hard, you can tell the difference between frame rates that wouldn't be discernible at a casual viewing.
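
Here's a toy Python simulation of that unsynchronized-sampling idea (my own illustrative sketch, not a physiological model): a single 10 Hz sampler usually misses a brief 5 ms event, but a patch of samplers with independent phase offsets almost never does.

```python
import random

# Toy model, not physiology: each "receptor" samples instantaneously at
# ~10 Hz, but with a random phase offset. A brief 5 ms event is usually
# missed by any single receptor, yet an unsynchronized population of
# them almost always catches it.

random.seed(42)

PERIOD = 0.100       # 100 ms between samples (~10 Hz per receptor)
EVENT_START = 0.500  # the event begins at t = 500 ms
EVENT_LEN = 0.005    # and lasts 5 ms

def sees_event(phase):
    """True if any sampling instant (phase + k * PERIOD) lands inside the event."""
    t = phase
    while t < EVENT_START + EVENT_LEN:
        if t >= EVENT_START:
            return True
        t += PERIOD
    return False

# One receptor alone:
trials = 10_000
hits = sum(sees_event(random.uniform(0, PERIOD)) for _ in range(trials))
print(f"Single 10 Hz receptor: catches the 5 ms event {100 * hits / trials:.0f}% of the time")

# A patch of 100 receptors with independent phases:
patch_trials = 200
patch_hits = sum(
    any(sees_event(random.uniform(0, PERIOD)) for _ in range(100))
    for _ in range(patch_trials))
print(f"Patch of 100 receptors: catches it {100 * patch_hits / patch_trials:.0f}% of the time")
```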

10 fps is the minimum frame rate that our visual processing can smooth out through extrapolation, though at great expense; it is not the maximum frame rate of the eye. I think the best frame rate, still not the maximum, would be the one that provides the most comfort. An IBM study back around 2000 zeroed in on 300 Hz as the point at which they no longer saw comfort increase as frame rate went up. The maximum is arguably set by the fastest perceivable motion, and it is even higher.

The human eye is NOT limited to 10 fps. I can tell 120 Hz from 60 Hz TVs, let alone 24 fps from 60.
This is utter nonsense.

It was limited before because of the Hz limitation of the monitors. People should just stop with the Hz talk. There are 240 Hz monitors that look way smoother than even 120 Hz.

It amazes me that people do not understand that human eyes obviously have a frame rate. Go outside, fan out your fingers in front of you, and wave them back and forth. Notice how they seem to flicker? That's because your eyes are capturing them in frames.

Also, the 1/400 thing is stupid because if the frame exists in time and your eye can detect the light from the source, then even if your eyes DID operate at 10 fps, 1 of those frames should capture it.
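
To put rough numbers on that point, here's a sketch that treats a hypothetical 10 fps eye as a simple light integrator over each frame (all units arbitrary):

```python
# Sketch: treat a hypothetical "10 fps eye" as integrating light over
# each 100 ms frame. A 1/400 s strobe dumps all of its light into
# whichever frame it lands in, so it still registers, even though the
# frame is 40x longer than the flash. All units here are arbitrary.

FRAME_S = 0.100            # one 100 ms "frame"
FLASH_S = 1.0 / 400.0      # a 2.5 ms flash
FLASH_BRIGHTNESS = 1000.0  # arbitrary units of intensity
THRESHOLD = 1.0            # arbitrary detection threshold (integrated light)

collected = FLASH_BRIGHTNESS * FLASH_S  # light gathered within that frame
status = "detected" if collected > THRESHOLD else "missed"
print(f"Flash of {FLASH_S * 1000:.1f} ms inside a {FRAME_S * 1000:.0f} ms frame: "
      f"{collected:.1f} units collected -> {status}")
```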

Now I know it's faster than about ten frames per second. I would say the easiest way to find out is to take a fan and paint each blade a different color. Attach it to a variable-speed controller and speed it up little by little until it appears the blades are standing still. The number of revolutions per second will give you your basis for calculating fps.

To track the fan blades correctly, you will have to do the test outside. Any artificial light source will taint the test, since its own flicker acts as a strobe.
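
For anyone curious, the math behind that fan test is classic aliasing, which is easy to tabulate. A quick Python sketch, assuming an idealized fixed sample rate (which, to be fair, much of this thread disputes):

```python
# Wagon-wheel / aliasing sketch: if vision sampled at a fixed rate, the
# fan would appear frozen whenever its repeat frequency is an integer
# multiple of that rate. With every blade painted a different color (as
# proposed above), the repeat frequency is the rotation rate itself;
# with identical blades it would be rotation rate x blade count.

SAMPLE_HZ = 10.0  # the hypothetical "frame rate" under test

for rev_per_s in (2.0, 4.0, 8.0, 10.0, 12.0, 20.0):
    # Fold the rotation frequency into [-SAMPLE_HZ/2, +SAMPLE_HZ/2):
    aliased = (rev_per_s + SAMPLE_HZ / 2) % SAMPLE_HZ - SAMPLE_HZ / 2
    if abs(aliased) < 1e-9:
        motion = "appears frozen (the 'standing wave')"
    else:
        motion = f"appears to spin at {aliased:+.1f} rev/s"
    print(f"{rev_per_s:4.1f} rev/s -> {motion}")
```

At a 10 Hz sample rate, the blades first appear frozen at 10 revolutions per second, so the speed at which the freeze happens pins down the hypothetical rate.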

That's essentially what Schouten did in the experiment discussed in the video. It was conducted in the '60s with continuous light. The rotation rate for the first standing wave was around 8-12 Hz.

Pretty close to 10fps.

Still, though, the eye has no frame rate, because we can all agree we don't experience life at 10 fps, even if that number does keep popping up.

Eyes are analog, not digital. They work based on a continuous range of signals that the brain interprets, not packets of data.
Even if you wanted to compare it to a frame rate: so many things change how that data is interpreted in your brain. How alert you are, how focused you are, where in your field of vision the action is occurring, even how much ambient light there is.

The eye doesn't have a framerate. Saying so is about as accurate as measuring the detail of a painting in pixels.

yo who wrote this 😂 my eyes don’t lag… and how am i able to see the difference between 10 fps and 60 fps then?? this is just nonsense