How many frames per second can the human eye really see?
30 fps? 60 fps? If you've ever debated framerates, the cognitive researchers we spoke to have some complex answers for you.
I spend far too many of my first tender minutes in a new game with a framerate counter running in the corner of my screen. I play, hyper-sensitive to the smallest hitches, dipping in and out of the graphics settings to optimise, and worry, and optimise and worry again.
I swear I don’t have that counter going all the time. That would be unhealthy, right? But framerate is important to us. It’s the core measurement by which we rate both our rigs and a game’s technical chops. And why not? A framerate counter doesn’t lie. It reports a straight, simple number. In an uncertain world it’s something we can stand by.
But can you see high framerates? So starts an argument as old as PC games, a constant and confused war in which pride clashes against shaky science. But internet rage aside, it’s an interesting question, especially since it engages with the primary way we experience computer games. What is the maximum framerate the human eye can see? How perceptible is the difference between 30 Hz and 60 Hz? Between 60 Hz and 144 Hz? After what point is it pointless to display a game any faster?
The answer is complex and rather untidy. You might not agree with parts of it; some of it may even make you angry. Eye and visual cognition experts, even those who play games themselves, may well have a very different perspective from yours on what matters about the flowing imagery computers and monitors display. But human sight and perception are strange, complicated things, and they don’t quite work the way they feel like they do.
Aspects of vision
The first thing to understand is that we perceive different aspects of vision differently. Detecting motion is not the same as detecting light. Different parts of the eye also perform differently: the centre of your vision is good at different things than the periphery. And there are natural, physical limits to what we can perceive. It takes time for the light that passes through your cornea to become information on which your brain can act, and our brains can only process that information at a certain speed.
Yet another important concept: the whole of what we perceive is greater than what any one element of our visual system can achieve. This point is fundamental to understanding our perception of vision.
“You can’t predict the behaviour of the whole system based on one cell, or one neuron,” Jordan DeLong tells me. DeLong is assistant professor of psychology at St Joseph’s College in Rensselaer, and the majority of his research is on visual systems. “We can actually perceive things, like the width of a line or two lines aligning, smaller than what an individual neuron can do, and that’s because we’re averaging over thousands and thousands of neurons. Your brain’s actually way more accurate than one individual part of it.”
And finally, we’re special. Computer game players have some of the best eyes around. “If you’re working with gamers, you’re working with a really weird population of people who are probably operating close to maximal levels,” says DeLong. That’s because visual perception can be trained, and action games are particularly good at training vision.
“[Games are] unique, one of the only ways to massively increase almost all aspects of your vision, so contrast sensitivity, attention abilities and multiple object tracking,” Adrien Chopin, a post-doc researcher in cognitive sciences, tells me. So good, in fact, that games are being used in visual therapies.
So before you get mad about researchers talking about what framerates you can and can't perceive, pat yourself on the back: if you play action-heavy games, you're likely more perceptive of framerates than the average person.
Perceiving motion
Now let’s get to some numbers. The first thing to think about is flicker frequency. Most people perceive a flickering light source as steady illumination at a rate of 50 to 60 times a second, or hertz. Some people can detect a slight flicker in a 60 Hz fluorescent lightbulb, and most people will see flickery smears across their vision if they make a rapid eye movement when looking at the modulated LED tail lights found in many modern cars.
But this only offers part of the puzzle when it comes to perceiving flowing smooth game footage. And if you’ve heard about studies on fighter pilots in which they’ve demonstrated an ability to perceive an image flashed on the screen for 1/250th of a second, that’s also not quite what perception of smooth, flowing computer game imagery is about. That’s because games output moving images, and therefore invoke different visual systems to the ones that simply process light.
As an example, there’s this thing called Bloch’s law. “Basically, it’s one of the few laws in perception,” Professor Thomas Busey, associate department chair at Indiana University’s Department of Psychological and Brain Sciences, tells me. It says that there’s a trade-off between intensity and duration in a flash of light lasting less than 100 ms. You can have a nanosecond of incredibly bright light and it will appear the same as a tenth of a second of dim light. “In general, people can’t distinguish between short, bright and long, dim stimuli within a tenth of a second duration,” he says. It’s a little like the relationship between shutter speed and aperture in a camera: by letting lots of light in with a wide aperture and a short shutter speed, your photograph will be just as well exposed as one taken by letting a small amount of light in through a narrow aperture with a long shutter speed.
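To make that trade-off concrete, here’s a minimal sketch assuming a simple intensity-times-duration model and a roughly 100 ms integration window; the function and the numbers are illustrative, not measurements from Busey’s research.

```python
# A minimal, illustrative sketch of Bloch's law: below a critical duration
# (roughly 100 ms), the visual system responds to the product of intensity
# and duration, so a short bright flash and a longer dim one can look the same.

CRITICAL_DURATION_MS = 100.0  # approximate integration window (assumption)

def integrated_response(intensity: float, duration_ms: float) -> float:
    """Intensity x duration product, integrated only up to the critical window."""
    return intensity * min(duration_ms, CRITICAL_DURATION_MS)

# A very short, very bright flash versus a longer, dimmer one:
short_bright = integrated_response(intensity=100.0, duration_ms=1.0)   # 100.0
long_dim = integrated_response(intensity=1.0, duration_ms=100.0)       # 100.0

# Equal products, so under Bloch's law the two flashes should appear the same.
print(short_bright == long_dim)  # True
```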
But while we have trouble distinguishing the intensity of flashes of light shorter than about 100 ms, we can perceive incredibly quick motion artefacts. “They have to be very specific and special, but you could see an artefact at 500 fps if you wanted to,” DeLong tells me.
The specificity relates to the way that we perceive different types of motion. If you’re sitting still and watching things in front of you moving about, it’s a very different signal to the view you get when you’re walking along. “They centre on different places,” DeLong says. “The middle part of your vision, the foveal region, which is the most detailed, is actually pretty much garbage when it comes to detecting motion, so if you’re watching things in the middle of the screen moving, it’s not that big a deal what the refresh rate is; you can’t possibly see it with that part of your eye.”
But out in the periphery of our vision we detect motion incredibly well. With a screen filling their peripheral vision and updating at 60 Hz or more, many people will report a strong feeling that they’re physically moving. That’s partly why VR headsets, which extend into your peripheral vision, update so fast (90 Hz).
It’s also worth considering some of the things that we’re doing when we’re playing, say, a first person shooter. We’re continuously controlling the relationship between our mouse movement and the view in a perceptual-motor feedback loop, we’re navigating and moving through 3D space, and we’re also searching for and tracking enemies. We’re therefore continuously updating our understanding of the game’s world with visual information. Busey says that the benefits of smooth, quickly refreshing imagery come in our perception of large-scale motion rather than fine detail.
But how fast can we perceive motion? After everything you've read above, you can probably guess that there are no precise answers. But there are some definitive answers, like this: you can most definitely perceive the difference between 30 Hz and 60 Hz.
What framerates can we really see?
“Certainly 60 Hz is better than 30 Hz, demonstrably better,” Busey says. So that’s one internet claim quashed. And since we can perceive motion at a higher rate than we can a 60 Hz flickering light source, the level should be higher than that, but he won’t stand by a number. “Whether that plateaus at 120 Hz or whether you get an additional boost up to 180 Hz, I just don’t know.”
“I think typically, once you get up above 200 fps it just looks like regular, real-life motion,” DeLong says. But in more everyday terms he feels that the point at which people stop being able to detect changes in a screen’s smoothness lies at around 90 Hz. “Sure, aficionados might be able to tell teeny tiny differences, but for the rest of us it’s like red wine is red wine.”
Chopin looks at the subject very differently. “It’s clear from the literature that you cannot see anything more than 20 Hz,” he tells me. And while I admit I initially snorted into my coffee, his argument soon began to make a lot more sense.
He explains to me that when we’re searching for and categorising elements as targets in a first person shooter, we’re tracking multiple targets and detecting the motion of small objects. “For example, if you take the motion detection of [a] small object, what is the optimal temporal frequency of an object that you can detect?”
And studies have found that the answer is between 7 and 13 Hz. After that, our sensitivity to movement drops significantly. “When you want to do visual search, or multiple visual tracking or just interpret motion direction, your brain will take only 13 images out of a second of continuous flow, so you will average the other images that are in between into one image.”
This sampling, identified by researcher Rufin vanRullen in 2010, literally happens in our brains: you can see a steady 13 Hz pulse of activity on an EEG. It’s further supported by the fact that we can experience the ‘wagon wheel effect’ you normally get when you film a spinning spoked object: played back, the footage can appear to show the object rotating in the opposite direction. “The brain does the same thing,” says Chopin. “You can see this without a camera. Given all the studies, we’re seeing no difference between 20 Hz and above. Let’s go to 24 Hz, which is the movie industry standard. But I don’t see any point going above that.”
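As a rough illustration of that aliasing, here’s a small sketch of my own, not taken from vanRullen’s study: it assumes an evenly spoked wheel and a fixed sampling rate, and shows how the apparent motion between samples can run backwards.

```python
# A toy sketch of the 'wagon wheel effect': sample a spinning, evenly spoked
# wheel at a fixed rate and the apparent motion between samples can reverse.
# The 13 Hz rate and the wheel numbers below are illustrative assumptions.

def apparent_step_degrees(spin_hz: float, sample_hz: float, spokes: int) -> float:
    """Degrees the wheel *appears* to turn between samples.

    Because every spoke looks identical, motion is only defined modulo the
    spoke spacing; anything past half that spacing reads as reversed rotation.
    """
    spacing = 360.0 / spokes
    true_step = 360.0 * spin_hz / sample_hz   # real rotation between samples
    apparent = true_step % spacing
    if apparent > spacing / 2:
        apparent -= spacing                   # aliases to backwards motion
    return apparent

# A four-spoke wheel spinning at 2.5 turns per second, sampled at 13 Hz:
print(apparent_step_degrees(2.5, 13.0, 4))   # about -20.8: it appears to spin backwards
```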
This article is about what framerates the human eye can perceive. The elephant in the room: how fast can we react to what we see? It's an important distinction between games and film worthy of another whole article.
So why can games feel distinctly different at 30 and 60 fps? There's more going on than framerate. Input lag is the time that elapses between you entering a command, the game interpreting that command and sending the resulting frame to the monitor, and the monitor processing and displaying the image. Too much input lag will make any game feel sluggish, regardless of the display's refresh rate.
But a game programmed to run at 60 fps can potentially display the results of your inputs more quickly, because each frame is a narrower slice of time (16.7 ms, against 33.3 ms at 30 fps). Human response time definitely isn't that fast, but our ability to learn and predict can make our responses seem much faster.
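For the curious, here’s that arithmetic as a tiny sketch; the 144 fps row is an extra, illustrative data point, and every other contributor to input lag is deliberately ignored.

```python
# The frame-time arithmetic spelled out: at a given framerate, an input can
# wait up to one full frame before the game even begins to show its result.
# Game logic, driver and display processing are ignored here.

def frame_time_ms(fps: float) -> float:
    """Duration of one frame in milliseconds."""
    return 1000.0 / fps

for fps in (30, 60, 144):
    ft = frame_time_ms(fps)
    print(f"{fps} fps: {ft:.1f} ms per frame, "
          f"so an input can wait up to {ft:.1f} ms for the next frame")
```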
The important thing here is that Chopin is talking about the brain acquiring visual information which it can process and on which it can act. He’s not saying that we can’t notice a difference between 20 Hz and 60 Hz footage. “Just because you can see the difference, it doesn’t mean you can be better in the game,” he says. “After 24 Hz you won’t get better, but you may have some phenomenological experience that is different.” There’s a difference, therefore, between effectiveness and experience.
And while Busey and DeLong acknowledged the aesthetic appeal of a smooth framerate, none of the three researchers felt that framerate is quite the be-all and end-all of gaming technology that we perhaps think it is. For Chopin, resolution is far more important. “We are very limited in interpreting difference in time, but we have almost no limits in interpreting difference in space,” he says.
For DeLong, resolution is also important, but only to the small, central region of the eye that cares about it, which comprises only a couple of degrees of your field of view. “Some of the most compelling stuff I’ve seen has been with eye-tracking. Why don’t we do full resolution only for the areas of the eye where we actually need it?” But his real focus is on contrast ratios. “When we see really true blacks and bright whites it’s really compelling,” he says.
What we really know
After all of that, what do we really know? That the brain is complicated, and that there's truly no universal answer that applies to everyone.
- Some people can perceive the flicker in a 50 or 60 Hz light source. Higher refresh rates reduce perceptible flicker.
- We detect motion better at the periphery of our vision.
- The way we perceive the flash of an image is different than how we perceive constant motion.
- Gamers are more likely to have some of the most sensitive, trained eyes when it comes to perceiving changes in imagery.
- Just because we can perceive the difference between framerates doesn't necessarily mean that perception impacts our reaction time.
So it’s not a tidy subject, and on top of all of this, we also have to consider whether our monitors are actually capable of outputting images at these high framerates. Many don’t go above 60 Hz, and Busey questions whether monitors advertised at 120 Hz really display that fast (according to some seriously in-depth testing at TFTCentral, they certainly do). And as someone who has also enjoyed games at the 30 frames per second (and often rather less) rendered by my consoles, I can relate to their suggestion that other aspects of visual displays might connect better with my visual perception.
On the other hand, I would love to hear from pro teams about their objective experiences with framerate and how it affects player performance. Perhaps they’ll corroborate or contradict science’s current thinking in this field. If gamers are so special when it comes to vision, perhaps we should be the ones to spearhead a new understanding of it.