We perceive the world at a certain rate, which some people describe in frames per second (FPS). Experts have long said that our eyes can only see up to about 60 frames per second, which raises the question: why do video game developers increasingly brag about high definition and high FPS?
Similar to the idea of FPS, many monitors and screens advertise high refresh rates, measured in a unit called hertz (Hz). Like FPS, the refresh rate is how many times the screen changes images per second; 1 FPS is roughly equivalent to 1 Hz. The higher a screen's refresh rate, the steadier the stream of images our eyes receive, and the less 'flickering' we see. Old films help illustrate this idea. They were typically shot at only 16 FPS but played back at up to 24 FPS. This is why old films sometimes seem 'sped up': they simply are. Today, big-screen movies are still shot at around 24 FPS, which gives them a look that is distinct from, though rarely consciously recognized in comparison with, the higher frame rates of sports broadcasts and soap operas.
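To make that speed-up concrete, here's a quick back-of-the-envelope sketch in Python (the 16 FPS and 24 FPS figures come from the paragraph above; the 60-second scene length is just an illustrative assumption):

```python
# Old film: shot at 16 FPS, projected at 24 FPS (figures from the text).
capture_fps = 16
playback_fps = 24

# Playing frames back faster than they were captured speeds up the action.
speedup = playback_fps / capture_fps
print(f"Playback speed-up: {speedup:.2f}x")  # 1.50x

# A 60-second scene shot at 16 FPS yields 960 frames; projected at
# 24 FPS, those same frames last only 40 seconds on screen.
scene_seconds = 60
frames = capture_fps * scene_seconds
print(f"{frames} frames play back in {frames / playback_fps:.0f} s")
```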
Looking at clips of old films, we see a 'flicker' consistently throughout the movie. This comes from the very low FPS and refresh rates of old films. Today, however, refresh rates and FPS are so high that the flicker is virtually unnoticeable. That still poses the question: is there a cap on the refresh rate and FPS perceivable by the human eye?
Our eyes work fast, and experts now think they work considerably faster than previously predicted. First, we need to consider how quickly the eye can process an image. Back when experts said our eyes could only see at about 30-60 FPS, it was believed that an image had to be visible for a minimum of 100 milliseconds (0.1 seconds) for our brains to perceive it. However, a 2014 MIT study found that our brains need only 13 milliseconds (0.013 seconds) to perceive an image. That number translates to roughly 75 FPS. But this still doesn't explain why games and monitors increasingly run at higher FPS and refresh rates.
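The conversion behind that 75 FPS figure is simple to reproduce. A minimal sketch, assuming (as the paragraph above does) that one perceivable image per minimum perception window sets the effective FPS ceiling:

```python
# If the brain can register an image in 13 ms, one image per 13 ms
# window works out to about 1 / 0.013 ≈ 77 images per second,
# which rounds to the "roughly 75 FPS" cited above.
min_perception_time_s = 0.013  # 13 ms, per the 2014 MIT study
approx_fps = 1 / min_perception_time_s
print(f"≈ {approx_fps:.0f} FPS")  # ≈ 77 FPS
```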
Though we aren't completely sure, we can offer a simple explanation as to why a 240 Hz monitor may seem smoother than a 144 Hz monitor. Even if our eyes can only perceive individual images at, say, 75 FPS, the goal of films and games is to give an illusion of motion. Turning a series of still images into what we perceive as movement is a lot more complicated than simply stringing the stills together. Game and film makers must capture not only each image of the movement but also the moments in between, creating a 'blur' (see the sketch below). A 2010 study by Rufin VanRullen found that the minimum refresh rate for us to detect motion is 13 Hz, but we still don't have a concrete answer on what the maximum is. What we do know is that most people cannot tell the difference between 144 Hz and 240 Hz. In fact, researchers believe that a steep drop-off in our perception of higher frame rates begins as low as 90 Hz. At around 200 Hz, though, the 'images' typically appear simply as real-life motion.
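One crude way to picture those in-between moments: synthesize an intermediate frame by blending its two neighbors. This is only a minimal sketch; real games and films use far more sophisticated motion-blur and interpolation techniques, and the tiny three-pixel 'frames' here are purely illustrative:

```python
import numpy as np

def blend_frames(frame_a: np.ndarray, frame_b: np.ndarray, t: float) -> np.ndarray:
    """Linearly mix two frames; t=0 gives frame_a, t=1 gives frame_b."""
    return ((1 - t) * frame_a + t * frame_b).astype(frame_a.dtype)

# Two tiny grayscale "frames": a bright pixel moving one step to the right.
frame_a = np.array([[255, 0, 0]], dtype=np.uint8)
frame_b = np.array([[0, 255, 0]], dtype=np.uint8)

# The halfway frame smears the pixel across both positions: a crude 'blur'.
print(blend_frames(frame_a, frame_b, 0.5))  # [[127 127   0]]
```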
So ultimately, what FPS can we see at?
The simple answer is: we don't know yet. Even though it's thought that our eyes can only see up to 75 FPS, there does seem to be some perceptible difference at higher FPS and refresh rates. Some people, like filmmakers and professional gamers, have trained their eyes to notice the 'flicker' of lower refresh rates. Ultimately, we may yet debunk the myth that there is simply no difference in monitors over 60 Hz and games over 60 FPS.
https://www.pcgamer.com/how-many-frames-per-second-can-the-human-eye-really-see/
https://medium.com/cogly/how-many-frames-per-second-can-the-human-eye-really-see-bd100b410a04