Welcome to ‘Kagoo Explains’ - a series of short articles explaining some of the confusing terminology used to describe technology. This week we’re looking at refresh rates on televisions and monitors, and explaining how they influence the ‘smoothness’ of your video game.
At its most basic, refresh rate is a measure of how often a display (i.e. a television or computer screen) updates what is shown on the screen each second. This is measured in hertz (Hz) - so if a monitor has a refresh rate of 60Hz, it refreshes the image 60 times every second. That means it draws twice as many images on the screen as a 30Hz monitor, but only half as many as a 120Hz monitor.
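That arithmetic can be sketched in a couple of lines of Python (the refresh rates here are just the article's example figures):

```python
# A display redraws the screen refresh_rate times per second (Hz).
for refresh_rate in (30, 60, 120):
    print(f"A {refresh_rate}Hz screen draws {refresh_rate} images per second")

# Comparing a 60Hz screen to the others:
print(60 / 30)   # twice as many images as 30Hz -> 2.0
print(60 / 120)  # half as many images as 120Hz -> 0.5
```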
Why is this important? Televisions and screens don’t actually show moving video: they display a very, very fast succession of still images, changing so quickly that the human eye can’t perceive the individual changes - so it appears as a single unbroken moving image. This illusion goes all the way back to the zoetrope, first created in the 1830s.
The crucial point is that the lower the refresh rate, the easier the eye finds it to perceive the changes between images. As the refresh rate drops, the video starts to look more jerky and less lifelike. On old or low-quality monitors this can also introduce ‘flickering’ - a visual effect that appears when the refresh rate falls so low that the human eye begins to notice the individual refreshes.
In most cases, therefore, a higher refresh rate is desirable. This is especially true when playing video games - a low refresh rate makes the action on screen seem ‘choppy’, which is distracting and makes the game harder to control. Conversely, a high refresh rate makes games look smoother and more lifelike, and can improve your reaction time as well.
This YouTube video is an excellent demonstration of the difference between refresh rates when gaming. It really makes clear how much jerkier the motion is at lower refresh rates, and therefore why video games are far more satisfying at high refresh rates.
If you’re using a television purely for film or TV, the refresh rate is slightly less important (since you don’t have to make any split-second reactions), but you will still see a big improvement during very fast-moving action scenes - such as the ‘god what is even happening’ fight scenes in any Michael Bay Transformers film.
Finally, while on the subject of films, there is an important difference between ‘refresh rate’ and ‘frame rate’ (measured in fps - frames per second). Refresh rate is a property of the hardware, while frame rate is set by whatever media is being displayed. So if you watch a movie filmed at 24fps (the classic frame rate for cinema) on a 240Hz television, the screen will refresh 240 times a second, but it will only receive a new frame from the film 24 times a second. The screen therefore redraws an identical image 10 times before each new frame arrives - which is why improving the refresh rate won’t make a large difference unless the media also has a higher frame rate.
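The division behind that "10 identical refreshes" figure can be sketched in Python (the 240Hz and 24fps numbers are just the example values above):

```python
refresh_rate_hz = 240  # how many times per second the screen redraws
frame_rate_fps = 24    # how many new frames per second the film supplies

# How many times the screen redraws the same image before a new frame arrives
refreshes_per_frame = refresh_rate_hz // frame_rate_fps
print(refreshes_per_frame)  # 10
```

The same arithmetic shows why a higher refresh rate only helps if the source material keeps up: raising the refresh rate just increases the number of duplicate redraws per frame.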
This is part of the reason refresh rate is so important in gaming - while films are locked to whatever frame rate they were filmed at, a video game’s frames are generated in real time, so the frame rate can go as high as your hardware can push it.
If all of this talk of refresh rates has got you in the mood to game, or watch things explode in slow motion, then we’ve got you covered with the Best Televisions and Best PC Monitors.