Refresh rate is a Systems Integration term in filmmaking. In virtual production, the nuances of this term take on special meaning. Here, we break down the definition to give you a starting point.
View the full Virtual Production Glossary here »
Basic Definition:
What is Refresh rate in virtual production?
Refresh rate is an important factor in virtual production. It refers to the frequency at which an electronic display updates its image, usually measured in hertz (Hz). The higher the refresh rate, the smoother motion appears on screen. In virtual production, as in video games and streaming, a high refresh rate can make the difference between a smooth viewing experience and one that feels off-putting.
Driver software such as AMD's Radeon Software Adrenalin Edition offers features like FreeSync and Enhanced Sync, which are designed to reduce input latency and improve frame synchronization for smoother visuals. Note that these features do not raise the refresh rate itself; when looking to maximize refresh rate, it's essential to consider other factors such as hardware specs, resolution settings, and GPU compatibility.
Common refresh rates for virtual productions range from 60 Hz to 144 Hz. Higher refresh rates typically produce more fluid visuals but can come at the expense of increased battery drain. To ensure optimal performance with minimal impact on your device's battery life, use the highest refresh rate available that still meets the recommended specifications for your game or streaming service.
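A quick way to see what these numbers mean in practice is to convert a refresh rate into a per-frame time budget, since the display draws a new image every 1000 ms ÷ Hz. This small Python sketch (names are illustrative, not from any library) computes that budget for the common rates mentioned above:

```python
# Convert a refresh rate in Hz into the time budget per frame, in milliseconds.
def frame_time_ms(refresh_hz: float) -> float:
    return 1000.0 / refresh_hz

for hz in (60, 120, 144):
    # 60 Hz -> ~16.67 ms, 120 Hz -> ~8.33 ms, 144 Hz -> ~6.94 ms
    print(f"{hz} Hz -> {frame_time_ms(hz):.2f} ms per frame")
```

In other words, a real-time engine driving a 144 Hz display has less than 7 ms to render each frame, which is why higher refresh rates demand more from hardware.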
If the refresh rate is not well matched to your content and hardware, you may encounter issues like stuttering, lag spikes, screen tearing, and poor image quality. These problems can make even simple tasks difficult and ruin what would otherwise be an enjoyable experience. To prevent this, set up your system with the most suitable refresh rate to get the best possible performance out of your setup.
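One common cause of the stutter described above is a refresh rate that is not an integer multiple of the content's frame rate, so frames are held on screen for uneven lengths of time. The sketch below illustrates that check (the function name and tolerance are illustrative, not from any library):

```python
# A display shows every content frame for the same number of refreshes only
# when refresh rate / content frame rate is a whole number.
def is_even_cadence(refresh_hz: float, content_fps: float) -> bool:
    ratio = refresh_hz / content_fps
    return abs(ratio - round(ratio)) < 1e-9

print(is_even_cadence(120, 24))  # True: each 24 fps frame held for 5 refreshes
print(is_even_cadence(60, 24))   # False: 60/24 = 2.5, an uneven cadence (judder)
```

This is why, for example, 24 fps content tends to look smoother on a 120 Hz display than on a 60 Hz one.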