
What does "Optimal" Refresh Rate really mean?

Section 1 - The Basics

In the early days of computing, a 60Hz refresh rate was usually the best we could get. It was almost guaranteed that, after prolonged exposure to the monitor, an eyestrain headache was around the corner. Why? Because 60Hz flicker is right at the edge of what the eye can detect, and it also matches the frequency of AC electricity (at least here in the US; many other countries use 50Hz power), so the screen can interact with fluorescent room lighting. With the refresh rate set to 60Hz, you will probably notice a strobe or pulsing effect. Over time this can cause eye fatigue.
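The interaction with room lighting can be sketched with simple arithmetic: when the monitor's refresh rate and the light's flicker rate differ slightly, the visible pulsing repeats at the *difference* of the two frequencies. A minimal illustration (the function name and the example rates are my own; note that magnetic-ballast fluorescents actually flicker at twice the mains frequency, so treat these numbers as illustrative):

```python
def beat_frequency(refresh_hz: float, light_hz: float) -> float:
    """Apparent 'rolling band' rate (Hz) when a monitor's refresh
    beats against flickering room lighting."""
    return abs(refresh_hz - light_hz)

# Monitor locked to exactly the lighting rate: no rolling band,
# just steady flicker near the threshold of perception.
print(beat_frequency(60.0, 60.0))   # 0.0

# Monitor drifting half a hertz off the lighting rate: a band
# that crawls across the screen once every two seconds.
print(beat_frequency(60.5, 60.0))   # 0.5
```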

There are three things that determine what refresh rate your system will support: the video card's RAMDAC, the monitor's capabilities, and the resolution you are running. The higher the resolution, the lower the refresh rate your monitor can support. If you set a refresh rate too high for your monitor, you can damage it. I remember my "old" Lightspeed 128 only had a 135MHz RAMDAC; now cards ship with 350MHz RAMDACs. There are also typically recommended resolutions for different monitor sizes, e.g. 800x600 for 15" monitors, 1024x768 for 17" monitors, and 1280x1024 for 19" monitors. Personally, I use 1152x864 on my 17" monitor. The higher the resolution, the more "desktop real estate" you gain, but text and icons get smaller.
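The RAMDAC limit above can be estimated with back-of-the-envelope arithmetic: the pixel clock the RAMDAC must sustain is roughly width × height × refresh rate, inflated by the horizontal and vertical blanking intervals. A rough sketch (the 1.32 overhead factor is a common rule of thumb, not an exact figure from any standard):

```python
def approx_pixel_clock_mhz(width, height, refresh_hz, blanking_overhead=1.32):
    """Approximate pixel clock (MHz) a RAMDAC must sustain for a video mode.

    The overhead factor accounts for the blanking intervals, during
    which no visible pixels are drawn but the clock keeps running.
    """
    return width * height * refresh_hz * blanking_overhead / 1e6

# 1152x864 at 85Hz needs roughly 112MHz -- within reach of even
# an old 135MHz RAMDAC.
print(round(approx_pixel_clock_mhz(1152, 864, 85)))    # 112

# 1600x1200 at 85Hz needs roughly 215MHz -- beyond a 135MHz part,
# but no problem for a modern 350MHz RAMDAC.
print(round(approx_pixel_clock_mhz(1600, 1200, 85)))   # 215
```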

So how do you find out what refresh rate your system is currently using? Go to Start > Settings > Control Panel. In Control Panel, choose Display (Properties), then select the Settings tab. It should look like this:

Then select the Advanced button in the lower right corner. From the next screen choose the Adapter tab. It should now look like this
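If you would rather query the current mode programmatically than click through Control Panel, Windows exposes it through the Win32 EnumDisplaySettings API. Below is a sketch in Python using ctypes; the struct is a truncated DEVMODEW covering only the fields up to dmDisplayFrequency (the API respects the dmSize we pass, a common shortcut but my assumption here, not something this article describes), and the helper name is my own. On non-Windows systems it simply returns None:

```python
import ctypes
import sys

ENUM_CURRENT_SETTINGS = -1  # documented mode index for "the active settings"

class DEVMODEW(ctypes.Structure):
    """Truncated DEVMODEW: fields up to dmDisplayFrequency only.

    The real struct continues past this point; passing our smaller
    size in dmSize tells the API how much of it we can hold.
    """
    _fields_ = [
        ("dmDeviceName", ctypes.c_wchar * 32),
        ("dmSpecVersion", ctypes.c_ushort),
        ("dmDriverVersion", ctypes.c_ushort),
        ("dmSize", ctypes.c_ushort),
        ("dmDriverExtra", ctypes.c_ushort),
        ("dmFields", ctypes.c_ulong),
        ("dmPositionX", ctypes.c_long),        # display-device union member
        ("dmPositionY", ctypes.c_long),
        ("dmDisplayOrientation", ctypes.c_ulong),
        ("dmDisplayFixedOutput", ctypes.c_ulong),
        ("dmColor", ctypes.c_short),
        ("dmDuplex", ctypes.c_short),
        ("dmYResolution", ctypes.c_short),
        ("dmTTOption", ctypes.c_short),
        ("dmCollate", ctypes.c_short),
        ("dmFormName", ctypes.c_wchar * 32),
        ("dmLogPixels", ctypes.c_ushort),
        ("dmBitsPerPel", ctypes.c_ulong),
        ("dmPelsWidth", ctypes.c_ulong),
        ("dmPelsHeight", ctypes.c_ulong),
        ("dmDisplayFlags", ctypes.c_ulong),
        ("dmDisplayFrequency", ctypes.c_ulong),
    ]

def current_refresh_rate():
    """Return the active display's refresh rate in Hz, or None."""
    if sys.platform != "win32":
        return None  # EnumDisplaySettings is Windows-only
    dm = DEVMODEW()
    dm.dmSize = ctypes.sizeof(DEVMODEW)
    ok = ctypes.windll.user32.EnumDisplaySettingsW(
        None, ENUM_CURRENT_SETTINGS, ctypes.byref(dm))
    return int(dm.dmDisplayFrequency) if ok else None

print(current_refresh_rate())
```

On a Windows box this prints the same number the Adapter tab shows; a value of 0 or 1 means the driver is reporting a "default" rate rather than a specific one.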