This is something I've been pondering for some weeks now.
Is it really worthwhile disabling Vsync in graphics card profiles?
With Vsync enabled, Windows XP runs my desktop at a 60 Hz refresh rate, so the framerate is capped at 60 FPS. With it disabled I can hit over 200 FPS in benchmarks like 3DMark 2001 SE with my 6600GT.
It's often said the human eye only needs somewhere in the low tens of frames per second, and cinema gets away with about 24 FPS, but in games that sort of framerate doesn't cut it; it goes all juddery.
Now take into consideration what refresh rate the monitor can support: my Hansol 15" CRT can do a maximum of 85 Hz, so once the framerate goes well past that you really start to notice the infamous tearing.
Tearing is caused by an image being moved from the back buffer on the graphics card into the front buffer while the monitor is still part-way through drawing the previous frame. If frames arrive faster than the monitor can refresh, the screen can't update fast enough, so you get remnants of the previous image on screen alongside part of the new one. If the image is standing still this is all well and good, but once things are moving you see the tears.
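To make that a bit more concrete, here's a little C++ sketch I knocked up. It's purely my own illustration with made-up numbers (an 85 Hz monitor and a game presenting at 200 FPS), not anything from the card or driver, but it works out where a tear line would land when the buffer swaps aren't synced to the refresh:

// Minimal sketch: where does the tear line land when swaps ignore the refresh?
// Assumes an 85 Hz monitor and a 200 FPS game, and an instant buffer swap.
#include <cmath>
#include <cstdio>

int main() {
    const double refresh_hz = 85.0;              // monitor refresh rate (my CRT's max)
    const double fps        = 200.0;             // uncapped framerate from the game
    const double scanout    = 1.0 / refresh_hz;  // time the monitor takes to draw one screen
    const double frame_time = 1.0 / fps;         // time between buffer swaps

    // Step through the first few swaps and see how far down the screen the
    // monitor has got when each swap happens - that's where the tear shows up.
    double swap_time = 0.0;
    for (int i = 1; i <= 8; ++i) {
        swap_time += frame_time;
        double into_refresh = std::fmod(swap_time, scanout);
        double tear_line    = into_refresh / scanout;  // 0.0 = top of screen, 1.0 = bottom
        std::printf("swap %d: tear roughly %3.0f%% of the way down the screen\n",
                    i, tear_line * 100.0);
    }
    return 0;
}

Run it and the tear line wanders up and down the screen from one swap to the next, which is exactly the shearing you see when you pan the view in a game.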
I know if you are benching your PC you want the highest framerate you can get, to see just what it can do and to chase the biggest score. In games, though, is there any need for the framerate to go above 60? A steady 35 - 40 FPS and the game will run smoothly.
So instead of disabling Vsync and having frames rendered that you will never see, is it worthwhile enabling it to lock the framerate at 60, so the GPU stops wasting effort on those frames and can put that headroom into keeping the game running at a steady, smooth framerate?
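To show what I mean by locking the framerate, here's another minimal sketch (again just my own illustration, nothing to do with how the driver actually does it) of a 60 FPS cap: the loop does its work, then simply waits out the rest of the frame instead of churning out frames the monitor will never show.

// Minimal sketch of a 60 FPS cap: do the frame's work, then sleep off the
// rest of the ~16.7 ms budget rather than rendering frames nobody will see.
#include <chrono>
#include <thread>
#include <cstdio>

int main() {
    using clock = std::chrono::steady_clock;
    const auto frame_budget = std::chrono::microseconds(16667);  // ~60 FPS

    auto next_frame = clock::now();
    for (int frame = 0; frame < 120; ++frame) {
        // Stand-in for the game's update + render work (~4 ms here,
        // i.e. a GPU that could easily do 200+ FPS uncapped).
        std::this_thread::sleep_for(std::chrono::milliseconds(4));

        // Wait out the remainder of the frame; the GPU sits idle instead
        // of drawing frames that would just be thrown away.
        next_frame += frame_budget;
        std::this_thread::sleep_until(next_frame);
    }
    std::printf("rendered 120 frames in roughly 2 seconds (~60 FPS)\n");
    return 0;
}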
Not forgetting either: driving a monitor past its highest refresh rate setting long term could end up damaging it, and with the framerate capped the GPU doesn't have as much of a workload, which in turn should give it a longer lifespan and keep temperatures down.
Just something I was wondering; those are my arguments for it. What's everyone else's take on this?