Thread: To Vsync or not?

  1. #1
    Going Retro!!! Ferral's Avatar
    Join Date
    Jul 2003
    Location
    North East
    Posts
    7,860
    Thanks
    561
    Thanked
    1,438 times in 876 posts
    • Ferral's system
      • Motherboard:
      • ASUS Z97-P
      • CPU:
      • Intel i7 4790K Haswell
      • Memory:
      • 12Gb Corsair XMS3 DDR3 1600 Mhz
      • Storage:
      • 120Gb Kingston SSD & 2 Tb Toshiba
      • Graphics card(s):
      • Sapphire Radeon R9 380 Nitro 4Gb
      • PSU:
      • Antec Truepower 750 Watt Modular
      • Case:
      • Fractal Design Focus G Mid Tower
      • Operating System:
      • Windows 10 64 bit
      • Monitor(s):
      • 28" iiyama Prolite 4K
      • Internet:
      • 80Mb BT Fiber

    Question To Vsync or not?

    This is something I've been pondering for some weeks now.

    Is it really worthwhile disabling Vsync in graphics card profiles?

    With Vsync enabled, Windows XP locks the refresh rate at 60 Hz, so the framerate doesn't go beyond that. With it disabled, I can hit over 200 FPS in benchmarks like 3DMark 2001 SE with my 6600GT.

    It's often said that the human eye can only see around 15 frames per second, and that films don't run much faster. In this digital era, though, 15 frames per second doesn't cut it; it goes all juddery.

    Now take into consideration what refresh rate the monitor can support: my Hansol 15" CRT can do a maximum of 85 Hz, so anything above that and you get the infamous tearing.

    Tearing happens when an image is moved from the back buffer on the VGA card into the front buffer mid-refresh. If frames are being rendered faster than the monitor can actually refresh, the screen can't update fast enough, and you get remnants of the previous image on screen alongside part of the new one. If nothing on screen is moving this is all well and good, but once the image moves you see tears.
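
    To picture it, here's a toy simulation (made-up numbers, nothing like real driver code, just the idea): the "monitor" scans out one line at a time while an unsynchronised "card" swaps buffers whenever a frame finishes, so almost every refresh ends up mixing two frames.

    Code:
    /* Toy model of tearing: 85 Hz scanout vs an unsynchronised 200 FPS
       render rate. All numbers are illustrative. */
    #include <stdio.h>

    int main(void)
    {
        const int    lines   = 480;    /* scanlines per refresh      */
        const double refresh = 85.0;   /* monitor refresh rate (Hz)  */
        const double fps     = 200.0;  /* unsynchronised render rate */

        double line_time  = 1.0 / (refresh * lines);
        double frame_time = 1.0 / fps;
        double t = 0.0, next_swap = frame_time;
        int frame_id = 0, torn = 0;
        const int refreshes = 100;

        for (int r = 0; r < refreshes; r++) {
            int top_id = 0;
            for (int line = 0; line < lines; line++) {
                while (t >= next_swap) {     /* buffer swap mid-scanout */
                    frame_id++;
                    next_swap += frame_time;
                }
                if (line == 0) top_id = frame_id;
                t += line_time;
            }
            if (frame_id != top_id)          /* top and bottom differ */
                torn++;
        }
        printf("%d of %d refreshes showed a tear\n", torn, refreshes);
        return 0;
    }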

    I know if you are benching your PC you want as high a framerate as possible, to see just what it can do and get as high a score as possible. In games, though, is there any need for framerates above 60? If you get a steady 35-40 FPS the game will run smoothly.

    So instead of disabling Vsync and having frames rendered that you will never see, is it worthwhile enabling it to lock the framerate at 60, so the GPU can put all that wasted effort into keeping the game running at a steady, smooth framerate?

    Not forgetting that pushing monitors past their highest refresh rate setting long term could end up damaging them. Capping the framerate also means the GPU doesn't have as much of a workload, which in turn should give it a longer lifespan and keep temperatures down.

    Just something I was wondering; those are my arguments for it. What's everyone else's take on this?

  2. #2
    Senior Member Kezzer's Avatar
    Join Date
    Sep 2003
    Posts
    4,863
    Thanks
    12
    Thanked
    5 times in 5 posts
    Firstly, benchmarks are just a number. I enable Vsync all the time, as there's no point in having more than 60fps imo. It's down to personal preference though. That said, I think you can notice the difference between 60 and 100, as I could.

    The human eye can only see 15fps? I swear it's more than that; I can easily tell the difference between 15 and 40 fps.

  3. #3
    Network|Geek kidzer's Avatar
    Join Date
    Jul 2005
    Location
    Aberdeenshire
    Posts
    1,732
    Thanks
    91
    Thanked
    46 times in 41 posts
    • kidzer's system
      • Motherboard:
      • $motherboard
      • CPU:
      • Intel Q6600
      • Memory:
      • 4GB
      • Storage:
      • 1TiB Samsung
      • Graphics card(s):
      • BFG 8800GTS OC
      • PSU:
      • Antec Truepower
      • Case:
      • Antec P160
      • Operating System:
      • Windows 7
      • Monitor(s):
      • 20" Viewsonic
      • Internet:
      • ~3Mbps ADSL (TalkTalk Business)
    We were told in Physics that Persistence of Vision* requires around 22-25fps

    *when what the human eye sees appears smooth
    "If you're not on the edge, you're taking up too much room!"
    - me, 2005

  4. #4
    Almost in control. autopilot's Avatar
    Join Date
    Dec 2004
    Location
    Region 2
    Posts
    4,071
    Thanks
    51
    Thanked
    12 times in 11 posts
    Yeah, around 25 FPS, although for gaming I would say more like 30-35. But what is the real benefit of Vsync anyway?

  5. #5
    Going Retro!!! Ferral's Avatar
    Join Date
    Jul 2003
    Location
    North East
    Posts
    7,860
    Thanks
    561
    Thanked
    1,438 times in 876 posts
    • Ferral's system
      • Motherboard:
      • ASUS Z97-P
      • CPU:
      • Intel i7 4790K Haswell
      • Memory:
      • 12Gb Corsair XMS3 DDR3 1600 Mhz
      • Storage:
      • 120Gb Kingston SSD & 2 Tb Toshiba
      • Graphics card(s):
      • Sapphire Radeon R9 380 Nitro 4Gb
      • PSU:
      • Antec Truepower 750 Watt Modular
      • Case:
      • Fractal Design Focus G Mid Tower
      • Operating System:
      • Windows 10 64 bit
      • Monitor(s):
      • 28" iiyama Prolite 4K
      • Internet:
      • 80Mb BT Fiber
    Vsync in the graphics profiles locks the framerate to your monitor's refresh rate (or the standard 60 Hz Windows uses), so in benchmarks you wouldn't score as high.

    However, when it is off the framerate can go much higher, but you won't see all the frames and you get the tearing. Not forgetting possible damage to the monitor from pushing it past its maximum refresh rate.

  6. #6
    Senior Member
    Join Date
    Feb 2004
    Posts
    888
    Thanks
    0
    Thanked
    32 times in 29 posts
    I have both Vsync & AA on even if the FPS is less than ideal because edge crawl and tearing both suck.

  7. #7
    Prize winning member. rajagra's Avatar
    Join Date
    Oct 2004
    Posts
    1,023
    Thanks
    0
    Thanked
    0 times in 0 posts
    Disabling Vsync will not damage a monitor. It just means the image is changed part way through drawing a frame, meaning the top part and bottom part will be from different snapshots in time. The crossover point will be at a different height for different frames, possibly causing a nasty rolling/strobing effect.

    This will happen whenever the video card's framerate is not an exact divisor of the screen's refresh rate. So if you enable Vsync and your screen runs at 60Hz, the video card might run at 60/30/20/15/12/10 fps etc. - whatever your system thinks it can cope with. (At least that's how the game World of Warcraft works; it is good enough to tell you so.)

    With Vsync disabled, a slow video card framerate can cause the same tearing as a too-fast one. e.g. a 60Hz screen and a 50fps card framerate = a problem.
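
    A rough model of that stepping, for the double-buffered case only (real drivers vary, this is just the idea) - it reproduces the 60/30/20/15 sequence above:

    Code:
    /* With Vsync on, a finished frame waits for the next vertical blank,
       so the displayed rate snaps to refresh/1, refresh/2, refresh/3, ... */
    #include <math.h>
    #include <stdio.h>

    double vsynced_fps(double refresh_hz, double render_fps)
    {
        if (render_fps >= refresh_hz)
            return refresh_hz;         /* capped at the refresh rate */
        /* otherwise each frame spans a whole number of refresh intervals */
        return refresh_hz / ceil(refresh_hz / render_fps);
    }

    int main(void)
    {
        const double raw[] = { 200.0, 59.0, 45.0, 25.0, 18.0 };
        for (int i = 0; i < 5; i++)
            printf("%5.1f fps raw -> %4.1f fps with Vsync at 60 Hz\n",
                   raw[i], vsynced_fps(60.0, raw[i]));
        return 0;
    }
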
    DFI LanParty UT NF4 SLI-D; AMD64 3500+ Winchester ;
    2x XFX 6600GT ; Corsair XMS3200XLPRO TWINX 1GB;
    Dell 2405FPW TFT.

  8. #8
    Senior Member
    Join Date
    May 2004
    Location
    Rochester, NY
    Posts
    1,041
    Thanks
    4
    Thanked
    8 times in 8 posts
    • oralpain's system
      • Motherboard:
      • DFI "Blood Iron" P35-T2RL
      • CPU:
      • Intel Pentium E2140 @ 400x8 (3.2GHz), 1.375v
      • Memory:
      • Crucial Ballistix DDR2 800 CL4 @ 500MHz (DDR 1000), 4-4-4-12-T2, 2.3v
      • Storage:
      • 2x Seagate ST3250410AS
      • Graphics card(s):
      • NVIDIA 8800GTS (G92) 512 @ 783MHz core, 1836MHz shader, 1053Mhz memory, stock cooling 70% fan speed
      • PSU:
      • Seasonic SS-500GB
      • Case:
      • Antec P182, with some small modifications
      • Monitor(s):
      • ASUS VW222U
      • Internet:
      • Time Warner "Road Runner" Cable - 16 megabit downstream, 1 megabit upstream
    Tearing won't damage a monitor.

    Most people will see a mostly smooth image past 25-30 fps or so, but they can certainly see things that are shown for far shorter times than 1/30th of a second. You can still see something that only takes 1/250th of a second to happen.

    I can easily see the difference between 60 and 100 fps, or 100 and 200 fps. I can look at a screen and tell you what refresh rate it's on. I can see the difference between PAL and NTSC. I can see and dodge grenades in games (BF1942 for instance) that are only in view for a single frame, at 100+ frames per second.

    I leave vsync disabled in most newer D3D games. Wildly shifting from 100 to 50 FPS (as I normally use 100Hz) is far more annoying than tearing.

    With OpenGL games, I just force triple buffering on through the video drivers. This allows vsync at any FPS equal to or lower than the refresh rate. Triple buffering is great. Too bad it has to be supported by individual games to work with D3D, and most don't support it.
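
    The way I picture the three buffers rotating (just a sketch, not any real driver API):

    Code:
    /* Triple buffering with Vsync: three buffers rotate roles so the
       renderer never stalls waiting for the vertical blank.
       Hypothetical structure, not a real API. */
    #include <stdio.h>

    static int front   = 0;       /* buffer being scanned out       */
    static int queued  = 1;       /* newest finished frame          */
    static int drawing = 2;       /* buffer the renderer draws into */
    static int queued_fresh = 0;  /* new frame since the last vblank? */

    static void frame_complete(void)   /* renderer finished a frame */
    {
        int old = queued;
        queued = drawing;         /* newest frame replaces any stale one */
        drawing = old;            /* renderer carries straight on here   */
        queued_fresh = 1;
    }

    static void vblank(void)           /* once per monitor refresh */
    {
        if (queued_fresh) {
            int old = front;
            front = queued;       /* whole-frame flip, so no tearing */
            queued = old;
            queued_fresh = 0;
        }                         /* else: re-show the current frame */
        printf("scanning out buffer %d\n", front);
    }

    int main(void)
    {
        /* GPU faster than the refresh: the older of two finished frames
           is dropped instead of blocking the renderer. */
        frame_complete(); frame_complete(); vblank();
        frame_complete(); vblank();
        vblank();   /* GPU slow this interval: previous frame repeats */
        return 0;
    }
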
    Last edited by oralpain; 25-09-2005 at 08:07 AM.

  9. #9
    Senior Member
    Join Date
    Feb 2004
    Posts
    888
    Thanks
    0
    Thanked
    32 times in 29 posts
    IL2 is OpenGL and I just tried forcing triple buffering with vsync off. It did not reduce cockpit tearing when looking around.

  10. #10
    Ex-MSFT Paul Adams's Avatar
    Join Date
    Jul 2003
    Location
    %systemroot%
    Posts
    1,926
    Thanks
    29
    Thanked
    77 times in 59 posts
    • Paul Adams's system
      • Motherboard:
      • Asus Maximus VIII
      • CPU:
      • Intel Core i7-6700K
      • Memory:
      • 16GB
      • Storage:
      • 2x250GB SSD / 500GB SSD / 2TB HDD
      • Graphics card(s):
      • nVidia GeForce GTX1080
      • Operating System:
      • Windows 10 x64 Pro
      • Monitor(s):
      • Philips 40" 4K
      • Internet:
      • 500Mbps fiber
    Vertical sync makes for a much more enjoyable experience for me; I hate tearing with a passion.
    If a level of detail is set too high to provide smooth gameplay then it needs reducing (or the graphics card/CPU needs upgrading).

    Unless the architecture of graphics memory has changed since I used to program in assembler, it is addressed exactly the same as regular memory and you dictate where the visible screen starts - this allows you to either update the current screen memory contents directly (inside the vertical blank period, traditionally where the CRT raster moves from the bottom-right back to the top-left corner) or use "buffering".
    The problem is that the vertical blank period is very, very short so you have a lot to do in a very short space of time.

    I don't quite understand the benefit of triple-buffering over double-buffering - buffering is the action of rendering the next frame in another part of memory and then updating the pointer to visible memory when the entire screen is ready.
    If you do this update within the vertical blank then you never get tearing or partial updates - the principle of double-buffering is that you update screen 1 when displaying screen 2, then switch to make screen 2 visible and redraw screen 1, rinse & repeat.
    As screen updates need to be realtime, drawn by dynamic events, you can't "pre-render" too many screens in FPS games - hence why I don't understand the need for triple-buffering.
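
    In C-ish terms it boils down to something like this (hypothetical stubs standing in for the real hardware pokes - real code would poll the VGA status register at port 0x3DA, or use a platform API):

    Code:
    /* Double buffering: draw off-screen, then repoint the display inside
       the vertical blank. Stubs in place of real hardware access. */
    #include <stdio.h>
    #include <stdint.h>

    #define W 320
    #define H 200

    static uint8_t framebuf[2][W * H];  /* two full screens of pixels  */
    static int visible = 0;             /* the page the CRT is showing */

    static void wait_for_vblank(void)  { /* would poll port 0x3DA here */ }
    static void set_display_start(int page) { printf("showing page %d\n", page); }
    static void render_scene(uint8_t *dst)  { dst[0] = 1; /* draw the frame */ }

    static void frame(void)
    {
        int back = 1 - visible;
        render_scene(framebuf[back]);  /* draw while the other page shows */
        wait_for_vblank();             /* switch only inside the blank... */
        set_display_start(back);       /* ...so no partial frame is seen  */
        visible = back;
    }

    int main(void)
    {
        for (int i = 0; i < 4; i++)
            frame();                   /* pages alternate 1, 0, 1, 0 */
        return 0;
    }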

    Too much eye candy + vertical sync off = tearing or partially missing updates.
    Too much eye candy + vertical sync on = slideshow, but each screen is displayed as it should be.

    It was a real challenge programming realtime 3D textured graphics on a 286 12MHz with an ISA 320x200 256-colour graphics card back in the day


    Edit:
    ed^chigliak - you won't get any benefit from buffering unless you also use vertical sync: rather than drawing to the current screen, you are switching which screen is displayed partway through a refresh - so you still get tearing.
    Buffering without waiting for the vertical blank is pointless, and a waste of memory.
    Last edited by Paul Adams; 25-09-2005 at 11:02 AM.
    ~ I have CDO. It's like OCD except the letters are in alphabetical order, as they should be. ~
    PC: Win10 x64 | Asus Maximus VIII | Core i7-6700K | 16GB DDR3 | 2x250GB SSD | 500GB SSD | 2TB SATA-300 | GeForce GTX1080
    Camera: Canon 60D | Sigma 10-20/4.0-5.6 | Canon 100/2.8 | Tamron 18-270/3.5-6.3

  11. #11
    Ex-MSFT Paul Adams's Avatar
    Join Date
    Jul 2003
    Location
    %systemroot%
    Posts
    1,926
    Thanks
    29
    Thanked
    77 times in 59 posts
    • Paul Adams's system
      • Motherboard:
      • Asus Maximus VIII
      • CPU:
      • Intel Core i7-6700K
      • Memory:
      • 16GB
      • Storage:
      • 2x250GB SSD / 500GB SSD / 2TB HDD
      • Graphics card(s):
      • nVidia GeForce GTX1080
      • Operating System:
      • Windows 10 x64 Pro
      • Monitor(s):
      • Philips 40" 4K
      • Internet:
      • 500Mbps fiber
    Quote Originally Posted by Ferral
    Not forgetting that pushing monitors past their highest refresh rate setting long term could end up damaging them. Capping the framerate also means the GPU doesn't have as much of a workload, which in turn should give it a longer lifespan and keep temperatures down.
    This is true (some people appear to have misinterpreted this as meaning that "disabling vertical sync" or "tearing" can damage a monitor).
    However, this should be outside of the control of an application - the drivers for the graphics card and the monitor should dictate the limitations & available resolutions/frequencies supported.

    With CRTs the computer has complete control over the signal it sends to the monitor - you dictate the refresh rate, horizontal line count, etc., and there is nothing to stop a request for a 6,900x4,234 @ 456Hz display, which could really upset (or damage) older monitors.

    Newer monitors tend to trap requests for silly frequencies & resolutions which is where you get "input frequency out of range" on some monitors when booting up or playing with beta graphics drivers, for example.
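
    The check a monitor effectively makes is simple enough: the line (horizontal) rate is the refresh rate times the total line count, and it has to sit inside the supported band. A rough illustration with made-up limits for a typical 15" CRT (real monitors measure the sync frequencies in the incoming signal):

    Code:
    /* Why a silly mode request trips "input frequency out of range".
       All limits and the ~5% blanking figure are illustrative. */
    #include <stdio.h>

    int main(void)
    {
        const double h_min_khz = 30.0, h_max_khz = 70.0;  /* CRT limits */
        const double v_min_hz  = 50.0, v_max_hz  = 120.0;

        const int    visible_lines = 4234;   /* the silly request above */
        const double refresh_hz    = 456.0;

        double total_lines = visible_lines * 1.05;  /* ~5% blanking */
        double h_khz = refresh_hz * total_lines / 1000.0;

        printf("requested: %.0f kHz horizontal, %.0f Hz vertical\n",
               h_khz, refresh_hz);
        if (h_khz < h_min_khz || h_khz > h_max_khz ||
            refresh_hz < v_min_hz || refresh_hz > v_max_hz)
            printf("-> input frequency out of range\n");
        return 0;
    }
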
    ~ I have CDO. It's like OCD except the letters are in alphabetical order, as they should be. ~
    PC: Win10 x64 | Asus Maximus VIII | Core i7-6700K | 16GB DDR3 | 2x250GB SSD | 500GB SSD | 2TB SATA-300 | GeForce GTX1080
    Camera: Canon 60D | Sigma 10-20/4.0-5.6 | Canon 100/2.8 | Tamron 18-270/3.5-6.3

  12. #12
    Slightly Trigger Happy
    Join Date
    Jun 2005
    Location
    In front of a computer
    Posts
    366
    Thanks
    0
    Thanked
    0 times in 0 posts
    I have never had tearing (well, nothing that's obvious) when running with Vsync off. And 60fps just isn't right when you can have 150fps at full detail; you may as well have everything your graphics card and processor can offer.
    your computer is similar to a fridge in that if it cannot keep a beer cold then it sucks

  13. #13
    Prize winning member. rajagra's Avatar
    Join Date
    Oct 2004
    Posts
    1,023
    Thanks
    0
    Thanked
    0 times in 0 posts
    Quote Originally Posted by Paul Adams
    some people appear to have misinterpreted this
    In the context of the original post, it did imply that high framerates can push monitors past their highest refresh rate setting. It is worth clarifying that this is not the case.

    A graphics card pumping out 200fps would be perfectly safe on a screen running at a 50Hz refresh rate with the correct signalling. It just means that 3 out of 4 frames will never be seen (or only 1/4 of each frame will ever be seen). This is very wasteful of CPU & GPU processing power, but not harmful (apart from excess heat, as already pointed out by the OP).
    Last edited by rajagra; 25-09-2005 at 12:26 PM.
    DFI LanParty UT NF4 SLI-D; AMD64 3500+ Winchester ;
    2x XFX 6600GT ; Corsair XMS3200XLPRO TWINX 1GB;
    Dell 2405FPW TFT.

  14. #14
    Going Retro!!! Ferral's Avatar
    Join Date
    Jul 2003
    Location
    North East
    Posts
    7,860
    Thanks
    561
    Thanked
    1,438 times in 876 posts
    • Ferral's system
      • Motherboard:
      • ASUS Z97-P
      • CPU:
      • Intel i7 4790K Haswell
      • Memory:
      • 12Gb Corsair XMS3 DDR3 1600 Mhz
      • Storage:
      • 120Gb Kingston SSD & 2 Tb Toshiba
      • Graphics card(s):
      • Sapphire Radeon R9 380 Nitro 4Gb
      • PSU:
      • Antec Truepower 750 Watt Modular
      • Case:
      • Fractal Design Focus G Mid Tower
      • Operating System:
      • Windows 10 64 bit
      • Monitor(s):
      • 28" iiyama Prolite 4K
      • Internet:
      • 80Mb BT Fiber
    Quote Originally Posted by rajagra
    It just means that 3 out of 4 frames will never be seen (or only 1/4 of each frame will ever be seen). This is very wasteful of CPU & GPU processing power, but not harmful (apart from excess heat, as already pointed out by the OP).
    Right, so here's my take on that (the way I read it, basically). If Vsync is enabled and you get back what is currently being wasted, it means that you are...

    A: Bringing down GPU temperatures (long term, prolonging the part's life, as it won't be working flat out at full load).

    B: Giving the GPU a bit more headroom for other tasks, as its framerate is being held back by the Vsync.

    C: In theory, with Vsync enabled and the little extra headroom the GPU should now have, you may well be able to up some of the detail settings or knock the resolution up another notch. And if it's the resolution you up, you should in theory be able to knock the AA down a level, as you will be getting a sharper image from the higher resolution.

    Sorry if I confused people with the way I worded the tearing part; I've just been trying to get my head around the whole Vsync thing for the past couple of weeks, as it's quite an interesting subject. I personally hate tearing in games; it looks awful.

    It's getting really interesting, this, I have to say; I'm enjoying the other forum members' takes on the subject. The replies are cool, as they are easily read and understood.
    Last edited by Ferral; 25-09-2005 at 07:18 PM.

  15. #15
    Senior Member
    Join Date
    May 2004
    Location
    Rochester, NY
    Posts
    1,041
    Thanks
    4
    Thanked
    8 times in 8 posts
    • oralpain's system
      • Motherboard:
      • DFI "Blood Iron" P35-T2RL
      • CPU:
      • Intel Pentium E2140 @ 400x8 (3.2GHz), 1.375v
      • Memory:
      • Crucial Ballistix DDR2 800 CL4 @ 500MHz (DDR 1000), 4-4-4-12-T2, 2.3v
      • Storage:
      • 2x Seagate ST3250410AS
      • Graphics card(s):
      • NVIDIA 8800GTS (G92) 512 @ 783MHz core, 1836MHz shader, 1053Mhz memory, stock cooling 70% fan speed
      • PSU:
      • Seasonic SS-500GB
      • Case:
      • Antec P182, with some small modifications
      • Monitor(s):
      • ASUS VW222U
      • Internet:
      • Time Warner "Road Runner" Cable - 16 megabit downstream, 1 megabit upstream
    Quote Originally Posted by ed^chigliak
    IL2 is OpenGL and I just tried forcing triple buffering with vsync off. It did not reduce cockpit tearing when looking around.
    Triple buffering only helps with vsync on. Vsync fixes the tearing; triple buffering smooths out the frame rate.

    Triple buffering with vsync off just wastes video memory.
