And the Xbox One S and X will get FreeSync 2 with HDR in the latest Insiders update.
That's my point, but FreeSync isn't AMD-specific; it's an open standard. The PS4 has similar hardware so may well start supporting it soon too, many monitors include it now (as it's royalty-free), and a lot of 2018 TVs are starting to as well. If that continues, G-Sync will die regardless of what Nvidia does with it, if nobody makes compatible displays.
What I find odd is why we still have refresh rates. There aren't any phosphor-based screens that I know of anymore, so the need to regularly redraw the screen to keep the image present has disappeared.
Why hasn't the standard been amended yet so that the frame is drawn only once the graphics card is ready? This would probably save power when the image is unchanged too - no need to redraw the same image 60+ times per second.
Because LCDs still have a refresh rate; that is to say, they can only redraw the image (flip the pixels) at a certain rate, and that is normally done in a progressive manner, i.e. top to bottom. It's why you can still get tearing on an LCD: part of the display has not updated at the same time as another part. (At least that's my understanding; happy to be corrected if wrong.)
Correct, and it takes time to pull the frame from the buffer and stuff it down the cable. Unless we get cables with 24 million wires (one per colour channel per pixel for 4k) there will always be some sequentiality to the update (though there are some mechanisms now to just send deltas rather than the whole frame).
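To put rough numbers on why the update has to be sequential: a back-of-the-envelope sketch of the raw pixel rate for an uncompressed 4K/60 stream (hypothetical figures, ignoring blanking intervals and link-encoding overhead):

```python
# Rough pixel-rate arithmetic for an uncompressed 4K stream.
# Figures are illustrative only; real links add blanking and encoding overhead.
width, height = 3840, 2160
bits_per_pixel = 24            # 8 bits per RGB channel
refresh_hz = 60

pixels_per_frame = width * height                      # ~8.3 million pixels
wires_for_parallel = pixels_per_frame * 3              # one wire per colour channel
bits_per_second = pixels_per_frame * bits_per_pixel * refresh_hz

print(f"{wires_for_parallel / 1e6:.1f} million wires for a fully parallel link")
print(f"{bits_per_second / 1e9:.1f} Gbit/s down a serial cable")
```

That ~12 Gbit/s figure is why the frame is streamed pixel-by-pixel down a handful of high-speed lanes rather than updated all at once, and why the panel's top can show a newer frame than its bottom mid-update.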
I remember a time, long ago, when games consoles and their games had to be complete and perform out of the box as advertised, without any help from a specially bought screen to stop them stuttering. No half-finished games and no incremental, performance-changing updates to hardware either. As the years go by these machines are literally becoming lightweight gaming PCs. I am finding it harder and harder to justify buying a console; some of the exclusive titles are literally the only thing keeping me slightly interested anymore. Price doesn't even come into it either. I would have a PC at home either way, so I'd rather buy a more expensive GPU than a console any day of the week at this point.
What happened to the pros of buying a console being "it just works and costs a lot less".
The games console, fast becoming "a computer for the smartphone generation" in my eyes.
Sadly this is probably the case. It seems that when it comes to home A/V tech and there are two competing standards, odds are good that the inferior one will win out: VHS won over Betamax, Blu-ray over HD DVD, MP3 over OGG... and now FreeSync over G-Sync.
Shame really, not that it affects me personally, as I barely use my Xbox for gaming anymore and have an Nvidia GPU & G-Sync display for my PC.
True - and in this case (as with the others, I guess) FreeSync is "good enough" for most people, especially as AMD are finally near enough at feature parity with G-Sync now that FreeSync 2 is on the horizon. You have to be into HDR gaming on very high refresh rate (144Hz or higher) 1440p/4K monitors before most people would notice a difference between G-Sync and FreeSync 1.
The issue is not how the image is drawn, the issue is why is the image being drawn when it is unchanged/the graphics card is not ready to output a new image.
On CRTs and other phosphor-based screens this was necessary as each pixel fades after being drawn. Hence if the image is not refreshed regularly it disappears.
An LCD has no such limitation so I do not see why it must have any refresh rate at all - why it cannot sit idle awaiting input after drawing a frame, until the graphics card is ready to output a new one.
What you are describing has been done for years in laptops to save power, and that technique is the basis for both FreeSync and G-Sync.
AIUI an LCD does have some fade if not refreshed: setting a pixel is done by aligning the liquid crystals just right, and as they aren't solid they will drift if not re-aligned. I've not seen a description of how OLED pixels are wired up, but I'm guessing they will have to be on a matrix to make the wiring manageable, and so presumably will stop glowing pretty fast after being driven. Panels seem to have a minimum update of about 30 to 35Hz; I'm guessing it gets messy looking below that, or else they would allow the panels to go lower.
Edit: Note that DisplayPort isn't a conventional display connection, it works more like a packet network, so it isn't tied down to CRT emulation like HDMI or VGA.
In what way is FreeSync inferior? Nvidia's insistence on the expensive dongle hardware being built into monitors to enforce their license fee adds to panel latency and cost, other than that they seem to be tracking each other in features pretty well, like they both added HDR support at about the same time.
The only downside with FreeSync is that you get no guarantee on refresh range, and hence low framerate compensation might not work on a cheap panel. The upside is that cheap panels exist; no £90 G-Sync panel will do that, because you don't have the option of a budget G-Sync panel.
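Low framerate compensation is essentially frame repetition: when the game's framerate drops below the panel's minimum, the driver shows each frame two or more times so the effective refresh lands back inside the supported range. A minimal sketch of the idea (the helper name and panel ranges are made up for illustration):

```python
def lfc_multiplier(content_fps, panel_min_hz, panel_max_hz):
    """Smallest frame-repeat factor that lifts the effective refresh
    back inside the panel's supported [min, max] range, or None if the
    range is too narrow for compensation to work."""
    if content_fps >= panel_min_hz:
        return 1  # already in range, no compensation needed
    n = 2
    while content_fps * n < panel_min_hz:
        n += 1
    if content_fps * n > panel_max_hz:
        return None  # overshot the top of the range: LFC can't help
    return n

# A wide 48-144Hz panel showing 30fps content: repeat each frame
# twice for an effective 60Hz, inside the range.
print(lfc_multiplier(30, 48, 144))   # 2
# A narrow 48-60Hz budget panel showing 35fps: doubling gives 70Hz,
# which overshoots the 60Hz maximum, so compensation fails.
print(lfc_multiplier(35, 48, 60))    # None
```

This is why the (unguaranteed) refresh range matters so much on cheap FreeSync panels: LFC needs the maximum to be at least roughly double the minimum, or there is no multiplier that fits.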
You've just answered your own question. The biggest limitation of FreeSync has always been its refresh range, which becomes a big issue when you are looking at higher-end panels (1440p and 4K in general) where your FPS is very likely to be hugely variable and to drop below 60. I agree that they are pretty much at feature parity *at the moment*, and you can indeed find high-quality IPS-panel FreeSync monitors that support the full ~25-240Hz range, but FS has lagged behind for a long time; it's only recently that the tech has caught up.