Re: FreeSync support comes to Microsoft's Xbox One
FreeSync beating out GSync is better for everyone. The differences are minimal and 99.9% of users will never even notice them. Once FreeSync is adopted on the PS4 as well, it will no doubt become standard on new TVs and monitors. The expensive module is unnecessary and overpriced for almost everyone. If almost every monitor adopts it, Nvidia will be forced to support it. GSync may be kept on as a "master race" feature, but I think they'll struggle to sell it after that.
Re: FreeSync support comes to Microsoft's Xbox One
I'm currently (for the first time since the GameCube!) considering buying a console due to the ridiculous prices of graphics cards and RAM. If the Xbox One X can make a good fist of 4K and can do FreeSync for under £400 (I've seen it as cheap as £350 on HUKD), that makes it cheaper than a 1070/Vega 56 alone. I can't believe I'm even considering it!! But the cost difference is insane. Having a feature like FreeSync on a console makes it even harder for me to justify paying £1000+ to build a new small-form-factor gaming PC when my laptop is powerful enough to do all the general computing tasks I need.
Re: FreeSync support comes to Microsoft's Xbox One
Quote:
Originally Posted by
CAPTAIN_ALLCAPS
The issue is not how the image is drawn, the issue is why is the image being drawn when it is unchanged/the graphics card is not ready to output a new image.
On CRTs and other phosphor-based screens this was necessary as each pixel fades after being drawn. Hence if the image is not refreshed regularly it disappears.
An LCD has no such limitation so I do not see why it must have any refresh rate at all - why it cannot sit idle awaiting input after drawing a frame, until the graphics card is ready to output a new one.
An LCD still has to refresh itself: the image has to be read out of a buffer and the panel redrawn, and that readout has to be synced with the source so you don't see the changes mid-draw and get artifacts. Many TVs, for example, have processing to stop older source material from looking poor when frames would otherwise get refreshed mid-frame, or to insert frames in between the source frames to make motion appear smoother. Throw in 3D, if you still want/desire it, and then you have two separate frames to display and sync. This is why refresh rates are still about and, to be honest, will be in some shape or form for a long time. Imagine if your new 4K TV couldn't display any content created before, say, 2015 unless it had been converted...
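To illustrate the point about syncing the buffer readout, here is a minimal sketch of fixed-refresh scanout with v-sync style buffer swapping. All class and function names here are hypothetical, purely for illustration, not any real display controller's logic:

```python
# Sketch: a panel scans out of its front buffer on its own fixed clock,
# whether or not the GPU has produced a new frame. Swapping buffers only
# during the blanking interval is what prevents tearing artifacts.

class Display:
    def __init__(self, refresh_hz=60):
        self.interval = 1.0 / refresh_hz
        self.front = None            # buffer the panel reads from

    def scanout(self, now, last_scan):
        # The panel refreshes on its own cadence; if nothing changed,
        # the same image is simply drawn again.
        if now - last_scan >= self.interval:
            return self.front
        return None                  # not time for a refresh yet


def present(display, new_frame, in_vblank):
    # "Sync" here means: only swap buffers between refreshes, so the
    # panel never reads a half-updated buffer (which would show parts
    # of two different frames at once).
    if in_vblank:
        display.front = new_frame
        return True
    return False                     # must wait, or risk tearing
```

The same machinery is why interpolation and 3D modes mentioned above still need a refresh clock: there is always a readout cadence to schedule those extra frames against.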
Re: FreeSync support comes to Microsoft's Xbox One
Quote:
Originally Posted by
Spud1
You've just answered your own question :) The biggest limitation of FreeSync has always been its refresh range, which becomes a big issue when you are looking at higher-end panels (1440p and 4K in general) where your FPS is very likely to be hugely variable and to drop below 60. I agree that they are pretty much at feature parity *at the moment*, and you can indeed find high-quality IPS-panel FreeSync monitors that support the full ~25-240Hz range, but FS lagged behind for a long time; it's only recently that the tech has caught up.
Don't get that. I have had this 1440p 40-144Hz monitor for over two years, which is a lifetime in tech terms. The specs, including frequency range, were the same as the equivalent GSync monitor (yes, there was one), as that is a limitation of the TN panel, not the variable refresh tech; mine just cost £450 rather than £550. Of the two I voted with my wallet for FreeSync, largely because I just don't see how GSync can survive. As a bonus my video card refresh was cheaper and, according to benchmarks, has lasted better than a GTX 960 would have, which worked out nicely.
I can see how cheap monitors with limited ranges could give FreeSync an image problem (no pun intended), but frankly if you pay under £100 for a monitor you shouldn't expect much, and you're likely on a very low-end graphics card. My main monitor and my wife's monitor were specced for gaming and are 35-144Hz FreeSync monitors. My daughter's monitor was specced for decent colour reproduction; she went for the Samsung quantum dot display (for some reason, I thought she would have had the Dell), which for about £240 just happens to have FreeSync up to 144Hz.
And that's the kicker: I am starting to see panels that just happen to be FreeSync, and in this case very capable of gaming. As the old non-FreeSync panel controller ASICs go end of life, I expect all monitors will end up FreeSync compatible simply by switching to the latest ASICs.
So AFAICS with GSync you are paying for a product quality guarantee that comes out of Nvidia's certification process rather than any technical lead. That's personally not worth the best part of £100 to me; I'll just read a couple of reviews, thanks.
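For anyone wondering why those range numbers (40-144Hz, 35-144Hz) matter in practice: when FPS drops below the panel's minimum, drivers can repeat frames to stay inside the window (AMD calls this Low Framerate Compensation). A rough sketch of the idea, assuming a hypothetical 35-144Hz window, with illustrative names and values rather than any vendor's actual driver code:

```python
# Sketch: keep the panel's refresh interval inside its supported VRR
# window. Frames slower than the minimum refresh get shown more than
# once (LFC-style frame repetition).

def panel_interval_ms(frame_time_ms, min_hz=35, max_hz=144):
    """Return the interval at which the panel actually refreshes for a
    given GPU frame time, clamped to the supported range."""
    fastest = 1000.0 / max_hz   # ~6.9 ms at 144 Hz
    slowest = 1000.0 / min_hz   # ~28.6 ms at 35 Hz
    if frame_time_ms <= fastest:
        return fastest          # GPU outpaces the panel: cap at max Hz
    interval = frame_time_ms
    while interval > slowest:   # below min Hz: halve the interval, i.e.
        interval /= 2.0         # display the same frame twice (or more)
    return interval
```

So a 25 fps frame (40 ms) would be shown as two 20 ms refreshes, an effective 50 Hz that sits comfortably inside the window, which is why a wide range matters more than the headline maximum.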
Re: FreeSync support comes to Microsoft's Xbox One
Quote:
Originally Posted by
CAPTAIN_ALLCAPS
The issue is not how the image is drawn, the issue is why is the image being drawn when it is unchanged/the graphics card is not ready to output a new image.
On CRTs and other phosphor-based screens this was necessary as each pixel fades after being drawn. Hence if the image is not refreshed regularly it disappears.
An LCD has no such limitation so I do not see why it must have any refresh rate at all - why it cannot sit idle awaiting input after drawing a frame, until the graphics card is ready to output a new one.
Ah, I get what you're saying, sorry for the mistake. What you seem to be describing is the difference between passive and active matrices; each type comes with advantages and disadvantages. AFAIK passive is like you describe: send data (an image), the LCD flips the pixels, and they remain flipped until new data is sent, à la e-book readers, some laptops, phones etc. The problem is that it costs more to make a passive LCD display, it also takes longer to refresh the display when you need to update it (bad response times), and (AFAIK) the contrast ratio can suffer.
Active-matrix LCDs are cheaper to make, quicker to refresh when the image needs to change, and have better contrast ratios. Obviously those advantages come with increased power draw (plugged into the mains, that doesn't matter) and having to keep sending electrical signals to each pixel so it doesn't revert to a resting state.
At least that's my understanding of the differences. :undecided