Freesync with LFC is 35-144Hz
https://www.techspot.com/article/1779-freesync-and-nvidia-geforce/
Shame these guys found little to no issues.
Yet again Nvidia lies and wants you to shell out for expensive proprietary crap.
Actually it is any monitor where the highest refresh rate is at least 2.5x the lowest. So if the lowest is 35Hz, you only need a maximum refresh of 88Hz to get LFC (which is why it's a shame so many panels top out at 75Hz).
Edit after doing my own paranoia fact check: https://www.amd.com/Documents/freesync-lfc.pdf
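For anyone who wants to see the arithmetic, here's a rough Python sketch of the 2.5x ratio check and the frame-repeating idea from the AMD whitepaper linked above. The function names and exact behaviour are my own illustration, not anything lifted from a real driver:

```python
# Minimal sketch of the LFC eligibility check and frame-multiplication idea,
# based on the 2.5x ratio rule in AMD's freesync-lfc.pdf.
# Function names and thresholds here are illustrative, not driver code.

def supports_lfc(min_hz: float, max_hz: float) -> bool:
    """LFC needs enough headroom to repeat frames: max >= 2.5 * min."""
    return max_hz >= 2.5 * min_hz

def lfc_refresh(fps: float, min_hz: float, max_hz: float) -> float:
    """Pick a refresh rate inside the VRR window by repeating frames."""
    if fps >= min_hz:
        return min(fps, max_hz)          # within range: refresh tracks fps
    multiplier = 2
    while fps * multiplier < min_hz:     # double/triple frames until in range
        multiplier += 1
    return min(fps * multiplier, max_hz)

print(supports_lfc(35, 144))    # True  (144 >= 87.5)
print(supports_lfc(48, 75))     # False (75 < 120), why 75Hz panels miss out
print(lfc_refresh(20, 35, 144)) # 40.0, each frame shown twice
```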
Well, it is expensive and proprietary. If it were proprietary but inexpensive, who would complain about it adding £10-£20 to the price of a monitor? It honestly can't cost that much to produce the module in volume, and the testing can be automated and added to the QA lines. So that just leaves whatever Nvidia charges for the privilege of adding it to monitors. Clearly it isn't that big a selling point, otherwise we wouldn't be seeing the market tilt more heavily towards Freesync.
A 20% price difference is quite large; if it were 5%? Not so much.
Depends on what you consider "expensive".
What experience do you have of the pricing of similar modules, or of how much it costs NVIDIA to produce the module, or is that pure speculation?
How do you suppose they automate the testing procedures? Which of their tests could be automated, and how would that result in a better product? I'd rather NVIDIA actually tested the monitors properly than did it as cheaply as possible.
It does cost that much at the volumes and with the techniques Nvidia used. What they would need to do to get the price down and the volume up is get it integrated into the monitor's control ASIC, as that simplifies the design and the interfaces are already there, so you don't end up with duplicated DP in/out ports, duplicated buffer memory and a big FPGA (those things really aren't cheap). One fewer hop in the chain potentially makes for a lower-latency display as well. That's what Freesync does, and it's why ultimately I expect all Gsync panels will underneath be a standard controller with Freesync, with a Gsync board plugged in front. Because why would anyone design a modern monitor ASIC without Freesync ability?
This late in the day it feels like that is only the case because Nvidia had no choice. Intel are about to release Freesync-compatible CPUs, so you could plug your Freesync monitor into an Intel IGP port on the motherboard and tell the drivers to render on your Nvidia card and output through that port. It was done before with Nvidia cards working through AMD cards, but that's a bit odd and fringe; how many people with Nvidia cards also have an i5 or i7 with an IGP?
It would have been really really nice to feel that Nvidia did something for the benefit of the customers. I'm not feeling that here. Still, the world is a better place for the feature.
I think the biggest driver for Nvidia's change of heart is HDMI 2.1. Soon all tellies are going to get Freesync 2, and they knew their market share would diminish. So they made a last-ditch effort to throw shade on AMD's tech. It's only fanboys who are falling for it.
I'll stick with Fast or Adaptive sync for my 1070 running my 4k monitor (FreeSync). All the way down to around 34fps and it's nearly butter smooth with no screen tearing. Anything higher than 40fps is fantastic and as smooth as crisco mixed with butter mixed with olive oil.
Can't justify the (ridiculous!) cost of G-Sync when features built into the current nVidia drivers make the hardware/technology obsolete.