And AMD rebrands FreeSync 2 as FreeSync 2 HDR - a more apt, descriptive name.
Well, the 'HDR' component of it is complete garbage, TBH.
400-nit 'HDR' is the 30fps of HDR rendering. It'll do the job, but you'll hate it after seeing a full-fat 1000-nit display.
I have an HDR400 screen and I haven't used an HDR1000 one, so I can't comment too much on the *actual* differences.
But I've been playing Forza Horizon 4 with HDR on, at night, in a room that isn't all that brightly lit - I'm not talking candlelight, just a single 80W-equivalent LED light.
Forza has day/night cycles, and in the night cycle it's pretty dark. The headlights reflecting off a stop sign are so bright they make me squint if I'm looking at them. And that isn't a huge area of the screen, at what I imagine is peak HDR400 brightness. I can't imagine anything brighter than that - 2.5 times the brightness would melt a hole out of the back of my head.
I don't feel like I'm missing out with peak brightness being lower.
AMD have released a statement on this. TL;DR: a monitor that only just reaches DisplayHDR 400 won't be good enough for FreeSync 2. These existing monitors don't quite manage DisplayHDR 600, which is the minimum AMD recommend for new monitors, but they were designed before the HDR numbering system existed.
https://www.techpowerup.com/245533/a...dr-controversy
As for the mass of expensive components: I work with FPGA chips for video use, and they are over an order of magnitude more expensive than an ASIC solution. All monitors have a scaler ASIC built in to take SerDes inputs from DisplayPort and HDMI, store the image in a frame buffer, drive the panel, handle the UI and so on. By making FreeSync an open standard, AMD's logic is simply built into the scaler - it does still exist (hence you can't get just *any* old DisplayPort monitor to run in AFR).
By insisting on an extra logic board, Nvidia are duplicating the SerDes channels, memory controllers, RAM and a lump of PCB to mount it all on. It's a horribly wasteful system, which only makes sense if you view it as a licence dongle rather than a hardware solution. Now that all high-end scalers probably have FreeSync built in as standard, the waste becomes almost painful.
VESA DisplayHDR has multiple classes: 1000, 600, 500 and 400.
1000- and 600-class displays are very expensive and beyond most consumers. All AMD are doing is mandating the basic minimum 400 tier as part of the spec; anyone is free to go higher.
The Samsung CHG70 series is one of the few FreeSync 2 monitor lines on the market, and it is VESA DisplayHDR 600 certified. LG are now selling a 32" VESA DisplayHDR 600 display that costs over $1,000.
400 is not great, but until prices come down it's the smart move as the minimum.
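The tier gating being discussed can be sketched roughly in code. This is a toy illustration of classifying a panel's peak luminance against the tiers named in this thread, not the real certification test - the actual DisplayHDR spec also checks things like black level, colour gamut and dimming behaviour, which this ignores:

```python
# Hypothetical sketch: map a panel's peak luminance (in cd/m^2, i.e. nits)
# to the highest VESA DisplayHDR tier that brightness alone could satisfy.
# Tier numbers are the ones mentioned in the thread; real certification
# tests far more than peak brightness.

DISPLAYHDR_TIERS = [1000, 600, 500, 400]  # highest tier first

def displayhdr_tier(peak_nits):
    """Return the best tier the peak brightness could meet, or None if below 400."""
    for tier in DISPLAYHDR_TIERS:
        if peak_nits >= tier:
            return tier
    return None

print(displayhdr_tier(450))   # 400 - a typical 'HDR400-class' panel
print(displayhdr_tier(600))   # 600 - e.g. the CHG70-class displays above
print(displayhdr_tier(350))   # None - not even the minimum tier
```

So under FreeSync 2's mandate, anything returning None here would be out, 400 is the floor, and the 600/1000 panels discussed above sit at the expensive end.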