AMD's version of DLSS will work on older GPUs like the Radeon RX 500 and GeForce GTX 10 series.
Erk... hope it looks better than that video. Not impressed, but maybe it was just YouTube funkiness.
These upscaling methods look far better in still images, but all exhibit trailing/blurring artefacts in fast-paced movement scenarios.
Brilliant move by AMD... offering DLSS-like functionality for the GeForce GTX owners that Nvidia abandoned.
Hoping this plays out similarly to G-Sync vs FreeSync, as consumers definitely benefitted from AMD bringing competition to Nvidia (I benefitted, and I haven't owned a Radeon GPU for ~20 years).
Sort of to be expected... it's a compromise by design, although I can't say I see any trailing/blurring on my system with DLSS enabled.
I'm far more sensitive to uneven frame-rates than to rendering perfection.
Whilst there is a difference between native 4K and DLSS-generated 4K, it's not much of a compromise at all to get really fast, fluid frame-rates that most GPUs would struggle with.
The benefits of DLSS, and hopefully FSR, are (for me) preferable to switching the resolution down a notch or two.
Bit disappointing that it's dependent on AMD and the game developer supporting it rather than something more universal; something like this requires a certain level of commitment that there's no guarantee of having in the years to come.
We'll have to see what it entails, but I imagine, like Nvidia's solution, you need access to high-resolution assets further up the pipeline/offline. Developers are the obvious route to getting that, since they could do the training on assets that aren't necessarily shipped with the game (e.g. very high-res textures that are too large to distribute); otherwise you're limited to just supersampling on the client, which will render geometry at higher resolution but not necessarily textures.
But it might be possible to, say, run a training set as part of game installation. I remember games used to do that when texturing was just beginning to come in.
If it works on the consoles, I think it will quickly become the standard, and unless developers are getting cash drops from Nvidia, I can see them not bothering with DLSS.
What makes it harder is that it's not the same image side by side, and suddenly missing NPCs can distract from the comparison. Unfortunately, as Godfall is a game where no two runs are the same, this can make it extremely hard to compare. The semi-static 4K scene does look very good, but in the 1060 test from 3:00 onwards there were instances of noticeable fuzzing across the screen. Will definitely need more side-by-sides rather than that short clip.
But even so, it's a slight quality dip in exchange for keeping your GPU viable for longer; only time will tell if customers are happy with that.
That's how I perceive things as well... each new set of Nvidia RTX/GTX drivers adds more FreeSync monitors to the list of those officially supported.
Console support for FreeSync is also a significant driving force, as is AMD's influence over the Vulkan API and tooling, so FSR support and adoption are likely to ramp up over the next year or so.
TBH the 1060 demo looked awful. It's not YouTube image compression when the FSR-off side is considerably sharper. Compare:
1) The rock textures and fidelity on either side
2) The vegetation overhanging the bridge
3) The chess-pawn-shaped objects on either side of the arch
And this is on the second-from-highest image quality mode? I dread to think what the Balanced and Performance modes look like; based on this image, you'd be better off just setting the render scale to 80-85%.
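For context on that render-scale comparison: FSR 1.0's quality modes correspond to fixed per-axis scale factors (the mode names and factors below are from AMD's published FSR 1.0 material; treat this as a rough sketch rather than a spec). A small script shows the internal render resolution each mode upscales from at a 4K output:

```python
# FSR 1.0 quality modes as per-axis upscale factors
# (values assumed from AMD's published FSR 1.0 documentation).
FSR_SCALE = {
    "Ultra Quality": 1.3,
    "Quality": 1.5,
    "Balanced": 1.7,
    "Performance": 2.0,
}

def render_resolution(out_w, out_h, mode):
    """Internal resolution FSR renders at before upscaling to the output."""
    factor = FSR_SCALE[mode]
    return round(out_w / factor), round(out_h / factor)

for mode in FSR_SCALE:
    w, h = render_resolution(3840, 2160, mode)
    print(f"{mode}: {w}x{h} ({100 / FSR_SCALE[mode]:.0f}% per-axis scale)")
```

So "Quality" mode at 4K renders internally at 2560x1440 (about 67% per axis), which is why a plain 80-85% render scale is a noticeably gentler downgrade than the lower FSR modes.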
So my Intel HD will do 100fps at 720p while playing The Witcher 3?