Quote:
Extra performance at the expense of absolute image quality. Yay or nay?
I would rather run 1080p or 1440p native than 4k with any sort of upscaling.
I tend to tune the gfx settings of the game to maintain around 60fps at 4k, and would rather do that than use upscaling.
I play at 1440p and limit my framerate to 100fps, which is smooth enough for me. I'm not sure I've found a game that doesn't play nicely even with ray tracing with those points in mind. I'm sure it would be a different story at 4K though, so perhaps DLSS would come into play there.
Absolutely and especially whilst the functionality is taking up otherwise wasted silicon on the GPU.
Depends. If I'm playing Myst then no, graphics are paramount. Doom, however, is fast-paced enough that I don't notice a little degradation. In Monkey Island I can take or leave DLSS.
Ha ha I don't even know what either of them are!
Yes I am! Frequently just adjusting settings isn't enough to get good framerates: often you'll need like another 30fps, and moving from medium to low will only grant you an additional 10fps. Nowadays game performance reviews just show how every setting gives you one less FPS for every notch you go up, so having the possibility to gain a lot of frames with minimal to no degradation of quality in one quick setting adjustment is great!
Nope
TBH, I don't care.
If it gets to the point that I need it then I'll use it, but that's probably a long way off at the moment.
I doubt I'll ever use DLSS for at least 5 years... you can't buy a darn card to use it!
At the moment you might as well ask "do you believe in fairies?"
Sure, if results are acceptable then why not.
Save this question until we can compare both. M'kay?
Sure, but first I must buy a pair of GeForce RTX 3080s.
FSR as I still run on a 1080ti, if they make it available...
If GFX cards are halved in price... I'll buy a 3090, but right now it is just too much. If FSR is really good, however, I'd buy a 6900XT instead; they can be gotten rather easily here, or so it seems, and they're also more value for the money.
Due to the ongoing GPU shortage, I stupidly sold my 1080 to a customer earlier in the year, and the only replacement I could find was a 2060. DLSS has been a huge boon in getting playable framerates on my 2560x1440 display. Whilst I'd prefer *not* to use it, the 2060 isn't really up to handling that res in high detail, so DLSS has come to the rescue.
Haven't tried FSR, but from what DF has said about it, it isn't quite the same thing. I think we'll have to see how it works in real games.
Having invested a second mortgage in an RX6900 XT I would have to go FSR..
Personally, if an expensive GPU can't run games at 1440p and 4K natively, and needs upscaling/image reconstruction already, it's not going to have a decent lifespan IMHO. Despite the bleating from various tech channels, modern GPUs are still too weak WRT RT, and AMD/Nvidia are trying to sell underperforming GPUs for premium pricing. DLSS/FSR is basically admitting the hardware is too weak to really push these effects. It reminds me of tessellation - the ATI 8000 series had it with TruForm in 2001. It took until Nvidia Fermi/the HD5000 series in 2009/2010 for it to be really useable IMHO.
1440p and 4K have been around for years, and you can get monitors with these resolutions for under £200 now! It's more useful for people like me on older GPUs, and those who are trying to use newer entry-level and mainstream GPUs at 1440p/4K.
Don't need a heck of a lot for 1440p; the hurdle and grand myth is still native 4K. Also, that there are 30xx variants that are not on par with or better than GPUs from 2 generations ago is a big downfall as well. The only new thing is RTX - what?!? Why would I want more eye candy when the piece of crap can't run a minimum of 100fps native with ultra settings at 4K. :D
Just look at CB2077 - it doesn't even use that many RT effects, and with an RTX3070 you can barely get 30FPS on Ultra with RT on, at 1440p native resolution. With RT off you can just about get 60FPS (but lower minimums) on Ultra. The whole point of spending so much on a new GPU for gaming is to be able to turn up settings, but these GPUs struggle at launch without DLSS/FSR. On top of this, the RTX3070 has only 8GB of VRAM. It wouldn't matter so much when a 70 series GPU used to be £300ish, but now they are moving towards £500. A 60 series GPU is now moving towards £400. An 80 series GPU is well over £600 RRP.
Essentially, with the move to proper RT, we are back to 1080p for most gamers again, or 1440p if you want to use a fancier way of in-game upscaling. The RT effects are not as heavily used as they could be, because the hardware is not fast enough to do so - even Epic and other companies are looking at other ways to do things in addition to proper RT, i.e. Lumen.
So what happens in a year or two? Compare that with Pascal, and you will start to see that Ampere/RDNA2 isn't going to have a similar lifespan IMHO, unless you don't use RT or you use DLSS/FSR. It gets even worse when you look at the consoles - I don't think any GPU under £450 is going to convincingly beat an Xbox Series X, especially after devs start properly using the hardware on next-generation-only games (as opposed to jazzed-up versions of previous-generation games, which many channels don't seem to realise won't really tax the hardware, and so they low-ball console GPU performance).
AMD/Nvidia are pushing RT in everything, but the hardware is at least a generation or two away from being able to do it without resorting to upscaling/image reconstruction. However, both AMD and Nvidia are using RT to push up the price of their GPUs, and then trying tricks to hide their subpar performance in RT. Nvidia has to stretch the truth and say "better than native" for DLSS.
It's why, when Nvidia tried the same thing with tessellation, it was quite evident it wouldn't really matter for most mainstream GPUs (because by the time Kepler/GCN came along, the mainstream GPUs were too slow anyway).
This is my big issue - sure, I can get a rasterisation improvement over my current old GPU if I get a new one. However, I won't be getting an increase in VRAM, and RT just runs like crap from day one and will be irrelevant in a relatively short time. So I question whether it's worth upgrading. It's why FSR is more useful, I suppose, because I think it might be more prudent to wait another generation with what I have (if the GPU keeps working, which is my bigger concern than shiny effects).
Yes, absolutely. My goal is 60fps and low W usage, so even if I can hit 60fps, I will still use these to lower my power usage and keep my video card cooler.
Would I use ...?
Yes .... depending on the situation.
I'm not a hardcore gamer (not any more, anyway) and so my perspective would depend on whether the difference with, say, DLSS on or off enhanced or detracted from my gaming experience, and (assuming it was available) what the cost of raw hardware would be.
If the hardware solution was an extra £1000 (or probably even £250) then my options would probably be, in descending order :-
- use DLSS unless utterly unacceptable, or
- use lower settings, or
- play a different game, or
- just don't bother.
If I struggled to tell whether it was on or off, then yeah, I would.
There are things I will spend serious money on (cars, camera gear, some computer bits, and others) and some I won't (like gaming cards, unless I also need them for something else). Though, these days, not so much on cars either.
If I had to.. yes, but even then I'm still waiting for AMD's ML Superresolution, watch this space ;)
But I'd rather have the power to run native resolutions.
Jensen told me that the new 3xxx series cards were 8k gaming cards (with pricing to match) straight out of his oven.
I've since found out they just about deliver 1440 res.
But, but but DLSS say the tech press.
No thank you very much.
Oh, absolutely, despite my comments above .... provided the premium wasn't too big, and depending on the impact of the likes of DLSS on the enjoyment of the game.
Put it like this - I would prefer native res. Who wouldn't? But there is an element of bang-for-buck in that. A hardcore gamer would be prepared (funds permitting) to spend MUCH more on that than I would.
I used to utterly bemuse some of my friends with my choice of hifi components. Stax headphones? How much???? Michell turntable? HOW MUCH !!!!????? And so on. They might spend a few hundred quid on a system, but I'd spend that on a cartridge .... and I mean in the '70s and '80s, when a few hundred quid was way more than it is now. A Capri 2.8 Injection Special was, like, £8.5 to £9k .... and in some areas, you could buy a HOUSE for less than that (and I don't mean a dump in a dive of an area, but a nice if small place).
I was prepared to pay quite a bit for increasingly small incremental benefits in (my perception of) sound quality. So I think much the same logic applies to using DLSS or buying (currently, at least) way more expensive hardware. It all depends (for me) on how much difference there is in visuals between the two, and from what I see in reviews, it's not much, most of the time.
"Not much" might be enough for a serious gamer, like it was for me and hifi (anyone know where an ol' fart like me can buy a decent ear upgrade? :) ), assuming that serious gamer can actually afford the hardware, of course. But for me, it's all about how much marginally inferior graphics affect my enjoyment of a game. If it kills that enjoyment, it would be a non-starter. But if I can see it when I really look for it, yet it has little or no effect on my enjoyment of the game, then DLSS is a good option.
And of course, just as (sadly) my hearing isn't what it was for music appreciation, my eyes and reaction times aren't what they were for gaming either, which also affects how much my enjoyment would be impacted. Getting old is a ..... rhymes with itch. ;)
The problem is modern GPUs are giving you almost less for more. A top-end GPU 10 years ago would be the equivalent of £500~£600 in today's money. The equivalent is between £1300~£1500 RRP now, and that isn't taking into consideration street pricing, mining, etc., which has further jacked up pricing. This has also meant mainstream and entry-level GPUs are increasingly going up in price, and seeing worse progression from generation to generation. That means a mainstream GPU will have an even shorter lifespan. DLSS/FSR are making up for mediocre generational improvements over the last 5 years, and for AMD/Nvidia trying to push RT when it's really too much for the hardware. This is a cheaper way to increase their margins.
The issue is that even at launch, these GPUs can barely run taxing games at QHD/4K at 60FPS with RT on, etc. This is why they need DLSS, FSR, etc., to render internally at a lower resolution. So what happens in 2 or 4 years' time? A lot of hardware enthusiasts on forums upgrade every year or two, so they don't see this issue, and throw money at their hobby. But it's a different kettle of fish for most normal gamers, who don't change hardware anywhere near as quickly. These GPUs won't be suitable for running any of these RT effects to any degree by that time IMHO, and you will need to switch them off, defeating the point of spending so much. But AMD/Nvidia are using RT to say these GPUs are worth so much. Yet PCMR was mocking consoles for doing the same, but at least those are cheap.
So at this point, one has to question whether it's better to just ignore RT entirely, and stop chasing better-looking graphics. The latter was only viable because ATI, Nvidia, etc., not only made GPUs to fit all pockets, but we also had very good improvements each generation. However, now they all want to be like Apple, so they have decided to act like cartels and just push the pricing higher and higher.
The difference with a £1500 pair of headphones is that it's going to sound great in 10 years' time, and be competitive with newer equivalents. It will have resale value. A GPU will be a lead weight within 5~6 years, especially as it will lack hardware features and driver support. In 20 years the headphones will still be a thing you can use. IMHO, a £1500 pair of headphones, a turntable, an amp, etc., are better value for money.
I've been using DLSS Quality at 1440p with a 3080 whenever it is supported. It looks fantastic with the newer DLLs, 2.2+. In comparison, FSR is not looking so good at 1440p, and becomes unusable at 1080p on my laptop.
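For context on the "newer DLL" point: many games will happily pick up a newer DLSS version if you swap the library file in the game's install folder. A minimal sketch of the usual approach, assuming hypothetical paths (nvngx_dlss.dll is the customary filename; back up the original first, and note some games may object):

Code:
# Sketch of the common DLSS DLL swap. The paths below are hypothetical
# examples; nvngx_dlss.dll is the customary library filename.
import shutil
from pathlib import Path

game_dir = Path(r"C:\Games\SomeGame")             # hypothetical install folder
newer_dll = Path(r"C:\Downloads\nvngx_dlss.dll")  # the 2.2+ DLL you sourced

target = game_dir / "nvngx_dlss.dll"
shutil.copy2(target, target.with_name(target.name + ".bak"))  # back up original
shutil.copy2(newer_dll, target)                               # drop in new DLL
print("Swapped DLSS DLL - restore the .bak if the game misbehaves.")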
That's a good point. As the local knitting club likes to say, life's a stitch ;)
There's also the possibility that sacrificing resolution IQ frees up rendering budget for IQ benefits elsewhere - maybe lower res + RT reflections is better than higher res in some circumstances? Who knows for future IQ advances, too.
But for me, right now, and the kind of games I play, texture/geometry detail is pretty much king. I even accept some performance loss in games like Forza Horizon 4 so that my landscapes look super crisp, with lots of anisotropic filtering and negative LODs etc.
I would only use the upscaling to allow me to use ray tracing.
I only have an rx480 so if any games give me FSR I'll take it (Essential when I have a 1440p monitor). No point in buying a new GPU until the prices become more sensible.
I play at 3440x1440 on a 5900x and a 3090 and use DLSS if available. I can't see any difference in the gfx when the thing is turned to quality, but I do see a slight improvement in frame rate.
It's basically a free win; I can't understand why anyone wouldn't take it.
Yes I use it...and why not?
Most gamers - I'd wager over 95% - simply cannot tell the difference between DLSS on in quality mode and native resolution, in-game. Sure, you can take screenshots/pause etc. and hunt down the detail differences - but in reality that is not how we play games, and it generally looks good enough for people not to care or notice. The most common phrase in all of those "compare DLSS on and off" videos is along the lines of "If you pause here and look closely"... which says it all to me imo.
I personally don't care if a game needs DLSS to run at a respectable framerate with all the eye candy turned on - does it look good, run well and play well? Yes - in which case I'm happy. That's what most people care about.
I have a much bigger issue with games that force "Film Grain" and "motion blur" effects on us to hide performance/engine issues - those two settings make games look worse and affect everyone, regardless of your GPU ;)
Save your money - Linus already did a test of his staff, most of whom had no idea which versions of the games had stuff switched on or off. IIRC, Anthony got full marks, but only because he knew exactly what to look for and got right up close to see.
For me, my screen is 1440, so as long as it runs in that and looks good at reasonable frames, I'll use whatever. Basically, if I can't detect the differences without having to use monitoring software, it's all good.
You have a GPU over £1000, and it is so weak you need to use upscaling/image reconstruction from the get-go?? I honestly thought an RTX3090 wouldn't need it, and here I was complaining about the RTX3070, etc., not being great at it! :( So what is the point for the rest of us, if RT is so poor on modern hardware that the most powerful GPU can barely run it properly? Wow - that means an RTX3090 is going to struggle in another year or two once RT effects start getting more intensive, and AMD is even worse.
You have basically put me off wanting to buy a modern GPU from this generation for RT. I was already a bit marginal about it, as you can see, but it's cleared any doubt in my mind now. Thanks - I think that has made me finally decide I can wait another generation of GPUs, unless my hand is forced. I already know my mate's RTX3060TI struggles with RT unless they use DLSS in some way, and they were not that happy with its RT performance - and that cost them nearly £450. Anything under that is sub-console performance anyway.
Without RT, and with some fiddling, my old GPU seems OK enough to last another year or so, if the GPU keeps working. I might have to avoid the odd game or two, but I think it's OK. FSR might also help a bit. There's no point when even a mighty RTX3090 is running out of steam now - I tend to keep my GPUs for years, and even then I can't justify nearly £1500 on a GPU; it's not like my SFF PC can take such a beast anyway!
Roll on RTX4000 and RDNA3!
I said this in a previous thread on a similar topic but I don't get the hate for DLSS and don't see why you wouldn't use it if you have a card that supports it.
We're at a stage where DLSS Quality can produce better-than-native results with comparatively less GPU power/RAM, and unlocks stuff like RT (where performance is inherently tied to resolution). Since time began, games have been running effects at a fraction of screen resolution/lower precision, because running them at full resolution tanks performance for little to no visible benefit.
Every modern game engine will support DLSS (and FSR) going ahead, and even stuff like UE4 has built-in TAA as part of the rendering pipeline which can't be turned off. This isn't going to change, as is evident with UE5 and its built-in temporal upscaling solution - these are a world away from the crapshoot solutions in Far Cry 3 and the like years ago.
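To put rough numbers on "a fraction of screen resolution": here's a quick sketch using the per-axis render scales commonly cited for DLSS 2.x's modes (treat the exact factors as assumptions rather than official figures), showing what actually gets rendered at a 1440p output:

Code:
# Commonly cited per-axis render scales for DLSS 2.x modes (assumed, not
# official). Prints the internal resolution rendered before reconstruction.
OUT_W, OUT_H = 2560, 1440  # example 1440p output

MODES = {
    "Quality": 2 / 3,            # ~66.7% per axis
    "Balanced": 0.58,            # ~58% per axis
    "Performance": 0.5,          # 50% per axis
    "Ultra Performance": 1 / 3,  # ~33.3% per axis
}

for mode, scale in MODES.items():
    w, h = int(OUT_W * scale), int(OUT_H * scale)
    frac = (w * h) / (OUT_W * OUT_H)
    print(f"{mode:>17}: renders {w}x{h} ({frac:.0%} of native pixels)")

Even Quality mode shades well under half the native pixel count, which is where the headroom for RT comes from.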
Have you noticed how the tech media, AMD, Nvidia, etc. were all pushing those decent AA effects a few years ago? But with RT, suddenly all mention has gone away?
10~15 years ago ATI/Nvidia did try to make "adaptive" image quality for their GPUs, since they believed it would boost performance a decent amount for a small visual loss. Yet PCMR and the tech media called them out for cheating.
I even remember some of the tech media and enthusiasts hating on TAA because of its console origins. The more PC-focused devs refused to use TAA, because it frankly shouldn't be a primary AA method in PC games. After all, TAA is a very console-focused AA method, because consoles tend to use lower internal rendering targets and have insufficient processing power. The blur is there to hide the jaggies of low-resolution rendering from a distance. It's a peasant-level AA method, made for weak systems. The only reason it's in so many PC games is because consoles are increasingly the main development platform for said games, and the devs CBA spending any effort pushing other methods.
This is also why relative comparisons are a big problem. People don't step back and look at what is actually achievable in games as a whole.
It's like those salespeople at PCWorld/Currys who have two TVs, and then fiddle with the settings on one, and show how one is better than the other. Then they try to push you away from looking at the other TVs.
Companies know this relative marketing works. Look at PhysX. More attentive people started to show, even back then, that Nvidia was actually removing simple CPU physics effects when PhysX was off, but when it was switched on they suddenly turned up, together with the other effects. People also forget games such as Red Faction, which did insane physics effects just on the CPU.
Before and After is a classic marketing tactic. QVC uses it, the local person at the market uses it, etc.
Even with the DLSS and FSR hype, it makes me laugh. A few years ago, the same PCMR was massively against upscaling/image reconstruction, and was slagging off consoles when Sony/MS promoted 4K TV gaming with checkerboard rendering, etc.:
https://venturebeat.com/2016/09/08/p...doesnt-matter/
On so many enthusiast websites, PCMR was laughing at consoles being "fake" 4K. Now the tune has changed, because it's in vogue on PC. So where are all the articles now saying the current GPUs are not "true 4K" RT GPUs, because they need upscaling/image reconstruction, etc.? Upscaling is a benefit for PC, and a con for consoles? Hypocrisy comes to mind! ;)
A few years ago it was all about 4K, 120FPS, HDR, native-resolution rendering (unlike those consoles and their "fake" rendering), supersampling - and now it's back to 60FPS, RT, upscaling, etc.
Are DLSS and FSR useful additions to the toolkit? Yes, they are. But what I don't understand is the utter hype from enthusiasts, like it's the second coming of the creator. Upscaling/image reconstruction methods have existed in PC games and on consoles for years. It's almost Apple-like in some ways now.
Yeah. Partly that's not their fault - the common way to render games these days unfortunately rules out a lot of AA methods without significant dev work, and the choice tends to be between FXAA/MLAA-type methods (which FSR is close to), TAA, and full-on supersampling. Hardly anyone has the rendering budget for supersampling (but anyone can enable it via desktop resolution/control panels). FXAA methods are poor... so TAA is the best of a bad bunch.
Yes, but the issue is they could talk about it. It's quite clear a lot of PC games are really console ports now, not only in the use of TAA, but even down to certain aspects of the design (and the dumbing-down compared to their PC-only ancestors), or even stuff such as the control systems, the lack of control customisation, or controller-focused combat systems. It's only some of the smaller, independent reviewers who seem to highlight this now IMHO.
But they chose not to, especially with all the nonsense "better than native" rubbish, which is almost Apple-level marketing. Then they read the marketing documents, and think upscaling and image reconstruction are separate things.
Also, some well-known review sites have buried some of the issues with image reconstruction methods, especially in motion. Only a few good ones have started to look at the claim, and realise it comes with "caveats", but that's drowned out by all the others who CBA. Basically, you're being sold less for more, but made to think you got more, whilst the wheel gets reinvented. Apple is masterful at doing it.
I have seen so many instances where these reviewers seem to not even report accurately what their own comparison images show, or even ignore motion artefacts - and this is from some major reviewers. These people work with things such as PS etc., so they should basically have a clue what AMD/Nvidia are doing to make the images "look better"; they are just trying to overmarket something as being greater than the sum of its parts.
I get the argument in the entire post, but in reference to that point - that's absolutely where I am.
I'm willing to fork out a reasonable chunk of change on a GPU (and "reasonable" is a subjective term), but it's not, in my case, about gaming. It's more about video encoding/transcoding.
WRT gaming, :-
- I'm not a hardcore gamer these days, and
- I'm much more about actual gameplay than fancy graphics.
By which I mean, I'm not particularly into action/shooters, and, to reference classics, I'm just as happy with a Myst/Pirates (Threepwood) as with Quake, and in either case, I'd rather have good plot, puzzles and/or humour than fancy graphics.
And I know it's a very personal perspective based on my interests but, yes, I most definitely am not chasing better graphics. That's why I'd use DLSS etc., provided it wasn't a huge compromise (and from what I read, it isn't), rather than throw lots of extra wedge at high-end cards.
To put that another way, I can afford to throw a fair bit of cash at tech toys, and am willing to provided it is a high enough priority, but I can't throw lotsa wonga at every tech toy. It's an "opportunity cost" and diminishing-marginal-benefit issue. Gaming graphics just aren't a priority, or rather, not a high enough priority. There's a long list of higher priorities, and sadly, the list is longer than the cash available to indulge them all without limit.
I'll use whatever gives me the best image quality/fps/responsiveness combo. I think we all would; even those claiming they would never use DLSS/FSR would do so quietly if they found it worked for them.
Don't get me wrong, FSR/DLSS/TAAU are useful technologies, but my main concern is that we need to use them from day one on expensive GPUs. It's one thing if you need to use them in a few years, or are buying a £200 GPU and want to game at a decent resolution.
But if it's a case of having to spend £400+ and use it from the get-go, it means the GPU is already going to struggle with the effects over time. If I can't run the effects for a large percentage of the lifespan of the new GPU, I might as well just use FSR/TAAU/in-game scaling on the old GPU I have and be done with it.
I think the main issue is that I intend to keep the GPU for a few years, while many just upgrade so frequently that it never becomes a problem.
Why would you not use them if you can't run at max settings at native resolution? A blank screen at 40000k is still a blank screen; resolution =/= detail.
If it's a small difference in game settings then sure, go native. But if you have to set everything to low anyway, then using a combination of upscaling and tuned settings gets you most of the way there and looks so much better. No way does 4K native on low settings look better in new games than 4K at 80% scale upscaled on max settings. When you turn the settings to low you remove whole details and effects, not just reduce texture resolution, so how is seeing low-res, detail-missing textures at 4K native better than seeing much higher-res, full-detail textures at 80% of 4K upscaled? Think about draw distance: on low you get lovely close textures, and a few steps away they turn crappy with no details at all; on high it goes for miles regardless of the resolution. I want immersion and realism, not the warm fuzzies of knowing I'm seeing less impressive images at a higher theoretical resolution.
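For a concrete feel of the pixel budget in that "4K at 80% upscaled" example (a sketch; 80% is just the scale mentioned above):

Code:
# Pixel-budget arithmetic behind "4K at 80% upscaled": an 80% per-axis
# render scale shades only 0.8^2 = 64% of the native pixel count.
native_w, native_h = 3840, 2160   # 4K output
scale = 0.80                      # per-axis render scale

internal_w = int(native_w * scale)   # 3072
internal_h = int(native_h * scale)   # 1728
pixel_ratio = (internal_w * internal_h) / (native_w * native_h)

print(f"internal render: {internal_w}x{internal_h}")
print(f"pixels shaded:   {pixel_ratio:.0%} of native")  # 64%

Roughly a third of the shading budget comes back, which is what pays for the higher settings and draw distance.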
I'm still using 1080p gear at the moment so I don't really need it but both give impressive results. I'd have no problems using either. First proper taste will probably be when FSR hits the new consoles (and I can get hold of one!).
Considering I'm still on an ancient GTX 1080 (non-Ti), anything 3000 series is unobtanium and my card struggles with running modern games in native 1440p without dialing down the settings a lot, I'd gladly use FSR to effectively run games in 1080 and upscale to 1440 if the results are decent enough. It might also help my 2nd/old PC to linger around a bit longer with its GTX 970, if FSR manages to get games that would normally struggle to run up to playable framerates (read: close to or exceeding 60).
DLSS being RTX-only, I simply don't care about at all ("yay more FPS for people who already blew tons of money on a new GPU! oh you're on an older card? well forget you then :D")
There's already a mod for GTA V/Online to cobble-in FSR support and it seems to do pretty well, even if it's not official and therefore probably not as effective as it could be. Sadly one of the games I play the most (CoD MW19) and which could use some help to get running better only supports DLSS so it'll continue to either run like or look like rubbish for me unless I shell out more than what my entire PC is worth for an objectively pretty mediocre RTX 3070 or something...
Not exactly fun times for PC gaming in general IMHO :|
edit: didn't think this forum was full of wusses WRT language. why bother putting effort in posting here at all if I get warned left right and center. screw this "family friendly" clown show.. I'm out
Depends how bothersome the visual artefacts are, really. DLSS in Warzone, for example, isn't that great; there's a fair amount of ghosting and blur on distant objects, something I can't tolerate for long, especially in a fast-paced competitive game.
I've read that 2.2 apparently fixes these issues, but there isn't an official update yet, so in the meantime I'll just stay at native res, which my 3060 Ti comfortably handles anyway.