Quote:
Are you eager to get a taste of real-time ray tracing, or are you inclined to wait and see?
You ask such things on a forum like this?
Yes, of course my GPU must support RTX, along with 64k resolution and 8,000fps at a million Hz on a 50" display, with all the (optional) RGB decorations you could want... silly question.
But... The important part is that RTX, and whatever hyperbole I just typed, is implemented properly.
At the moment, everything in RTX games is shiny as heck, to show it off and prove it's RTXed. It's like 3D films where everything is full-on, Comin' At Ya, in-your-face 3D, presumably so you feel you're getting your money's worth.
Once it's grown up, moved out, gotten a girlfriend and maybe settled down to being a quiet background effect, then it will be of real interest... and possibly even affordable.
No. I'm not interested in ray tracing but if I get a GPU that supports it, I'm okay with it. I'd rather have raw performance to drive my high refresh rate monitor.
Not from Nvidia at those prices
No. The majority of people wanted to be able to push higher framerates for VR and 4K gaming and Nvidia delivered reflections and a slight performance gain for twice the price of the previous gen cards.
I wasn't interested in it originally, but upgraded from a 970 to a 2080ti. Shadow of the Tomb Raider and others sure look pretty with the RTX features on. Sure, they're not the be-all and end-all, but it does make a difference in the right game.
Not worth the performance / price penalty maybe, but that's the way of new tech. Hopefully performance will get better with better software / driver support.
No...
Not at the moment. Maybe in a couple of years' time, when the tech has matured a bit and is available at an affordable price.
My current one already does, and the fact that the next generation of consoles will support it too means more and more developers are going to run with it to make their lives easier. Anyone who thinks their next GPU doesn't need ray tracing is going to be changing their mind in the next year or so. No matter how much you argue it isn't needed, we are getting it, and you are going to need hardware to support it.
Well by the time I next upgrade I imagine the card will support Ray Tracing. It may even support something else by the time I get around to it.
For me it's one of those things that the higher the resolution the less impact the setting makes. The fidelity increase that a resolution bump gives tends to trump most fancy things.
But even if it wasn't, I just don't see it as anything worthwhile. It's really not all that impressive, especially when you factor in the cost, both financial and graphical.
Yes, as long as it supports 4K60.
I'm playing games less and less as I get older and upgraded only last year. If I am still playing games in 3 or 4 years time, I expect the new card will support ray tracing and some other feature will be new and expensive.
Yes, I want RTX. Just because the computational power isn't there right now doesn't mean that RTX isn't the future. We only need a performance increase of 4x for 4K, so it'll be 2-3 generations for the top-end card and 3-4 for the mid-range card. And 1080p should be RTX heaven in the next generation.
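Rough math on that, if you want it. The per-generation uplift figures below are my own assumptions, not anything Nvidia has promised:

Code:
import math

target = 4.0  # the ~4x RT throughput jump needed for 4K, per the claim above
for uplift in (1.35, 1.5, 1.6):  # assumed per-generation gains, my numbers
    gens = math.log(target) / math.log(uplift)
    print(f"{uplift}x per gen -> {gens:.1f} generations to reach {target}x")
# 1.35x/gen needs ~4.6 generations, 1.5x needs ~3.4, 1.6x needs ~2.9,
# so "2-3 generations" quietly assumes roughly 1.6x or better per generation.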
NVidia marketing has done a brilliant job of brainwashing people into believing that RTX == real-time ray tracing. It isn't - it's a clever way to deliver select effects on top of largely rasterised graphics without allowing the noise and inaccuracies from such a minimal implementation to be too obvious.
Real-time ray/path tracing may be the future, but the current "RTX" implementation is only an initial stab at it with the current limited resources that can be economically (debatable, at that) applied. It's just smoke and mirrors at the moment, and I'm not prepared to pay for it.
In the meantime there is still value in improving rasterisation performance - frame-time consistency, lag and minimised jitter are extremely important; turning up eye-candy on an inadequately capable rig causes major and minor stutters that make me feel motion sick, and RTX makes this worse, not better.
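For the curious, here's a toy sketch of what I mean by "select effects on top of largely rasterised graphics". Every name in it is a hypothetical stand-in, not any real graphics API:

Code:
import random

# Rasterise everything, ray-trace only a couple of effects at very low
# ray counts, then denoise so the sparseness isn't too obvious.

def rasterize(scene):
    """Conventional raster pass: cheap, shades every pixel (4x4 toy image)."""
    return {"albedo": [[0.5] * 4 for _ in range(4)],
            "depth": [[1.0] * 4 for _ in range(4)]}

def trace_rays(scene, gbuffer, effect):
    """Stub 'RT core' pass: ~1 ray per pixel is affordable but very noisy."""
    return [[random.random() for _ in row] for row in gbuffer["depth"]]

def denoise(noisy, guide):
    """Stub denoiser: a crude average standing in for a real spatiotemporal filter."""
    flat = [v for row in noisy for v in row]
    mean = sum(flat) / len(flat)
    return [[(v + mean) / 2 for v in row] for row in noisy]

def render_frame(scene):
    gbuffer = rasterize(scene)  # the bulk of the image is still rasterised
    refl = denoise(trace_rays(scene, gbuffer, "reflections"), gbuffer)
    shad = denoise(trace_rays(scene, gbuffer, "shadows"), gbuffer)
    # Composite: mostly the raster base, with a veneer of denoised RT on top.
    return [[a * s + 0.2 * r for a, r, s in zip(ar, rr, sr)]
            for ar, rr, sr in zip(gbuffer["albedo"], refl, shad)]

frame = render_frame(scene=None)

The point being: the denoiser does a lot of the heavy lifting, which is exactly how such low ray counts can pass for "real-time ray tracing" at all.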
Ha ha whats Ray Tracing?
Mine does, but I'd rather they make the RTX 2060 SLI-enabled so I can get another at some point. I know one more powerful card is better than two less powerful cards, but there's something about multi-GPU that I just like.
Does it "need" ray-tracing? No.
Do I want it? Not if there's much of either a price premium or performance hit.
Would it even be on the criteria list on which I based the choice of next GPU? Frankly, no. The only thing I can conceive I might use it for is gaming, and I have enough other things that give me a problem with modern gaming for R-T to even be a slow-moving and extremely distant blip on my "needs" radar.
No, I couldn't care less; modern engines do a perfectly adequate job of faking realistic lighting for the purposes of playing games.
Not going to lie, after seeing Minecraft with the raytracing mod I do want it.
Ray-what-now?
As of this moment, no. When it's time for me to upgrade a few years from now, I'll reassess the need.
Yes, it already does. I think ray tracing is the biggest advance in graphics in years. Obviously more games need to start supporting it.
Yes it will, due to being (probably) a 4080 or 5080ti, depending on when I choose to replace my 1080ti (which, laughably, will do RT).
Will I switch on RT? Of course - I feel sad when I look at the puddles in BF5 without it :)
At the moment RTX is popcorn with no butter. I really want to see a 2080 Ti with 16GB-32GB of GDDR6 or HBM2/3 VRAM. Having more VRAM would help more than fancy lighting.
Yes, I am already spoilt using it in games on my 2080Ti.
It is the future, even for consoles.
Would love to see it from AMD, Intel and Nvidia using MCM in future.
Have a separate die for ray tracing, another for AI, attached to the main GPU, instead of trying to cram it all onto a massive die that costs a fortune.
No... I'm not paying for something that's not quite there yet... what do you think I am, an Nvidia fanboi?
Yes, but only because 1080p/midrange cards don't exist anymore.
Hopefully, by 2022, somebody will release a midrange card, and by then I assume raytracing tech will have matured enough to be feasible. That's when I'm getting a new card.
I'm happy with my rx580 for now
It does if you want me to buy it... :) Everything coming (pc, console etc) will have it, all engines etc. At this point, I want to enjoy everything coming. I can't do that without RT, DLSS, VRS. Get all 3 or I go NV 7nm next.
Will I like how they use it in EVERY instance? Of course not, as it will take a while for devs to get stuff right and figure out how the hardware all works etc. E.g., the BF5 devs said it only took 2 weeks to speed up 50% after they had time to work with the drivers, and they said they had MORE perf left to extract. NV says they are working with every dev to get it into PC games. Again, sign me up, even if I only want 1/4 of the games (25 or so coming now) currently in dev, never mind ALL that will come for years. These 3 technologies are the future; it's just a matter of figuring out where/how they work best in your games.
You would have to be selling cards at a MAJOR discount for me to dump these features. Which is why AMD should have used more silicon and put it all in. Whiners can complain all they want, 75% of the cards on steam are NV for a reason. It is not PRICE. Gamers PAY for perf and features. How the heck do you think HEDT started? Yeah, some of us pay MUCH more, and NOT because we're rich.

CPU, different story, I'll take AMD probably, though still pissed about gaming (how long until you get this right? AGAIN go BIGGER DIE!). At 7nm, AMD should have lost ZERO use cases, games OR apps. PERIOD. You should have designed to BEAT intel in ALL games. PERIOD. You already have apps, fix the dang games. All the reviewers have to run 1440p/4k to hide your weakness.

Still on the fence until I see 570 boards drop I guess. BF this year will be fun, but until then I can wait. See how that works AMD? If you had WON games (like everyone I know expected...ROFL), I'd have already bought 2 with another 1-2 for black friday. Now? Uh, I can wait. I would have taken a chance on the board if AMD won games; now I will wait out some homework on who's best etc, for what I want. By xmas, we'll be on 570 v2 boards with all the teething crap fixed most likely :)
Quit aiming at poor people AMD. Always go rich, BIG, first. I don't pay premiums for MISSING features, or GAME losses. For those two PROBLEMS, I expect bargain bin prices, which would REALLY hurt my stock AMD (LOL). Please start acting more like NV/Intel. AMD is getting there, but still more work to do, as the gpu launch is a joke if your 7nm doesn't EAT 2080ti 12nm for lunch. At worst you should have won EVERY game at like wattage.

Anything less, you get PRICE CUTS pre launch...ROFL. Do the math, WIN or lower prices. Want ASPs up? WIN. PERIOD. Want the steam survey to change from nv 75%? WIN. PERIOD. That means GROW THE DANG DIE size! Savings here is peanuts compared to PROFITS gained from WINNING and charging premiums because you are WORTH IT.
No, I have absolutely no requirement for ray tracing or any high powered GPU.
RTX-compatible? No.
Generic DXR-compatible? Possibly.
Regardless, I intend to stick with my RX Vega 64 until a suitable upgrade is available at a reasonable price (~£300) with further features. At present, for the games I'm playing, I'm happily gaming away at 60 FPS at either 1440p or 2160p.
DXR will only be relevant for me for games that presently don't exist. Furthermore, even with a 2080 Ti, gamers aren't really getting a decent ray tracing experience. Existing cards just aren't powerful enough to do this in real-time without serious drops in framerate.
To be honest, I'd settle for smarter fake ray tracing in Unreal Engine 4 or Unity.
Not a half-assed implementation like RTX that costs us 50% performance for barely any actual improvement in quality.
Once the technology, software and financial sides are all ready, then it will be welcomed. It's too soon for ray tracing in real time.
Erm, No.
Ray tracing cards are far too expensive for me right now, and I'm too skeptical of all the hype. Maybe in another generation or two.
I don't care about ray tracing at all, the speed of the card in general is what matters, and my next card will be purchased based on how well it'll run Cyberpunk 2077 :P Hence I shall be waiting until then to bother upgrading, if indeed it is actually needed!
Maybe my next one, as I have no plans to upgrade for a few years; by then it should be commonplace and hopefully not carry a premium.
I can't afford/justify ray tracing; it will be affordable one day.
I couldn't care less about ray tracing, and given the cost of the RTX cards it's a definite NO for me.
Hi and Hello
I am more interested in what will come next that is similar to it; it will arrive sooner or later, if a better method isn't found first. Going for direct support from only one brand causes limitations, and as consumers, whether industrial or private or whatever, we've got to tell the companies that they are not going to gain any edge unless they start supporting the same features. Nvidia or AMD don't suddenly reinvent the wheel whenever they pound out a new product; it also has to be user-friendly and support a wider standard, rather than this or that additional piece of special equipment we have to buy to get the last 5-10% of performance or the like.
If RTX is anything to go by then I'd rather it didn't TBH. Visual effects are IMO disappointing to downright silly, performance is terrible, and die cost is beyond unreasonable. Maybe future nodes and architectural improvements will help with all of the above but I can't help feeling Nvidia's implementation will be at the expense of raster performance for a given die size.
So I remain open-minded for the future, but as it stands, nope. There's little logic in buying something when it doesn't work 'because it's the future'. And either way, similar to what others have mentioned already, brand-locked nonsense repeatedly and demonstrably fails in the GPU market, so I'd be more inclined to wait for the standardised implementations which don't require specific hardware which may well not exist in future GPUs, breaking backwards compatibility, and being limited to a handful of sponsored games.
Huge performance and price penalty for a bit of eye-candy. RT is the 2019 version of Hairworks.
Sadly there is no shortage of idiocy, and that fact only encourages Nvidia.
By the time ray tracing is actually needed for an optimal gaming experience, all these RTX cards will be obsolete.
High resolutions & frame rates matter more at the minute, being able to push up to 4k or high refresh even on mid tier cards is vital.
I'll wait till mass adoption, and that means next-gen consoles being released. When they have it, I'll look at getting a card that supports the same implementation they use, as that's what devs will target. Even then it's less important to me than being able to drive games at 60+ FPS at 1440p, as I'll notice that much more.
I always think it's fun to read posts like these using only the ALLCAPS words (not including abbreviations), so without further ado:
Which does still carry a lot of the original message tbh.
Owning a 4K monitor and a 1080 Ti, and having to dial some games down to QHD for playable framerates, I'd rather give ray tracing a miss for now instead of sacrificing even more performance. I think I'll wait and see if AMD brings out an RX 5800.
No. I just want performance, games look great at the cutting edge using traditional faked world lighting unless you're sitting still admiring the shiny dirt for hours on end. Big fuss over a reasonably minor IQ blip which isn't even efficient yet.
Maybe not NEXT, but it will eventually
It's affordable now, get a better job. If you can't afford a 2060 yearly (heck, a whole PC if I'm frank with you here), really, quit burger king and get a real job. Or heck, just quit cable TV or your cell bill and you're gold (or join something like TING etc). I quit both; my PC is free every year if wanted :)

In my defense, as a PC tech (who is forced to use a phone daily...LOL), you hate when your home phone rings in most cases :) I'd only carry a cell for on-call these days when forced.

As for cutting the cord for TV? No need for defense, only fools/rich still pay for this. With a roku and some pay channel (ONE at a time) like netflix, hulu, hallmark (best deal IMHO), etc, you really can't watch all that is there unless you don't work and are watching TV 24/7, and I think it would be tough even then.

I've bought full shows (all seasons) of multiple TV series for my parents for xmas yearly, and they don't even get time to see them as retired people. They keep saying "we're saving them for when FREE stuff runs out, or we tire of hallmark"...LOL. It appears that will never come, and I know they want to watch the shows; they just want all the free stuff they can get first, thinking one day it won't be there...LOL. Yes, I tell them how dumb that is, it isn't going anywhere, and with all the people trying to get your eyes there is more and more free stuff monthly. They just switch seasons, shows etc around from channel to channel as new ones pop up and play the seasons again. E.g., Felicity moved from tubi to ABC, and roku had all seasons of quite a few shows that just got hacked to 2 seasons - seems they're getting picky now, but I have no idea why; when it's all ad-supported, who cares? Still a LOT of shows on roku channel that are full though.
Do I get to watch WHATEVER I want, whenever I want? No, but show X looks exactly the same next year, or whenever I sign with a channel that has it. Binge and switch yearly, I say. We may try britbox next, but still have a lot of hallmark to get through for now, never mind all the free seasons (or complete shows) on roku channel/tubi etc. Even ABC etc have a few, like Felicity (full show, just moved from tubi recently, was full there too). You can get more from CW (the free channel has a lot of seasons) etc. Now most even post the recent eps weekly, if you like that sort of thing. Again, unless you're rich, or just enjoy pissing away money, why pay for cable TV? We thought we'd miss FOX news, but even that is on youtube LIVE daily (jujubot or gramp's house...ROFL - many others for hourly stuff if you miss anything). FOX not your cup of tea? Well, CNN, MSNBC etc are out there too daily.

Paying for cable...I digress...Save $500-1000 a year easy and buy your new card this year, cpu/board/mem next year...etc, you should get the point...ROFL. It's exactly what I've been doing for almost a decade. Heck, our whole family has cut cable and even cell down to just the minutes we use; no contracts ever again for us.
Some people seem to miss the point of DLSS. It gives RT for free (at least some of it), until you bloat the game with even more RT objects/surfaces etc, which can obviously reverse the gain, at least until you add more cores to reverse it again...LOL (rough numbers below). You just need to wait for them to turn it on in games with RT, and fps will be going up. We're just getting patches for RT, and DLSS isn't far behind for those same games (see BF5: the feb patch gave it, april improved sharpness of DLSS specifically, etc). It's a balancing act until they figure out how to effectively use the new tools.
https://www.tweaktown.com/news/65470/battlefield-update-improvements-dxr-dlss-features/index.html
Surely more after that; just do a quick google for BF5 DLSS (and I don't care about the game itself). As noted, it takes time to figure out brand new hardware features. Heck, Dice had to tell NV they had hardware that wasn't working yet. NV released a driver, and boom, massive uplift for Dice, as they were now able to take advantage of what was there but useless at launch. That should describe the next year most likely, as current games are NOT made for this tech but rather have it patched in after. Once the titles come that are MADE for it, expect less teething crap (even for the driver team, who apparently left some hardware off...LOL). As usual, if you're an early adopter, expect patches, growing pains etc.
One more just to bring the point home:
https://www.techspot.com/downloads/drivers/essentials/nvidia-geforce/
July driver, again, showing better DLSS for BF5 (+ Metro Ex - er, OPTIMAL this time...LOL, ok). Training is clearly a work in progress, hence the word training. It doesn't help that the game itself isn't set in stone, as they keep adding more reflections, rays etc, so even as NV (or AMD at some point) speeds things up, you might not see much, as the amped-up detail soaks up all the perf anyway. I guess that's where your slider or DLSS 2X comes in etc. No law says you have to turn on everything they did.
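To put a rough number on the "free" part: RT and shading cost scale roughly with pixels rendered, so rendering internally at a lower resolution and upscaling claws a chunk back. The internal resolutions below are my assumptions for illustration; DLSS's actual ratios vary by game and quality mode:

Code:
# Back-of-envelope pixel counts: work scales roughly with pixels rendered.
resolutions = {
    "4K native":      3840 * 2160,
    "1440p internal": 2560 * 1440,  # assumed upscale source for 4K output
    "1080p internal": 1920 * 1080,  # assumed more aggressive mode
}
native = resolutions["4K native"]
for name, px in resolutions.items():
    print(f"{name}: {px / 1e6:.1f} MPix, {native / px:.2f}x cheaper than 4K native")

So a 1440p-to-4K upscale is ~2.25x fewer pixels to trace and shade, which is the headroom DLSS hands back, minus whatever the upscale itself costs.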
"The majority of people wanted to be able to push higher framerates for VR and 4K gaming"
Uh, less than 2% use 4K (1.56% or so last I checked...LOL) and ~5% use 4K+1440p (both in total!). VR? Don't make me laugh, as that is less than 4K users. Maybe when it looks like google glass, not a scuba mask, and far later, when REAL games get made en masse. Wake me when 4K hits 10%, and your comment will still be far out of whack (only 2.5x more for 1440p to make 10% ROFL).

What does the word MAJORITY mean to you? 1.5%? The majority of people (65% according to steam) are playing 1080p, and it's not a small sample size either, as they have 135 million users. The majority of users don't own a card capable of 4K (hasn't been made yet, without turning crap off all day, or buying MULTIPLE cards), nor do they own the monitor yet to run 4K :) Sorry, sales don't lie, and neither does the steam survey.

You're an early adopter if you're on 4K, and your wait for the rest of us to catch up will be LONG. See how long it's taken 1440p to get to...wait for it...3.5%...ROFL. FYI Ryan Smith at anandtech said it in his 660ti article (1440p is the new enthusiast norm...ROFL) and he is still wrong today. Never mind how ridiculous 4K statements are. I have ZERO plans for 4K monitors, and we have ZERO 4K TVs in our whole family, though we may buy one for black friday, but only because they are commodity at this point and finally there is quite a bit of content, and there's not much of a change at <65in IMHO (how close do you sit?).
My next TV might be 75in+, and in stores that size seems to show something worth buying at our viewing distance. Though I popped a 1080p HQ rip (not the crap settings reviewers use!) onto that TV and surmised SOME of the quality was simply that it was a more expensive unit with newer tech overall, and I was comparing samsung to samsung (2017 65in model vs 2019 75in). I can live without 4K for a lot longer, with all our TVs to date under 65in. Opinions vary, but not on the word "majority". I thought that ALWAYS meant >50%, or at worst the greatest number among a group of numbers (surely not 1.5%, with 65% on 1080p ROFL) :)
Well, being just some underpaid peasant flipper at Burger King, I normally would resort to physical violence... But I'm trying to be a bit nicer this year and save the Mods some headaches.
Presumably he just got a pocket money raise and therefore knows everything, so I'll just sit back and let him impart his unequivocal wisdom on how I should run my miserable little life...
Reported for slinging about that 'get a better job' drivel again. No attempt was made to read the rest of the ranty walls of text after he destroyed his own credibility by opening with that.
You sound experienced. With him, I mean. I don't spend much time on the forums nowadays; is he always this much of a bellend?
It seems better to leave it, let people see what a world class walnut he is. Donkeys who spout that sort of thing tend to be painfully unaware so let him read what people think about him, maybe it'll spark something. Then again he seems like he's only running with a 5w bulb anyway.
Well, I recognised the Username... I recall a couple of posts about HEXUS reviewers' methods of testing GPUs, and him being unhappy that Hexites generally play in 1440p nowadays, while he's still in 1080. Most of his posts sound like a kid in his first paying job (PC tech, so presumably sweeping behind the Team Knowhow counter) with a lot of disposable income and no commitments.
As for the end of any bells, that's not for me to say... :)
Nah, I'll just walk away from my grossly underpaid job flipping burgers in the hydroengineering department, along with my wife, daughter, house, car, phone, television (oh, wait, no that's the wife's and she pays for that), and whatever else he deems too expensive...
That way, when his toilet floods with raw sewage at 4,000 litres a second and he can't get clean water, I'll be too busy sitting pretty on a new RTX 2080 Super GPU to give a hoot!! :lol:
And there it is. You can tell they're young because their only concept of outgoing is personal. No real bills, no contingency money, insurances, kids, dogs, budgies and badgers to look after. Life is bloody expensive!
I was planning on getting a 3700X on release day, had money put away and everything, then got stung for a ton of things all at once ha. Ah well, not the end of the world and all that. I do miss the days of just being able to drop stupid money on unneeded parts!
I don't think I'm particularly bothered, as long as the price is right ray tracing support is something I would easily forgo. But it really depends on the supported games as well.
Well, if nVidia says it does, then it doesn't.
Ray tracing seems pointless and just a waste of money to me, at this point at least. I'm much more interested in VR support. Sadly, AMD seems to be lacking in that area too... The 5700 XT cards don't have any VirtualLink ports, and they still don't support variable rate shading (for foveated rendering). I really hope AMD will catch up in this area soon.
Just ordered a 2070 Super, coming from a 1060, but only because I have a G-Sync monitor, not because of RTX... If I didn't have a G-Sync monitor I would've gone for the 5700 XT...