And it has shown off a re-engineered Quake II RTX title with raytracing on Vulkan.
Holy..... they must have been sitting on this for ages in order to respond so quickly to the Vega56 demo.
If I'd bought an RTX card I'd be utterly livid.
Annnnd Nvidia, if you're reading this..... this is the kind of business practice that has me buying AMD.
Looking around the web it seems people are very quick to jump on some good old Nvidia bashing, but taking a step back from all of that, to me it looks as if AMD's developments and this are actually good news for everyone. Those without RTX cards will now be able to play around with some basic ray tracing soon, and those with RTX cards are going to get better performance and/or visuals and higher adoption within games going forward. I can also see ray tracing coming to consoles now, and I dare say current RTX owners may be able to keep up with the next round of consoles for their first couple of years, which would kinda let people justify the price of their early investment.
Fingers crossed, it could simply be a positive situation for all (for once).
That absolutely depends on whether games bother supporting the hardware features of Turing and whether the RT cores are useful for the software-based stuff. If I were a developer, with RT now available to almost everyone and extra effort required to hardware-accelerate it on the RTX cards, which represent <1% of your audience, I just wouldn't bother, especially as the high-end cards will have enough chooch to cope with the software-based stuff quite happily.
I think the Nvidia bashing is absolutely warranted. They insisted specialised, expensive hardware was required, and not only was this untrue, they'd obviously been working on their own software solution whilst telling us we needed to spend a fortune on their proprietary hardware. I wonder how carefully they crafted their speeches at the launch to avoid being sued.
But you do need the hardware for playable performance with multiple complex ray-traced effects: the 1080 Ti pulls only 18 fps in Metro Exodus at 1440p and below 10 fps in the Port Royal benchmark, and those figures come from Nvidia themselves, so they're probably best-case.
This news does not mean Pascal owners will be playing anything remotely near what RTX cards are capable of, or that RTX was a waste of time. On the contrary, it is going to show how bad the performance is on older cards, but it may allow AA and indie games to make use of it. Don't forget the AMD implementation is simple and light on hardware too: the Vega56 pulling 30fps at 4K could potentially mean 60+ fps for RTX cards, and even without RT-core support, Turing will still better that on shader cores alone, so it is still only good news all round.
I'm pretty sure that raytracing will run like a dog on non-RTX cards. Otherwise the great unwashed masses might come to think that "dedicated raytracing hardware" isn't necessary.
FWIW I don't believe this is Nvidia's answer to the Neon Noir demo. If it were, Jensen would have downplayed it and stated that RT on Pascal runs much more fluidly than the demo.
I still play Q2 regularly... wonder if it will run well enough
Who cares? Bash them anyway, because reasons.
Them and Intel. Whoever is the market leader at the time. Always root for the underdog... until they start winning, and then slam them.
Been working on.... but did not yet have available, so the only solution you could get was their expensive hardware?
How is that even a problem, much less bash-worthy lies?
Originally Posted by Mark Tyson
Surely the timing is not a coincidence?
As for the Metro argument, I see Metro Exodus as being today's Crysis. It is absolutely not representative of the gaming scene as a whole, it's useful only as an extreme example to support arguments such as the one you are making. It's an anomaly. An outlier.
And not on Steam so I'm not buying it *ducks*
/troll
Let "off back under my bridge" = 1
But seriously, yes, we know you need the hardware for what's out there now, but they have just shown it can be done not only without specialist hardware but also at a decent resolution and frame rate. Whether this will make it into games within the next year, no one can tell. I also find it hard to believe that Nvidia didn't know this could be done apparently just as well, if not better, in software, but decided to persevere with hardware given the investment already made. They have form for this - see G-Sync, where for years they insisted you needed specialist hardware in a desktop monitor but did it in software in laptops. Then, when FreeSync evolved to the point where it was actually properly competitive (bear in mind the earlier lack of proper validation and the variability in vital specs like LFC), they came out and said "oh yeah, you can do it in software now because we're so amazing".
The questions we have now therefore are as follows:
1) How much was this demo optimised to allow 4K@30FPS on a Vega56? That would be a tough call for any game without ray tracing
2) How will performance change with moving people, cars, etc? It was a very static demo.
3) Is this software orientated tech going to make it into a DirectX spec? Now THAT would be funny. Especially if a load of cards were found to support it retrospectively.
4) Did they use an AMD card for a reason? If so, does this mean the GTX series will be pretty poor at ray tracing? Is this where the compute origins of Vega will finally give it an advantage? As you say non RTX cards may be poor but that 4K@30FPS demo suggests they don't have to be.
5) Will NVidia use its market share and power to crush this like the oil companies who have suppressed all the perpetual motion machines and the zero point energy from within the quantum vacuum research? (Sorry, the troll escaped).
Well, you do know this is absolutely standard business practice, I assume? No company wants to obsolete its own product, especially if it is currently leading. Always remember: their objective, long or short term, is maximising profit, and unless there's a good strategic reason to release early, why do it?
And, of course, those wanting leading-edge products can assume it won't be leading edge for long.
This is why my buying decisions are always based on "Am I prepared to pay this price for these features, to get them now?"
If yes, buy, and accept that it'll be cheaper and very possibly better in a few months, or a year. If not, sit on my wallet until the answer is yes.
I'm not, and I'm not going to fanboy at you either, because I'm also not. I've upgraded from an R9 290 to an RTX 2080 because it was a good fit for gaming at my newer resolution of 2560 x 1440 with the settings cranked up to max. If AMD had been offering something equally good, with the sheer bliss of a cool and quiet GPU drawing less power, I would have opted for them; they weren't.
I'm also happy that Ray Tracing is being opened up to the masses through software, whichever way that is spun, it's a good thing for everyone if it's mass adopted - dedicated hardware support or not.
Progress!
Somewhat disturbed by the "iloveyou" smiley (which actually reminds me of a virus back in the day), I should amend my rant. I meant to say "If I'd bought an RTX card for the ray tracing", and I really meant the stupidly expensive 2080 Ti model, but I was in a rush, ranting on the internet between patients, and frankly made a hash of it.
If I had your requirements, I'd most likely have gone for the 2080 too. The difference was that Nvidia's business practices really, really annoyed me, to the point of not wanting to give them my money (although it has been pointed out to me, quite correctly, that if AMD were in the position to behave like that, they probably would), and heat and power consumption just were non-issues for me. There are whole server racks smaller than my desktop; it has more fans than Intel and a rather oversized radiator because... I could. I also wanted the 4096 cores and HBM. I was actually attracted to the elegance of the AMD card's design, but I will say the bugger is hot, hungry and loud, and the noise is actually an issue sometimes, when you have to turn the sound up during gaming. The 2080 at the right price is a sound buy, and if you don't want a small tornado cooling your GPU it's really your only choice for a new card, given the age of the 10xx design. I totally get your choice.
What genuinely disappoints me is that Nvidia put all this R&D into creating something that is now probably dead. Not just for the consumers but they've failed to move things on in terms of outright chooch factor and AMD are really only just doing a refresh of Vega and aren't moving things on in the GPU.... "space" (I hate that term). So it kind of feels like we've stalled in progress and on reflection I think that's why I'm annoyed. That R&D time and money could have gone to something far more useful and pushing things on in a way that mattered. As it is, they struggled to convince me and felt like they were turning their CEO into a pseudo-rockstar and treating me like an idiot with demos and marketing jizz speak. Now it looks like all of that is for nothing and they've just wasted a load of money as well as R&D time.
If they'd come out with a conventional card which moved the game along properly in terms of FPS and so on, I'd be less annoyed. They probably knew about this, but made a decision to double down on hardware when they could have spent the time making a real difference.
Now, if you'll excuse me, I have potatoes to slice. I genuinely hope to return to many people telling me how stupid I am because this is all rather downbeat.
This reminds me a little of Freesync vs g-sync.
"Oh, it turns out all of our cards can do Freesync, we just turned it off because we wanted to sell expensive screens. But you should thank us for finally enabling Freesync anyway."
They sold hardware based on needing that hardware for raytracing. Turns out you don't.
That hardware does make RT better, though. It isn't the hardware's fault - but marketing it the way they did was pretty deceptive.
The Vega56 demo makes it look to me like RT will work well on any card that is strong in compute. Which makes sense (and is probably stating the obvious to people who know more than me). So the RTX cards will likely be strong with Crytek's DX12 implementation, too.
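For anyone curious why "strong in compute" is all it takes: at its core, casting a ray is just per-ray arithmetic that any shader core can execute, no dedicated silicon required. Here's a minimal, illustrative sketch of the ray/sphere intersection maths in Python (nothing to do with Crytek's or Nvidia's actual code - a compute shader would run the same sums, just millions of times in parallel):

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Distance along the ray to the nearest sphere hit, or None on a miss.

    Solves |origin + t*direction - center|^2 = radius^2 for t - the classic
    quadratic that sits at the heart of any ray tracer, hardware or software.
    """
    # Vector from sphere centre to ray origin
    oc = tuple(o - c for o, c in zip(origin, center))
    # Quadratic coefficients a*t^2 + b*t + c = 0
    a = sum(d * d for d in direction)
    b = 2.0 * sum(o * d for o, d in zip(oc, direction))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4.0 * a * c
    if disc < 0.0:
        return None  # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / (2.0 * a)  # nearer of the two roots
    return t if t > 0.0 else None  # ignore hits behind the ray origin

# A ray from the origin looking down -z hits a unit sphere centred at z=-5
print(ray_sphere_hit((0, 0, 0), (0, 0, -1), (0, 0, -5), 1.0))  # 4.0
```

Plain multiplies, adds and a square root - exactly the stuff compute-heavy cards like Vega chew through. The RT cores' real job is accelerating the other expensive part, traversing the acceleration structure to find which triangles to test.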
The really interesting part for me is how Navi will handle it, because if the next-gen consoles get roughly Vega56-or-better performance... well, this 4K@30fps with raytracing seems almost targeted at them.
And that will be great news for all of us - ray tracing will be everywhere.
/Speculation..
Think your assumptions are way off the mark there. Nvidia want to sell the new cards so will make the older ones look as bad as they reasonably can. Not saying it will be a smooth 60fps at 4K but for the Exodus benchmark at least, I think 1080p30 shouldn't be a problem.