DirectX Raytracing, revealed at GDC 2018, could make 3D games even more spectacular.
Expanding DirectX 12: Microsoft Announces DirectX Raytracing:
https://www.anandtech.com/show/12547...ctx-raytracing
A good article delving deeper into things.
For today’s reveal, NVIDIA is simultaneously announcing that they will support hardware acceleration of DXR through their new RTX Technology. RTX in turn combines previously-unannounced Volta architecture ray tracing features with optimized software routines to provide a complete DXR backend, while pre-Volta cards will use the DXR shader-based fallback option. Meanwhile AMD has also announced that they’re collaborating with Microsoft and that they’ll be releasing a driver in the near future that supports DXR acceleration. The tone of AMD’s announcement makes me think that they will have very limited hardware acceleration relative to NVIDIA, but we’ll have to wait and see just what AMD unveils once their drivers are available.
Though ultimately, the idea of hardware acceleration may be a (relatively) short-lived one. Since the introduction of DirectX 12, Microsoft’s long-term vision – and indeed the GPU industry’s overall vision – has been for GPUs to become increasingly general-purpose, with successive generations of GPUs moving farther and farther in this direction. As a result there is talk of GPUs doing away with fixed-function units entirely, and while this kind of thinking has admittedly burnt vendors before (Intel Larrabee), it’s not unfounded. Greater programmability will make it even easier to mix rasterization and ray tracing, and farther in the future still it could lay the groundwork for pure ray tracing in games.
Unsurprisingly then, the actual DXR commands for DX12 are very much designed for a highly programmable GPU. While I won't get into programming minutiae better served by Microsoft's dev blog, Microsoft's eye is solidly on the future. DXR will not introduce any new execution engines in the DX12 model – the primary two engines remain the graphics (3D) and compute engines – and indeed Microsoft is treating DXR as a compute task, meaning it can be run on top of either engine. Meanwhile DXR will introduce multiple new shader types to handle ray processing, including ray-generation, closest-hit, any-hit, and miss shaders. Finally, the 3D world itself will be described using what Microsoft is terming the acceleration structure, a full 3D environment that has been optimized for GPU traversal.

Nvidia RTX:

Though even with the roughly one-year head start that Microsoft's closest developers have received, my impression from all of this is that DXR is still a very long-term project – perhaps even more so than DirectX 12. While DX12 was a new API for existing hardware functions, DXR is closer to a traditional DirectX release in that it's a new API (or rather, new DX12 commands) that goes best with new hardware. And as there's essentially zero consumer hardware on the market right now that offers hardware DXR acceleration, DXR really is starting from the beginning.
https://www.anandtech.com/show/12546...gpus-and-later
Did MS and NV screw AMD on this one? It seems Nvidia was prepping this with Microsoft and designing their next-gen Volta GPUs with it in mind, and it looks like AMD was left out.
The more you live, less you die. More you play, more you die. Isn't it great.
People have been claiming that real-time ray tracing is just one generation of GPUs away for decades (as in, at least 20 years). I'll believe it when products hit the market.
Article updated with the following:
"AMD is announcing Radeon ProRender support for real-time GPU acceleration of ray tracing techniques mixed with traditional rasterization based rendering. This new process fuses the speed of rasterization with the physically-based realism that users of Radeon ProRender expect for their workflows."
No, just AMD spending too much time/R&D on console crap that makes them nothing (Atari, MSFT, Sony etc.) and APUs (new chip only $169... ROFLMAO, while Intel's 8809G is probably coming in above $350 with 4GB of HBM! AMD should have made that chip), while NV is spending the bulk of their income (from high-end stuff – i.e., the 1080/1070 launched for a year before they added anything else) on enhancing PC stuff, which is smart because those are their CORE products. You should only spend OUTSIDE your core tech when you're making LOTS of money. AMD does it anyway, and they get killed.
As Dirk Meyer said in 2011, you need a KING FIRST, then the crap to add more to your bottom line with existing tech. They fired him for it... LOL. Now five years later they try to make a king CPU... ROFL. I'll say it again: management at AMD is ridiculously dumb. They need to cater to RICH people first, like NVDA/INTC do. Check out both companies' stock prices, net income, etc. AMD should have listened to the DARK MAYOR (for those who don't get that, it's Dirk). I miss Dirk & Jerry. They both believed in KING chips first, then the crap second. The APUs/custom SoCs make peanut margins and should only be made after you've exhausted all the 50%+ margin stuff (1080/1070, pro cards, etc. – or in AMD's case, Zen should have come five years ago).
I disagree; AMD getting into consoles was never about making big profits, it was about gaining market share. It's the same reason they're selling APUs for $169 and more cores for a lower price than Intel's server CPUs. At this point in time it would be financial suicide to try fighting in high-risk markets, which is why (IMO) they've not made a chip similar to the 8809G – it would cost a lot of money and has a low chance of return.
Taking risks is something you can do when it doesn't matter if it goes wrong; gambling is not something you can do when you're the underdog. And like it or not, high-end stuff and niche products are not, and will never be, a CORE product. IMO AMD have targeted their limited resources in a way that gives them the highest chance of returns.
Well, you are both right in some way.
The core product is not the high end, as the quantities sold there are much smaller than the quantities at the middle and low end.
BUT thanks to having the best high-end cards, you sell much more of your middle and low-end products.
The logic goes like this:
Buyer - I want the fastest graphics card.
Seller - That's the Nvidia Titan XYZ.
Buyer - Oh, it's $1000/£1000/whatever. Too much. What else do they have that's cheaper?
Seller - Hmm, a GTX 1060 for $300? (as it used to be).
Buyer - Cool, I'll take it. So cheap for the fastest brand.
The uninformed buyer goes like that. And there are way more uninformed buyers than informed ones.
I agree to an extent, but I didn't think we were only talking about GPUs; nobodyspecial mentioned consoles and APUs. If we were talking about only GPUs then I'd agree, but then again IMO AMD are not even on most consumers' radar when it comes to graphics cards. Like I said, they need to, and have, targeted their limited resources where it counts the most, and currently that doesn't include the uninformed buyer of graphics cards – heck, it probably doesn't even include the informed buyer of graphics cards.
IMO they're targeting informed buyers in the server, OEM, and consumer markets who know they're getting more for less as not only does it cost less to market your product to those people but it also increases your chance of gaining market share and exposing your product to the uninformed buyer.
"Real Time Raytracing" is a tricky one. I'd probably be able to knock up a raytracer on hardware several generations old, you'd get it real time if you reduce the number of rays, the size of the scene, complexity of materials and the number of reflections/refractions/shadows you compute.
the problem will always be that peoples expectation of real-time raytracing will always be a bit higher than what is possible in current hardware.
I'm fairly certain (though I'd have to dig it out) that I've seen a raytracer/raycaster demo for the Amiga.
It's not bad, but the demo video is very obviously using ray tracing: it has that characteristic ray-tracing 'graininess' on everything, especially reflections.
They should really improve the quality of raytracing before they attempt to bring it into the realtime world.
Just in case you missed the bulletin .. AMD did make it. Or at least, they made the GPU section of it, and Intel will have paid them for it. Either way, a chunk of whatever the 8809G costs wings its way to AMD. Every 8809G sold is a win for AMD. Don't knock it.
It doesn't take more than about five minutes with Google to see that OEMs are still reluctant to make good AMD-based devices, no matter how good the AMD tech is (spoiler: the AMD tech is good). The Kaby-G devices are a way for AMD to get penetration for their IP into more devices. Market economics and trends are something AMD can't really control - Intel won "hearts and minds" years before AMD were making good x86 CPUs, and Intel were way ahead of the curve in terms of advertising the chips inside a computer, which basically gives them a license to print money now.
As to the rest of your post .. we've seen this before. You say AMD need to target the high end, then when they do you complain that they're too expensive and need to compete on price. Get your story straight. AMD are doing what they can with lower margins and a much smaller public image than the companies they're competing against. That they're competitive at all is frankly flipping amazing.
Now that we finally have real time raytracing, all game developers need to do is to figure out that colors other than brown exist, and start writing interesting stories and characters instead of checking off items on a list.
It looks like AMD is bringing support for ray tracing to Unity:
https://blogs.unity3d.com/2018/03/29...?sf185783339=1
Revolutionizing render times and workflows for realistic light effects has been one of the dominant themes at GDC 2018. The announcement of AMD's Radeon Rays integration in Unity's GPU Progressive Lightmapper is particularly exciting to game developers looking to boost the visual fidelity of their games assisted by an interactive baking workflow. With the new GPU-based progressive lightmapper, Unity users can achieve up to 10x faster bakes on a Radeon Vega in their system.

Edit: The Real-time Ray Tracing with GPU Progressive Lightmapper is expected to be released later this year.
It's apparently not for use in gaming, but more for rendering.