DirectX 12 Multiadapter functionality gets its first public outing in Ashes of the Singularity.
This is the best thing about DX12 I think. Being able to get that little bit of extra juice from the integrated graphics is great. Intel graphics aren't great on their own but could work as a companion GPU.
Ohhhh, soon I can mix Intel HD graphics with a slow GT 730! DDR3-1600 paired with DDR3-1866; I wonder how far the FPS will shoot up.
Would have been interesting to see a 2012 GPU plus a 2015 GPU in the test. I assume there would be some benefit. Many people upgrade on a cycle these days. The older graphics card is still fine in 2D apps, but needs upgrading for more modern games. Being able to still get some benefit from the older card would be a bonus.
I've a 4670K and just replaced a Radeon 6950 with an R9 390. Would be quite something to be able to use all 3 GPUs.
Edit: Looks like 6950 isn't supported, so I presume there will be a list of cards that are when DX12 shakes out.
Last edited by iranu; 27-10-2015 at 01:24 PM.
"Reality is what it is, not what you want it to be." Frank Zappa. ----------- "The invisible and the non-existent look very much alike." Huang Po.----------- "A drowsy line of wasted time bathes my open mind", - Ride.
The 6950 doesn't support DX12, thus obviously not DX12 Multiadapter either. Other than that, I agree that a look into combinations like that would be great, and particularly useful to people looking to upgrade at some point.
I hope to see more of this in the future. Could be game changing for people on a budget or switching from one GPU maker to another.
From an aesthetic (or pedantic) point of view, I wonder if many people would feel comfortable mixing GeForce and AMD cards in a build. That's if AMD cards are still around when DX12 really gets going...
Depending on scaling vs CFX and SLI and the all important "does it introduce micro-stutter or anomalies?"....this could be huge. Especially if it allowed access to all the vendor-specific features still (Yes nVidia, I am looking at your silly Physx lock-out).
The numbers aren't incredible, but still better than I expected. How much work needs to be done by the game dev to support it, though, I wonder?
Considering Radeon 7000 series cards and onwards are DX12 compatible, there may well be one or two AMD cards still around when DX12 really gets going.
Very interesting piece of technology and a very informative article from Anandtech.
Very interesting for those amongst us with APUs! There could potentially be larger increases if the APU is paired with a very mid-tier GPU.
Looking forward to more on this!
If you guys ever get your hands on this, I would love to know whether you can run PhysX, V-Sync, FreeSync, and TressFX, or whether these still get blocked by the vendor drivers.
Also, what advantages come from driving a FreeSync/V-Sync monitor through an Nvidia/AMD card, to show which GPU works best as the primary with either monitor?
Then there's the fact that we all still want to play older DirectX 9-11 and OpenGL games: will these run, or break down, on a system with multi-vendor GPUs?
And, just maybe, what about two 7990s and two GTX 690s in one system, as an insanity test?
As someone with a recent laptop APU, I say be careful about getting your hopes up. My A8-5550M's GPU doesn't support DX12 (it's not GCN) and the laptop is only just over a year old. What makes it even more frustrating is that the laptop's dGPU does support it, so it could have made a big difference to performance.
Now I'm waiting to see when NVIDIA will put code in their drivers to disable any cooperation with AMD cards.
According to the rest of the Anandtech article, all of it, including their own implementation of alternate frame rendering, which is what this uses (and why they are only using fairly similar performance cards to test: alternate frame rendering wouldn't make sense for something like an IGP + powerful discrete card). DX12 simply includes the hooks to send different types of processing to different GPUs: how you actually process them (and recombine the results) is up to the game devs and has to be coded from scratch (although presumably it would be possible to include some of that work in the game engine).
Of course, once a few games have included this kind of stuff there'll be plenty of examples out there of how to do it, so over the next couple of years it should get easier and easier for devs to use whatever GPUs they happen to find in your system. A couple of generic DX12 physics libraries that can offload calcs to any secondary GPU could spell the end of PhysX...
Could somebody clever clarify for me: is what they have found that a GTX 980 paired with an R9 Fury (using DX12) performs better than two GTX 980s in SLI or two R9 Furys in CrossFire?