I've not done an in-depth look at DX12 benchmarks in AotS, but if that image you posted is close(ish) to correct, wouldn't the RX 480 be closer to GTX 960 territory?
If two RX 480s get around 62FPS, is it safe to assume that a single card would be around the 30FPS mark? If so, isn't that closer to GTX 960/970 territory than 970/980?
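For what it's worth, that halving assumes near-perfect CrossFire scaling, which we don't actually know. A minimal back-of-the-envelope sketch, where only the ~62FPS figure comes from the slide and the scaling factors are hypothetical:

```python
# Rough single-card estimate from a dual-card result.
# The 62 FPS figure is from the slide being discussed; the scaling
# factors are assumptions, since the real CrossFire scaling is unknown.

def single_card_fps(dual_fps: float, scaling: float) -> float:
    """Estimate one card's FPS given the dual-card FPS and how much the
    second card adds (1.0 = perfect doubling, 0.5 = only +50%)."""
    return dual_fps / (1.0 + scaling)

dual_fps = 62.0
for scaling in (1.0, 0.8, 0.5):
    print(f"{scaling:.0%} scaling -> ~{single_card_fps(dual_fps, scaling):.0f} FPS per card")
```

Perfect scaling gives ~31FPS (960-ish), but at 50% scaling a single card would already be doing ~41FPS, which is why the estimate swings between 960 and 970/980 territory.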
Yes, funny how I was never aware of that, but AMD would be smoking weed if they ever thought a dual RX480 card would be that viable.
It would only be viable as a VR card - for normal gaming it wouldn't make much sense.
Plenty of engines don't support XFire or SLI - you might want to read that article in detail. Probably because consoles only use a single GPU.
Lots of complaints on forums about XFire and SLI support being crap, and these are from people who ponied up the cash. Heck, even one of my workmates runs dual water-cooled R9 290X cards and he feels the same.
It would be a disaster of a card - as soon as AMD can't provide day-one XFire profiles, the card will be an R9 390X at most.
The moment it hits an engine with no support, same problem.
Even the Pro Duo was targeted more towards VR game devs, with its partial pro support.
Even though DX12 will be far better suited to multi-GPU, it will take time for support to be added, and by that time Vega or its successor will probably be available.
It would be better IMHO for AMD to concentrate on the single cards, make support as solid as possible, and honestly try to work with a few more AAA devs. Nvidia having its name splashed on so many games is not helping AMD. The TR franchise went from good PR with AMD to making them look silly - NV PR just mocked them.
I think the point could be to show how wide the gulf in price between AMD and Nvidia is. We have a £165 card offering 65-70% of the performance of a £650 card?
Unless we know the XFire scaling in the benchmark, it could be 100% scaling or 50% - the joys of multi-card setups.
What CAT said. If dual-GPU scaling in modern engines is poor (note the graph says the RX480 is only hitting 51% GPU utilisation), a single RX480 could easily be getting 40fps+. It looks like AotS is well programmed for using multiple GPUs efficiently, though - low utilisation across multiple GPUs returning overall higher framerates is pretty impressive.
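As a rough sanity check on that 40fps+ claim, here's a sketch that scales the dual-card result by the reported utilisation. Treating GPU utilisation as linearly convertible to framerate is purely an assumption for illustration; only the 62FPS and 51% figures come from the slide.

```python
# Back-of-the-envelope check of the "single RX480 could get 40fps+" claim.
# 62 FPS and 51% utilisation are from the AotS slide; everything else
# here is an assumption for illustration.

dual_fps = 62.0
utilisation = 0.51                      # per-GPU utilisation reported on the slide
busy_gpu_equivalent = 2 * utilisation   # ~1.02 fully-busy GPUs' worth of work

optimistic = dual_fps / busy_gpu_equivalent   # if utilisation converted 1:1 to FPS
conservative = optimistic * 0.7               # knock off 30% for CrossFire/CPU overhead

print(f"Optimistic single-card estimate: ~{optimistic:.0f} FPS")
print(f"Heavily discounted estimate:     ~{conservative:.0f} FPS")
```

Even with a hefty discount for overheads you land in the low 40s, so the 40fps+ reading of the slide isn't a stretch.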
At least we only have to put up with a few more weeks of speculation and rumors. Do we know if the RX480 is going to be the top end of Polaris 10?
Well, the CPU used was supposedly a Core i7 4770 non-K in an H87 motherboard.
Should be powerful enough on the CPU front then, and x8/x8 lanes should also be enough considering the move engines. You'd have thought that if they were being held back by something else they'd want to get around it, because with the 1080 already that heavily loaded it wouldn't gain much, while the AMD setup should stand to gain even more fps. Very mysterious - more so that they would highlight low GPU utilisation as a point on the slide. If anyone at Hexus could ask what the point of that utilisation figure is I'd be very grateful; perhaps we're missing something fundamental.
I'm sure if you ask Asus nicely they may oblige...
This article on Fudzilla does some speculation on the GPU utilisation thing.
So at this point the differences can be explained by that. AMD is using AotS since it favours them anyway, and async compute is deactivated on Nvidia cards. As many have noted, Ashes of the Singularity is a DirectX 12 strategy game that uses procedural generation for rendering textures, along with dynamic game character unit composition depending on the map situation. These rendering features will prevent any two running instances of the game from ever being identical.
I think it is more interesting that the Doom dev said Vulkan and DX12 were somewhat inspired by Mantle, and said you didn't need a $700 card to run the game well - and this was after Nvidia showed off the GTX1080 running Doom under Vulkan at the GTX1080 launch. Hopefully they won't get Nvidia annoyed with them, LOL.
Anyway, I watched the video again - the utilisation figures are probably there because he is trying to say the GTX1080 is almost at 100% usage while the XFire solution is barely being taxed and actually has more performance in the tank.
Not that I care much for XFire or SLI solutions anyway.