Read more. Provides common system specs and background to testing methodology.
Shots fired. But I guess Nvidia would use any trick to get a few more fps, even at the cost of image quality. People tend to only look at the numbers after all. Sad but plausible.
Just a correction: the AotS demo and results were from two RX 480s running in DirectX 12 explicit multi-adapter mode, not CrossFireX.
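For anyone blurring the two: under explicit multi-adapter the application enumerates the GPUs and creates a device on each one itself, rather than the driver mirroring work the way CrossFireX/SLI does. A minimal sketch of the device-creation side (my own illustration, not anything from AotS):

Code:
// Explicit multi-adapter, device-creation side: the app owns a separate
// D3D12 device per physical GPU and schedules work across them itself.
#include <d3d12.h>
#include <dxgi1_4.h>
#include <wrl/client.h>
#include <vector>

using Microsoft::WRL::ComPtr;

std::vector<ComPtr<ID3D12Device>> CreateDevicesOnAllAdapters()
{
    ComPtr<IDXGIFactory4> factory;
    CreateDXGIFactory1(IID_PPV_ARGS(&factory));

    std::vector<ComPtr<ID3D12Device>> devices;
    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i)
    {
        DXGI_ADAPTER_DESC1 desc;
        adapter->GetDesc1(&desc);
        if (desc.Flags & DXGI_ADAPTER_FLAG_SOFTWARE)
            continue; // skip WARP / software adapters

        ComPtr<ID3D12Device> device;
        if (SUCCEEDED(D3D12CreateDevice(adapter.Get(), D3D_FEATURE_LEVEL_11_0,
                                        IID_PPV_ARGS(&device))))
            devices.push_back(device);
    }
    return devices;
}

The upshot is that scaling quality is entirely in the game's hands, which is why AotS can scale two 480s well where a generic CrossFireX profile might not.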
Funny, less snow actually looks better. The game is a mediocre RTS as it is, so... meh.
Seems unlikely that this is a deliberate trick by Nvidia to improve AotS performance. Far more likely is that the driver support for the particular shader calls is incomplete or incorrect on the 1080, given it's a brand-new card. These things happen all the time with drivers (I had to disable one particular effect in Neverwinter Nights back in the day because the ATI implementation was buggy and crashed the game). Whether Nvidia ever bother to "fix" the "bug" is another matter entirely, of course.
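If a developer wanted to work around that sort of thing rather than wait on a driver fix, one common pattern is to gate the suspect effect on the detected driver build. A purely hypothetical sketch (the vendor ID is real; the function name and the "first good driver" value are invented):

Code:
// Disable one effect on driver builds believed to render it incorrectly,
// rather than penalising the whole card.
#include <dxgi.h>

bool UseFancySnowShader(IDXGIAdapter* adapter)
{
    DXGI_ADAPTER_DESC desc{};
    adapter->GetDesc(&desc);

    // CheckInterfaceSupport reports the user-mode driver version.
    LARGE_INTEGER umd{};
    adapter->CheckInterfaceSupport(__uuidof(IDXGIDevice), &umd);

    const bool isNvidia = (desc.VendorId == 0x10DE);          // Nvidia's PCI vendor ID
    const LONGLONG firstGoodDriver = 0x0015000F00001770LL;    // hypothetical build number
    return !(isNvidia && umd.QuadPart < firstGoodDriver);
}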
Since when was it a shocker to find 2 mid-range cards that outperform one large card?...at least when scaling is working correctly.
Main PC: Asus Rampage IV Extreme / 3960X@4.5GHz / Antec H1200 Pro / 32GB DDR3-1866 Quad Channel / Sapphire Fury X / Areca 1680 / 850W EVGA SuperNOVA Gold 2 / Corsair 600T / 2x Dell 3007 / 4 x 250GB SSD + 2 x 80GB SSD / 4 x 1TB HDD (RAID 10) / Windows 10 Pro, Yosemite & Ubuntu
HTPC: AsRock Z77 Pro 4 / 3770K@4.2GHz / 24GB / GTX 1080 / SST-LC20 / Antec TP-550 / Hisense 65k5510 4K TV / HTC Vive / 2 x 240GB SSD + 12TB HDD Space / Race Seat / Logitech G29 / Win 10 Pro
HTPC2: Asus AM1I-A / 5150 / 4GB / Corsair Force 3 240GB / Silverstone SST-ML05B + ST30SF / Samsung UE60H6200 TV / Windows 10 Pro
Spare/Loaner: Gigabyte EX58-UD5 / i950 / 12GB / HD7870 / Corsair 300R / Silverpower 700W modular
NAS 1: HP N40L / 12GB ECC RAM / 2 x 3TB Arrays || NAS 2: Dell PowerEdge T110 II / 24GB ECC RAM / 2 x 3TB Hybrid arrays || Network: Buffalo WZR-1166DHP w/DD-WRT + HP ProCurve 1800-24G
Laptop: Dell Precision 5510 || Printer: HP CP1515n || Phone: Huawei P30 || Other: Samsung Galaxy Tab 4 Pro 10.1 CM14 / Playstation 4 + G29 + 2TB Hybrid drive
Was that the shiny water bug? We still have a sticky on our PW forums for that.
But yes, he clarified he wasn't saying Nvidia were fudging the results, only that there might be a rendering bug - it might be Nvidia's, it might be the game devs', and (theoretically) only a small performance gain would come of it. It was more in response to the comments saying AMD were the ones fiddling the settings.
Or the developers didn't implement - or implemented differently - something for the Nvidia path (or even the Pascal sub-path) that they did for the AMD path. Remember that with DX12 most of the architecture-specific tweaking falls on the developer rather than the GPU vendor.
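To make that concrete: under DX12 it's the engine, at startup, that decides which code path each vendor gets, so a divergence like the snow rendering can originate entirely in the application. Rough sketch, with invented shader variant names:

Code:
// Vendor detection via PCI IDs, then per-vendor shader/path selection.
// The variant file names are made up for illustration.
#include <dxgi.h>
#include <string>

enum class GpuVendor { Nvidia, Amd, Intel, Other };

GpuVendor VendorFromAdapter(const DXGI_ADAPTER_DESC& desc)
{
    switch (desc.VendorId)
    {
    case 0x10DE: return GpuVendor::Nvidia;
    case 0x1002: return GpuVendor::Amd;
    case 0x8086: return GpuVendor::Intel;
    default:     return GpuVendor::Other;
    }
}

std::string PickSnowShaderVariant(GpuVendor v)
{
    switch (v)
    {
    case GpuVendor::Amd:    return "snow_ps_async.cso";   // hypothetical AMD path
    case GpuVendor::Nvidia: return "snow_ps_sync.cso";    // hypothetical Nvidia path
    default:                return "snow_ps_generic.cso"; // fallback
    }
}

If those two variants differ in actual shading work, and not just scheduling, you get exactly the kind of image difference this thread is arguing about, with no driver "cheating" involved.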
Again, we have seen these percentage differences in the past. Nothing new; people pay large premiums for the top cards... hell, look at the Titans and the 295X2... even the Fury X.
And until we see UK pricing for the RX 480 and non-FE 1080s, you're speculating on the difference anyway... with the recent £/$ changes and the rip-off-Britain "extra fee", I expect 480 prices to be noticeably higher than many are expecting... in fact I wouldn't be surprised if they are £200+.
What's the betting the developers started with an Nvidia card, and so have accidentally ended up relying on an Nvidia bug to render the way they want it? I mean, the Nvidia shots do look better, don't they, so I presume that is how they are supposed to look.
Edit: what do they look like on a 980 or similar previous gen card?
I think we can just take a step back and think about what has occurred. Nvidia has released a high-end flagship card to show "the best it has" (we obviously know it's not a halo product - look out for the 1080 Ti and the Titan 10) and AMD has released a mid-range volume card. Both offer marginal increases over the previous generation, healthy increases in performance per watt and, hopefully, great performance per pound.
This generation has not set me on fire, but it's not rubbish either. Let's wait and see the GTX 1060/1060 Ti and how it stacks up against the RX 480, and how the RX 490X (and some kind of Fury 4?) stacks up against the 1080/Ti.
The big thing this gen is AMD pushing that price point down nice and hard from the word go. Must be really good yields?
Steam - ReapedYou - Feel free to add me!!