Yes, AMD's own article...
https://community.amd.com/thread/180474
As for Fermi, it is getting full DX12 support, just not every feature in hardware (obviously), much the same as most cards over a certain age.
Main PC: Asus Rampage IV Extreme / 3960X@4.5GHz / Antec H1200 Pro / 32GB DDR3-1866 Quad Channel / Sapphire Fury X / Areca 1680 / 850W EVGA SuperNOVA Gold 2 / Corsair 600T / 2x Dell 3007 / 4 x 250GB SSD + 2 x 80GB SSD / 4 x 1TB HDD (RAID 10) / Windows 10 Pro, Yosemite & Ubuntu
HTPC: AsRock Z77 Pro 4 / 3770K@4.2GHz / 24GB / GTX 1080 / SST-LC20 / Antec TP-550 / Hisense 65k5510 4K TV / HTC Vive / 2 x 240GB SSD + 12TB HDD Space / Race Seat / Logitech G29 / Win 10 Pro
HTPC2: Asus AM1I-A / 5150 / 4GB / Corsair Force 3 240GB / Silverstone SST-ML05B + ST30SF / Samsung UE60H6200 TV / Windows 10 Pro
Spare/Loaner: Gigabyte EX58-UD5 / i950 / 12GB / HD7870 / Corsair 300R / Silverpower 700W modular
NAS 1: HP N40L / 12GB ECC RAM / 2 x 3TB Arrays || NAS 2: Dell PowerEdge T110 II / 24GB ECC RAM / 2 x 3TB Hybrid arrays || Network:Buffalo WZR-1166DHP w/DD-WRT + HP ProCurve 1800-24G
Laptop: Dell Precision 5510 Printer: HP CP1515n || Phone: Huawei P30 || Other: Samsung Galaxy Tab 4 Pro 10.1 CM14 / Playstation 4 + G29 + 2TB Hybrid drive
ExtremeTech ran an article regarding DX12 feature levels. Fermi came in at the bottom, with a question mark over whether it could actually do DX12 at all, since it only has Resource Binding Tier 1. The 2010 update exposed pretty much everything the core can do, in much the same way the original R100 Radeon had Pixel Shader 1.0 when DX8 needed PS 1.1: the pixel shader was fully exposed, just not quite at the needed level.
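The feature-level point above can be sketched as a toy model: a card only advertises a DirectX feature level if it meets every minimum capability for that level, so one lagging tier (like Fermi's Resource Binding Tier 1) caps the whole card. This is my own simplified Python illustration, not the real D3D12 API; the tier thresholds loosely follow the publicly documented feature-level tables but omit several other requirements (conservative rasterization, ROVs, etc.), so treat them as illustrative only.

```python
# Toy model of D3D12 feature-level qualification: a card must meet every
# minimum capability for a level before it can advertise that level.
# Tier values are illustrative simplifications of the public tables.
FEATURE_LEVEL_MINIMUMS = {
    "11_0": {"resource_binding_tier": 1, "tiled_resources_tier": 0},
    "11_1": {"resource_binding_tier": 2, "tiled_resources_tier": 0},
    "12_0": {"resource_binding_tier": 2, "tiled_resources_tier": 2},
    "12_1": {"resource_binding_tier": 2, "tiled_resources_tier": 2},
}

def max_feature_level(caps):
    """Return the highest feature level whose minimums the card meets."""
    best = None
    for level, minimums in FEATURE_LEVEL_MINIMUMS.items():
        if all(caps.get(cap, 0) >= need for cap, need in minimums.items()):
            best = level
    return best

# A Fermi-like card stuck at Resource Binding Tier 1 caps out at 11_0,
# even though the driver can still expose the DX12 API itself.
fermi_like = {"resource_binding_tier": 1, "tiled_resources_tier": 0}
print(max_feature_level(fermi_like))  # 11_0
```

This is the distinction the article was drawing: "runs DX12" (the API) versus "supports feature level 12_x" (the hardware capability set).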
http://international.download.nvidia...ational.hf.exe
And 3 (or 4) in 27 days tells me they are spending a lot of time knocking problems on the head and releasing fixes faster. Let's not forget there was a massive bug in the Windows 10 TP, only fixed in RTM, that caused BSODs with SLI, which meant nVidia were precluded from doing any SLI testing at all until RTM code became available.
Which it was. And Nvidia still tanked. How hard is that for you to understand?
Proof of AMD's "sponsorship"? In an Alpha release of a game that's been sponsored by AMD since its inception, and that possibly contains a bug.
Heck, even Dan Baker, co-founder of Oxide Games, said "Some optimizations that the drivers are doing in DX11 just aren’t working in DX12 yet."
Proof of 30+ months vs 6-12+ months? Except they didn't, did they; AMD had 30+ months, Nvidia had 6-12+ months.
No, the "fake" image (are you serious, or just incapable of clicking on links?) was posted as part of the proof that Nvidia was lying about MSAA. The fake image you posted has nothing to do with MSAA. In case your frenzied state has caused you to lose your mind, let me remind you: I questioned your claim that DX12 is based on Mantle, you posted an image from some Photobucket account in an attempt to back up those claims, and that is the image I'm saying is fake and that you failed to provide a source for.
http://arstechnica.co.uk/gaming/2015...nt-for-nvidia/
arstechnica: To help things along, the benchmark was run at three different resolutions: 1080p, 1440p, and 2160p (4K). All were run at the same "high" preset with MSAA disabled. That gives us a total of 24 separate benchmark runs, each with multiple data points to look at.
Yeah, except when you disable MSAA you get the results that Ars got. Need a reminder? But it can be argued that without the bug we could see increased performance; that's why ALL forms of anti-aliasing should be disabled, as that removes the possibility of any supposed bug causing biased test results.
There's a 290X tying a 980 Ti with MSAA off. Do you have any excuses left that I haven't yet destroyed?
Both those times seem rather excessive to me, so I don't think it matters.
Performance tuning for any software I have been involved in, across my three decades in software development, happens in the last couple of months of development. Up until that point, stuff is in flux with feature additions, and refactoring is happening. Early optimisation just makes stuff harder to refactor and is probably pointless, as there will be interactions with features that aren't complete yet. You don't take your eye completely off the performance ball, as you want to make sure there aren't problems with your basic software architecture, but it shouldn't be a priority other than making sure you have benchmarking tools in place.
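That "benchmarking tools in place" point can be made concrete with a minimal sketch using Python's standard `timeit` module. The function names and the 1.5x tolerance are my own choices, not anything from the thread; the idea is simply a cheap regression guard you keep running while features and refactoring churn, deferring real tuning to the end.

```python
import timeit

def bench(fn, number=1000, repeat=5):
    """Time fn per call, taking the best of several repeats to cut scheduler noise."""
    return min(timeit.repeat(fn, number=number, repeat=repeat)) / number

def assert_no_regression(fn, baseline_seconds, tolerance=1.5):
    """Fail loudly if fn has drifted more than tolerance x past its baseline."""
    elapsed = bench(fn)
    if elapsed > baseline_seconds * tolerance:
        raise AssertionError(
            f"regression: {elapsed:.3g}s/call vs baseline {baseline_seconds:.3g}s"
        )
    return elapsed

# Example: track a hot function's cost as refactoring happens around it.
elapsed = bench(lambda: sorted(range(100), reverse=True))
print(f"{elapsed:.2e} s/call")
```

The design choice here matches the post: you aren't optimising early, just making sure the architecture never silently falls off a cliff before the final tuning pass.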
So I would say anyone who was there for the last three months was there in time for the performance work.
Bug reporting, well, that is different. You always hope you don't have to chase companies with "why does it do xxxx when I asked it to do yyyy" questions, and I get the impression there is a lot of that with graphics, but that isn't about performance.
One thing that doesn't seem to have been discussed: traditionally the bulk of graphics cards sales are at the low end. I don't think DX12 is relevant down there, so sadly I doubt that is a magic bullet for AMD unless it gets them some halo effect from the high end cards looking better.
No it wasn't; MSAA was disabled, not ALL forms of anti-aliasing.
Already provided you with that but you've conveniently ignored that.
If that post isn't enough proof for you, just go to the Ashes of the Singularity web page and scroll down.
Oxide Games was formed in October 2013, when they also announced the following...
6-12+ months being May 2015, when AMD abandoned development of the API that Oxide's Nitrous engine was based on, after which they started porting it to DX12. "As the sole industry provider of technologies powering both PCs and major next-generation consoles, AMD is a natural fit for Oxide's Nitrous engine, an evolutionary leap in PC and console gaming development," said Ritche Corpus, director of ISV gaming and alliances, AMD. "Oxide's Nitrous engine supports thousands of high-detail animated models on screen simultaneously. Nitrous makes tomorrow's designs come to life on today's top hardware like AMD's new AMD Radeon™ R9 Series GPUs, unrivaled APUs and powerful CPUs."
What part of "The fake image you posted has nothing to do with MSAA" do you not understand?
As I have now said on more than one occasion, I questioned your claim that DX12 is based on Mantle; you posted an image from some Photobucket account in an attempt to back up those claims, and that is the image I'm saying is fake, the one you failed to provide a source for. NOT the MSAA image, but this fake image that you posted.
^^ That is the fake image you posted in an attempt to back up your claims that DX12 is a rebadged Mantle.
No I don't, but you certainly need reminding that Dan Baker, co-founder of Oxide Games, said "Some optimizations that the drivers are doing in DX11 just aren’t working in DX12 yet."
And even though Ars disabled MSAA, they did not (AFAIK) disable ALL forms of anti-aliasing. Without disabling ALL forms of anti-aliasing you're introducing a performance cost that MSAA was specifically designed to avoid: MSAA can (AFAIK) provide similar quality at higher performance, or better quality for the same performance, so if it's not working correctly you take a performance hit to get the same quality from other forms of AA.
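The MSAA trade-off being argued about can be shown with a back-of-envelope cost model (my own simplification, not from any of the linked articles): full supersampling (SSAA) runs the pixel shader for every sample, while MSAA shades once per covered pixel and only multiplies the coverage samples it stores and resolves. That is why MSAA offers comparable edge quality at much lower shading cost.

```python
def aa_cost(pixels, samples, mode):
    """Rough per-frame cost model for anti-aliasing (illustrative only).

    Returns (shader_invocations, samples_stored). MSAA runs the pixel
    shader once per pixel but stores N coverage samples; supersampling
    (SSAA) runs the shader for every sample.
    """
    if mode == "ssaa":
        shading = pixels * samples
    elif mode == "msaa":
        shading = pixels
    else:  # no AA at all
        shading = pixels
        samples = 1
    return shading, pixels * samples

# At 1080p with 4x AA, MSAA needs a quarter of SSAA's shader work for
# comparable edge quality, which is the whole point of the technique.
px = 1920 * 1080
print(aa_cost(px, 4, "msaa"), aa_cost(px, 4, "ssaa"))
```

Under this model, the post's argument reads as: if MSAA is buggy and you fall back to heavier AA for the same quality, the shading column grows and the comparison is no longer apples to apples.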
I've already explained why those results may not be valid.
Look, it's obvious you're happy to believe that AMD and everything they do is without fault, and that you and the other less than 20% of people are the only ones who truly understand the GPU market; you can spread misinformation, claim that someone is a fanboi even though they support what AMD are doing, and accuse people of being faux tech "enthusiasts" and sheeple.
I can, and have, proved you wrong on many of the "points" you've made, but it's obvious to anyone reading this thread that you lack any and all objectivity, and no matter how many times people disprove what you've said, that's not going to change. Feel free to spout more nonsense, as you'll get no more replies from me; I'm bored of listening to the rantings of a madman trying to validate his distorted view of the world.
Yeah, apologies for that; it was more a response to a madman who claimed they both had the same time, when it's obvious they didn't.
But I agree 6-12+ months should have been enough, maybe not as good as 30+ months, but still enough.
If anything I think it is more relevant at the lower end, as people don't tend to have as good CPUs, and don't overclock. It was one of the areas where Mantle showed an improvement, like here:
http://media.bestofmicro.com/A/Z/442.../Thief-Low.png
That's with a low-end graphics card too. People tend to upgrade their cards more often over the years than their CPU, meaning DX12 should help; plus remember we have a process node shrink next year, meaning the mid-range should get a nice jump in performance too.
If anything, if some online games start getting DX12, I can see it being VERY helpful, as they tend to be massively bottlenecked by a single thread in many cases.
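The single-thread bottleneck point can be sketched in miniature. One of DX12's headline changes is that command lists can be recorded on many threads and then submitted in order on one queue, instead of funnelling all draw-call work through a single driver thread as in DX11. The sketch below models that shape in plain Python (all names are mine; real command-list recording is done through the D3D12 API, not strings):

```python
from concurrent.futures import ThreadPoolExecutor

def record_command_list(draw_calls):
    """Stand-in for recording a deferred command list on one worker thread."""
    return [("draw", d) for d in draw_calls]

def parallel_frame(draw_calls, workers=4):
    """DX12-style frame: split draws into chunks, record command lists in
    parallel, then submit them from the single queue-owning thread."""
    chunks = [draw_calls[i::workers] for i in range(workers)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        lists = list(pool.map(record_command_list, chunks))
    # Submission stays serial and cheap; the expensive recording was parallel.
    return [cmd for command_list in lists for cmd in command_list]

frame = parallel_frame(list(range(10_000)))
print(len(frame))  # 10000
```

For a CPU-bound online game, spreading the recording work like this is exactly the relief being described: the one hot thread stops being the ceiling on frame rate.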
Last edited by CAT-THE-FIFTH; 27-08-2015 at 02:02 PM.
I think you are right that it will help with performance Cat, which can only be a good thing, but I still don't think that will translate into improved sales. It seems to be all about how many GB of RAM a card has for the money.
Corky34, I also spanked him in a discussion regarding AMD vs Nvidia; he just chose to ignore my valid points and legitimate reasoning, regardless of how many times I asked him to counter, because he did not have a reasonable comeback.
Sadly, fanbois like him cannot see anything objectively. I don't hate AMD at all, but they are getting beaten for a reason, and not the stupid fanboi reasons Jimbo likes to come up with, like marketing and 80% of the GPU-buying public being idiots. Laughable, really.
AMD may very well have a performance advantage with DX12, but it's far too early to tell; one benchmark is nowhere near enough data to draw conclusions from. We need to wait until there are at least five released games to see who has the advantage, and by then there will probably be a new round of graphics cards out. If history is anything to go by, with almost all DX releases the first wave of cards was generally too slow by the time developers took full advantage of the API and developed properly for it.
Last edited by GrimMachine; 27-08-2015 at 09:27 PM.
Objectively, I don't think the market share really reflects the products that have been released, though. The HD 2000 series was poor, and the HD 3000 was too little, too late, but since the HD 4000 series it hasn't been one-sided. From a product perspective, there were rounds where AMD was leading in one key aspect or another (if not absolute performance, then cost/performance), and many cases where it was too close to call, especially once you take cost into account. Yet the market share was never remotely close.
And let's look at the CPUs. AMD's greatest success from a product perspective is probably the Athlon series. Faster and cheaper than the competition, and held onto the crown for a pretty long time by CPU standards. And yet their market share made little progress.
It does make nVidia's achievement impressive. Back when the market for graphics cards was a lot more crowded, 3dfx had the brand behind them, yet they not only lost the crown but didn't even survive. In contrast, Matrox, S3 and PowerVR managed to survive by refocussing on other markets. Speaking of Matrox, if not for their reputation before 3D accelerators became the thing, I am not sure they would have survived to this day. Anyway, nVidia really managed to turn the market upside down and become the new king ("The way it's meant to be played"), and I have to give them credit for that. AMD, to this date, has been unable to achieve that sort of impact despite having some very good products at times. Not sure what they are doing wrong, but it is clearly not just about the product.
Food for thought: I wonder how much of the market share is decided by us enthusiasts, as opposed to makers like Dell, HP, Lenovo etc. If most discrete graphics cards are sold in pre-built systems, then the card with the most presence on a maker's site will probably have an edge. It isn't about the public "being idiots", but about the public having better things to do than care about what we might.
And as to which card the makers decide to put in which system, I am pretty sure it comes down to more than the benchmarks we spend time analysing. And by that I am not implying anything sinister, just business and marketing.
You should have PM'd that bit - pointless and doesn't add to the discussion.
Now here I'm going to agree. Heck, as has been said above Microsoft didn't exactly drop DX12 "professionally" and it's (by all accounts) still pretty darned ropey.
Good post.
You've pretty much nailed it: for all our loudness here, we're only a tiny part of the "buying public", and the overwhelming majority will just take what the manufacturers give them. And let's not kid ourselves, AMD have done pretty well with their console offerings and also the very low-end APU stuff. The problem is that NVidia have pretty much solid mindshare on the higher-end bundled offerings; take a trip to Dell, HP, Lenovo, etc. and you'll see that their high-end desktops and laptops (those with discrete GPUs) are pretty much exclusively an Intel+NVidia combo.
Now, as someone who's currently running an AMD+AMD setup (CPU+GPU), I'd love to mislead myself into thinking "ooh, big conspiracy"; after all, it's not as if Intel doesn't have "form" in that area... Nope, AMD's problems are simply being "too late", and that Intel/NVidia are better at marketing than they are. Look at the high-end CPUs: Zen isn't due until next year, by which time Intel will undoubtedly match it. Likewise on GPUs, NVidia seems to have a "sausage machine" of new products, and I'm sorry to say there's a group of (vocal!) folks out there who'll always pick the newest, serial upgraders as it were.
AMD products are pretty price competitive (always have been) but that's no help if they're seen as inefficient, hard to live with (noisy), low end, poorly supported (drivers and apps) or based on "last years tech". As seems usual with big companies these days the "bean counters" have utterly failed to appreciate that their R&D department is an "asset" not an "overhead". They've culled the "geeks" and now the cupboard is bare.
Some good points.
I agree that the market share doesn't reflect the products; as you said, AMD have competed closely with Nvidia throughout and have had the better product at times. I also believe marketing has some impact (my short statement above has history with Jimbo, as he claims it's all Nvidia brainwashing marketing and all Nvidia buyers are idiots); clearly Nvidia is the bigger company with a far larger marketing budget, which will definitely contribute, especially with people buying pre-built PCs through big manufacturers like Dell etc.
But Nvidia has been building a very strong consumer base through solid products and driver support, and in the last few years has grown the gap due to AMD not releasing as many, or as good, GPUs. Additional features like ShadowPlay help with this choice: if it's 50/50 on performance and one product offers more extras than the other, most consumers will buy that product, regardless of how much they actually use those features.
Going back to the release of the 290 and 290X: the reference boards ran too hot, and AMD blocked the release of third-party custom boards for the first six months, which hurt sales as people (myself included) waited for the custom boards to arrive. Nvidia dropped the price of the 780 Ti and 780, making it an easy decision to buy one of them instead. Greed got the better of AMD there; nobody wanted to buy a 290X running at 94°C! That was two years ago, and AMD's market share wasn't as low as it is now. Since then they released nothing to compete against the 9xx series until the Fury X, nine months later, and still not as fast or as competitively priced as Nvidia's offerings. They have clearly lost ground over the last two years, ground they couldn't afford to lose given they were already down on market share. It's an ever-changing environment in which both companies need to be relentless in their releases and feature sets, and sadly AMD has slipped over the last couple of years in that respect.
Then there are driver updates. There are many opinions on this one, with reports of both AMD and Nvidia drivers having issues (and rightly so); however, in my experience (3 PCs: one with 2 x 970 in SLI driving 3 x 1080p in surround, one with a 680, and one with an AMD 7970), AMD driver releases are slow, especially for newer titles. The wait for an optimised driver from AMD can run to two months, while Nvidia generally get a driver out within a week. Sometimes it doesn't make any difference, as the game runs well enough on my 7970 not to be an issue, but I have found several high-profile releases where AMD have been incredibly slow out of the blocks to release a decent driver.

The cynics will tell you that's Nvidia's fault, that they are nerfing AMD performance with things like GameWorks (even though GameWorks extras can be turned off, so I don't buy that argument), but if that were true, surely a company like AMD would work even harder to release a decent driver in a timely manner to counter any performance issues found. They appear not to bother most of the time, generally releasing drivers when they see fit, regardless of performance in new game releases. There is no smoke without fire; AMD have a bad rep with drivers for a reason. Again, it's not the conspiracy that AMD fanbois try to claim; it's a lot of people's experiences, and those experiences define what they buy next. Nvidia are no saints when it comes to drivers either (we have seen some bad releases over the years), but the speed of Nvidia's releases generally means that, for me anyway, the issues are short-lived, especially when it comes to game-specific performance.
I would hate to see AMD disappear, as competition and choice in the marketplace are a good thing, and hopefully with DX12 AMD can produce better performance than we have seen with DX11 and ultimately claw back more market share in the future.
I always buy what I consider to be the best product for my budget, comparing price, performance, power consumption (which generally indicates how hot it will run and how noisy it will be), overclocking headroom, additional features I will actually use, and overall system compatibility (size of card in the designated case etc.). If AMD's product, or anyone else's, is better than the competition, that's the one I will buy. I don't need an affinity to any product or brand; what's the point? All people are doing is restricting their own buying power, which is ultimately neither cost-effective nor sensible when building PCs.
To say AMD GPUs are inferior is a little misleading (IMHO), as how good a GPU is depends on the type of work it's best suited for and what you're asking it to do. It's generally accepted that at present GCN is better at handling parallel workloads and Maxwell better with serial workloads; if the parallel nature of DX12 had been introduced in DX11 we would have a very different situation, and it's why (IMO) AMD was forced to develop an API that would leverage the parallel nature of their GPUs.