
Thread: AMD discrete GPU market share eroded to less than 20 per cent

  1. #97
    Senior Member
    Join Date
    Dec 2013
    Posts
    3,526
    Thanks
    504
    Thanked
    468 times in 326 posts

    Re: AMD discrete GPU market share eroded to less than 20 per cent

    Quote Originally Posted by Jimbo75 View Post
    First of all "having to rely on DX12"? DX12 is Mantle rebadged. If not for AMD you wouldn't have DX12, so this "relying" stuff starts off on the wrong foot.
    Firstly, DX12 is not a rebadged Mantle; AMD donated the Mantle code to the Khronos Group. That's not to say the development of Mantle had nothing to do with forcing Microsoft to pull their finger out and start development on DX12, it's just to say that the two APIs are exactly that: different ways of addressing the same problem.

    Secondly, the developers of the Oxide Nitrous engine, the engine that Ashes of the Singularity uses, worked very closely with AMD on the development of Mantle for years, and vice versa. When it came to porting it over to DX12, most of the heavy lifting had already been done, on an AMD API and AMD cards. Is it any surprise that AMD cards see big increases in performance on a benchmark that they helped develop?

    Quote Originally Posted by Jimbo75 View Post
    Secondly, Nvidia optimised the DX11 path because they'd rather you were all stuck on backward technology. Problem is that just shows how awful their cards will be on DX12, with regression in performance, so it kinda backfired.
    How has it backfired? IIRC the Ashes of the Singularity benchmarks, the ones that disabled AA because of the claimed bug, show that yes, AMD saw massive gains when going from DX11 to DX12, but it didn't magically make them faster than the equivalent offering from Nvidia; at best the AMD cards were within a few percentage points of their competitor.

    I mean we are talking about an R9 390X being at most 10% faster than a GTX 980 when running a DX12 benchmark, but the GTX 980 being over 95% faster when running the same benchmark in DX11.

    Given the choice between a card that's £50 cheaper but isn't very good with DX11, and is dependent on DX12 and game developers/engines to match the performance of the more expensive card, I know which card I would choose.
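    As an aside for anyone following the percentages being thrown around in this thread, here is a sketch of how "X% faster" figures fall out of raw benchmark scores. The frame rates below are made up purely for illustration; they are not taken from any review.

```python
# Hypothetical frame rates, chosen only to illustrate the arithmetic;
# they are not the actual Ashes of the Singularity review numbers.
def percent_faster(a_fps, b_fps):
    """How much faster card A is than card B, as a percentage."""
    return (a_fps / b_fps - 1.0) * 100.0

dx12 = {"r9_390x": 44.0, "gtx_980": 40.0}  # assumed DX12 scores
dx11 = {"r9_390x": 20.0, "gtx_980": 39.0}  # assumed DX11 scores

# In this made-up DX12 case the 390X comes out 10% ahead...
print(round(percent_faster(dx12["r9_390x"], dx12["gtx_980"]), 1))

# ...while in the made-up DX11 case the 980 is 95% faster, which is the
# same data point as the 390X being roughly 49% slower. "A is N% faster
# than B" and "B is N% slower than A" are not the same statement.
print(round(percent_faster(dx11["gtx_980"], dx11["r9_390x"]), 1))
```

    Note the asymmetry: the two phrasings use different baselines, which is why "over 95% faster" and "about half the frame rate" can describe the same result.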

    Quote Originally Posted by Jimbo75 View Post
    There's a difference between couldn't and shouldn't. Maybe the best way to look at this is that AMD wrote *THE* driver. You know, Mantle? The thing that got rid of the need for hardware developer drivers? All the while Nvidia did what to progress the industry?
    You're conflating the two: Mantle is not a hardware driver, it's an API, and the two are very, very different. Having Mantle or any other API doesn't mean you don't still need hardware drivers, and developers writing and optimising the code for said drivers.

    Quote Originally Posted by Jimbo75 View Post
    GPUs are built for parallel processing, not serial. DX11 is based on 20 year old technology and is horribly single-threaded. DX12 is the first real major change in decades. Game developers are supposed to be developing games - hardware makers are not supposed to be trying to fix problems with an ancient API.
    GPUs are built however the manufacturer decides they should be built, be that for parallel or serial processing, and those decisions are taken by looking into a crystal ball and designing your hardware in a way that works best with how you see the future panning out. AMD gambled on a future ecosystem with a heavy dependency on parallel processing, Nvidia gambled on serial processing; it turned out Nvidia's gamble paid off and AMD had to force the issue.
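    To make the single-threaded-submission argument in this exchange concrete, here is a toy model in plain Python. It has nothing to do with either real API's actual code: a "DX11-style" driver funnels all draw recording through one context, while a "DX12-style" engine can record command lists on several threads and submit them together. All names and numbers here are illustrative.

```python
# Toy model of the submission difference being argued about. Recording a
# command here just builds a string; in a real engine it would encode GPU
# work, and the per-call CPU cost is what the newer APIs spread across cores.
from concurrent.futures import ThreadPoolExecutor

DRAWS = 1000

def record_commands(thread_id, n):
    """Pretend to record n draw commands on one thread."""
    return [f"draw:{thread_id}:{i}" for i in range(n)]

# "DX11-style": a single thread records everything.
serial_cmds = record_commands(0, DRAWS)

# "DX12-style": four threads each record a quarter of the work as separate
# command lists, which are then submitted to the queue in one go.
with ThreadPoolExecutor(max_workers=4) as pool:
    parts = list(pool.map(lambda t: record_commands(t, DRAWS // 4), range(4)))
parallel_cmds = [cmd for part in parts for cmd in part]

# Same total workload reaches the GPU either way; only the CPU-side
# recording is distributed differently.
assert len(serial_cmds) == len(parallel_cmds) == DRAWS
```

    The point of the sketch is only structural: the total work submitted is identical, but in the second case no single CPU core has to record all of it.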

    I can't argue with DX11 being an old technology; DirectX became old technology when Microsoft decided to get into the console market over 15 years ago, but that's another story.

    Quote Originally Posted by Jimbo75 View Post
    That's all relative. A 980 Ti on DX11 is only as "quick" as a 290X in DX12.
    Are those results with AA off, or on? Because with AA off we get a very different picture...
    Quote Originally Posted by PCPer
    Finally, this graph attempts to show which card has an advantage on each processor with each setting and with each API. We are using the R9 390X as the baseline score, meaning that any positive percentage result means the GTX 980 is faster while a negative score means the R9 390X is faster.
    Last edited by Corky34; 22-08-2015 at 02:43 PM.

  2. #98
    Senior Member
    Join Date
    Mar 2009
    Posts
    780
    Thanks
    30
    Thanked
    49 times in 38 posts

    Re: AMD discrete GPU market share eroded to less than 20 per cent

    Quote Originally Posted by Corky34 View Post
    Firstly, DX12 is not a rebadged Mantle; AMD donated the Mantle code to the Khronos Group. That's not to say the development of Mantle had nothing to do with forcing Microsoft to pull their finger out and start development on DX12, it's just to say that the two APIs are exactly that: different ways of addressing the same problem.
    You don't just suddenly develop an API like Microsoft supposedly did. An API that is, to all intents and purposes, identical to Mantle? Come on, you're fooling nobody.

    Secondly, the developers of the Oxide Nitrous engine, the engine that Ashes of the Singularity uses, worked very closely with AMD on the development of Mantle for years, and vice versa. When it came to porting it over to DX12, most of the heavy lifting had already been done, on an AMD API and AMD cards. Is it any surprise that AMD cards see big increases in performance on a benchmark that they helped develop?
    From Oxide - http://www.oxidegames.com/2015/08/16...-of-a-new-api/

    Being fair to all the graphics vendors

    Often we get asked about fairness, that is, usually in regards to treating Nvidia and AMD equally? Are we working closer with one vendor than another? The answer is that we have an open access policy. Our goal is to make our game run as fast as possible on everyone’s machine, regardless of what hardware our players have.

    To this end, we have made our source code available to Microsoft, Nvidia, AMD and Intel for over a year. We have received a huge amount of feedback. For example, when Nvidia noticed that a specific shader was taking a particularly long time on their hardware, they offered an optimized shader that made things faster which we integrated into our code.
    I mean we are talking about an R9 390X being at most 10% faster than a GTX 980 when running a DX12 benchmark, but the GTX 980 being over 95% faster when running the same benchmark in DX11.

    Given the choice between a card that's £50 cheaper but isn't very good with DX11, and is dependent on DX12 and game developers/engines to match the performance of the more expensive card, I know which card I would choose.
    Except we already know that the 980 and 390X are basically tied in DX11 performance so that doesn't wash.



    That's over 20-odd games, not one. With a single proper DX12 benchmark showing terrible performance, the onus is on Nvidia to prove that it's not their fault. Their response?

    Bleated and lied about an MSAA bug that doesn't exist.

    You're conflating the two: Mantle is not a hardware driver, it's an API, and the two are very, very different. Having Mantle or any other API doesn't mean you don't still need hardware drivers, and developers writing and optimising the code for said drivers.
    So what has Nvidia been doing with their DX12 drivers up till now? Shouldn't they be working on those instead of the outgoing DX11? They've had A YEAR and all they have to show for it is a performance regression.

    Are those results with AA off, or on ? Because with AA off we get a very different picture...

    Different how? The 390X still wins, even though it has an even more massive DX11 gap to overcome? What about when the DX11 gap is lower, is AMD going to be a mile ahead?
    Last edited by Jimbo75; 22-08-2015 at 03:29 PM.

  3. #99
    Senior Member
    Join Date
    Dec 2013
    Posts
    3,526
    Thanks
    504
    Thanked
    468 times in 326 posts

    Re: AMD discrete GPU market share eroded to less than 20 per cent

    Quote Originally Posted by Jimbo75 View Post
    You don't just suddenly develop an API like Microsoft supposedly did. An API that is, to all intents and purposes, identical to Mantle? Come on, you're fooling nobody.
    You do know Microsoft have over 10x the staff of AMD right?
    And you do know that DirectX is a proprietary API that only runs on Microsoft operating systems, versus Vulkan (aka what's left of Mantle 1.0), which is open and portable across multiple platforms. Heck, if you don't believe me just read the Wiki or search Google. TBH, saying that you think DX12 is derived from Mantle shows a fundamental lack of understanding on the subject.

    Quote Originally Posted by Jimbo75 View Post
    If you had done your research you would see that even though the source code has been available for over a year (apparently), that source code was, until 6 months ago, entirely geared towards AMD's Mantle. It would have been of little interest to anyone but AMD; the company themselves even say they've been working closely with AMD since their inception over two years ago.

    Quote Originally Posted by Jimbo75 View Post
    That's over 20 odd games, not 1. With a single proper DX12 benchmark showing terrible performance, the onus is on Nvidia to prove that it's not their fault. Their response?
    Yet we're not comparing over 20-odd games are we, we're comparing how two graphics cards perform in a single benchmark, like for like, and not an aggregate. If we had 20 different DX12 games then maybe we could draw a better conclusion; maybe with a greater pool of tests to draw from we would see a different outcome. But we don't, we can only base our conclusion on this one benchmark. Maybe it favors AMD cards, maybe it favors Nvidia cards; we won't know until we have more than a single data point.

    Haven't we already established that at best WCCFTech is an unreliable source, and at worst they publish clickbait articles?
    Either way, if one party claims there's a bug, the only way not to get into a he said, she said situation is to remove that supposed bug from your test; that way you place both parties on an equal footing and avoid complaints that the test could be biased.

    Quote Originally Posted by Jimbo75 View Post
    So what has Nvidia been doing with their DX12 drivers up till now? Shouldn't they be working on those instead of the outgoing DX11? They've had A YEAR and all they have to show for it is a performance regression.
    And you're going to base your opinion on a single benchmark? Maybe when you have more than a single data point you can start to draw some type of conclusion; until then, making a judgment on how a driver performs seems a little fanboyish.

    Quote Originally Posted by Jimbo75 View Post
    Different how? The 390X wins still even though it has an even more massive DX11 gap to overcome? What about when the DX11 gap is lower, is AMD going to be a mile ahead?
    And conversely, what about when we get more than a single benchmark, which may show the gap is larger, or smaller, for either DX11 or DX12?
    You seem to be basing your entire argument on a single benchmark that for four fifths of its existence was designed to work specifically with an API made to run on a single make of GPU.

  4. #100
    Senior Member
    Join Date
    Mar 2009
    Posts
    780
    Thanks
    30
    Thanked
    49 times in 38 posts

    Re: AMD discrete GPU market share eroded to less than 20 per cent

    Quote Originally Posted by Corky34 View Post
    You do know Microsoft have over 10x the staff of AMD right?
    That didn't help them get DX12 out before Mantle though, did it? Ever heard of Brooks' Law?

    And you do know that DirectX is a proprietary API that only runs on Microsoft operating systems, versus Vulkan (aka what's left of Mantle 1.0), which is open and portable across multiple platforms. Heck, if you don't believe me just read the Wiki or search Google. TBH, saying that you think DX12 is derived from Mantle shows a fundamental lack of understanding on the subject.
    I didn't say it WAS Mantle, I said it was basically a direct copy of Mantle. Obviously MS has their own policies that they follow, however the guts of the APIs are identical. You actually believe that MS just managed to suddenly throw out DX12 in a few months and somehow by sheer luck made it basically identical to Mantle?

    Have you seen this?



    They went and changed the status from not needing a reboot to requiring one. Wow, what an awesome brand-spanking all-new shiny API MS and their 10x manpower developed!

    If you had done your research you would see that even though the source code has been available for over a year (apparently), that source code was, until 6 months ago, entirely geared towards AMD's Mantle. It would have been of little interest to anyone but AMD; the company themselves even say they've been working closely with AMD since their inception over two years ago.
    I'm well aware that the game started off on Mantle; that was what made them consider it in the first place. The point is DX12 has been available for over a year and all Nvidia could do was make stuff worse than on DX11. Explain to me why they got the performance they did on DX11 but not on DX12, if it's all down to Oxide and AMD working closely together? Did you even read the previous link I gave you, where they clearly explained that they even changed their shaders to suit Nvidia?

    Yet we're not comparing over 20-odd games are we, we're comparing how two graphics cards perform in a single benchmark, like for like, and not an aggregate. If we had 20 different DX12 games then maybe we could draw a better conclusion; maybe with a greater pool of tests to draw from we would see a different outcome. But we don't, we can only base our conclusion on this one benchmark. Maybe it favors AMD cards, maybe it favors Nvidia cards; we won't know until we have more than a single data point.
    And we're not comparing AMD's DX11 performance are we? I don't recall anyone ever asking to see DX7 benchmarks in DX11 games. Why on Earth would AMD bother to optimise for an outgoing, backward API that doesn't suit their architecture? Well, for the same reason Nvidia *would*, funnily enough! Which is exactly what I've been saying all along about Nvidia's architectures. No legs, built for DX11, and no future. That's what whatever Kepler/Maxwell card you bought has.

    Haven't we already established that at best WCCFTech is an unreliable source, and at worst they publish clickbait articles?
    Is Extremetech the same? They found the same results regarding MSAA. Nothing found except Nvidia tears.

    What did Oxide say about this so-called bug?

    Unfortunately, we have to make some corrections because as always there is misinformation. There are incorrect statements regarding issues with MSAA. Specifically, that the application has a bug in it which precludes the validity of the test. We assure everyone that is absolutely not the case. Our code has been reviewed by Nvidia, Microsoft, AMD and Intel. It has passed the very thorough D3D12 validation system provided by Microsoft specifically designed to validate against incorrect usages. All IHVs have had access to our source code for over a year, and we can confirm that both Nvidia and AMD compile our very latest changes on a daily basis and have been running our application in their labs for months. Fundamentally, the MSAA path is essentially unchanged in DX11 and DX12. Any statement which says there is a bug in the application should be disregarded as inaccurate information.

    So what is going on then? Our analysis indicates that any D3D12 problems are quite mundane. New API, new drivers. Some optimizations that the drivers are doing in DX11 just aren’t working in DX12 yet. Oxide believes it has identified some of the issues with MSAA and is working to implement workarounds in our code. This in no way affects the validity of a DX12 to DX12 test, as the same exact workload gets sent to everyone’s GPUs. This type of optimization is just the nature of brand new APIs with immature drivers.
    Pure opportunism by Nvidia, blaming their crap performance on this when other websites found no such issue. There's a reason why Nvidia are a better marketing company than a tech company though; they never miss a chance.

    Either way, if one party claims there's a bug, the only way not to get into a he said, she said situation is to remove that supposed bug from your test; that way you place both parties on an equal footing and avoid complaints that the test could be biased.
    Or just test it with the supposed "bug" and find no difference? Now my Czech isn't so good but I think I can figure out what they're doing here - http://diit.cz/clanek/ashes-singular...x-12-benchmark



    Not one website has found any evidence of this so-called bug. Not one. But it's Oxide, MS, WCCFTech who are lying and poor little Nvidia who is being unfairly picked on right?

    And you're going to base your opinion on a single benchmark? Maybe when you have more than a single data point you can start to draw some type of conclusion; until then, making a judgment on how a driver performs seems a little fanboyish.
    No there have been plenty of benchmarks and plenty of discussion about AMD's known DX12 advantages. We've seen Titan X get destroyed in draw calls by Fury X



    That was one to ignore too right? Well now there are 2 clear benchmarks showing Nvidia's lack of DX12 suitability.
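    For context on what a draw-call benchmark like that actually measures: synthetic API-overhead tests just issue as many cheap calls as possible in a fixed window and report calls per second. A rough sketch of the methodology, timing a no-op Python function rather than real draw calls, so the absolute numbers mean nothing:

```python
# Rough analogue of a synthetic "API overhead" test: hammer a cheap call
# for a fixed duration and report the sustained call rate. This measures
# Python function-call overhead, not GPU draw calls; it only illustrates
# why such tests stress a single aspect of the stack.
import time

def noop_draw():
    pass  # stands in for a minimal draw call

def calls_per_second(duration=0.1):
    """Count how many no-op calls complete in `duration` seconds."""
    end = time.perf_counter() + duration
    count = 0
    while time.perf_counter() < end:
        noop_draw()
        count += 1
    return count / duration

rate = calls_per_second()
print(f"~{rate:.0f} calls/sec")
```

    Because the workload is a single tight loop of trivial calls, a huge win here doesn't automatically translate into higher frame rates in a real game, which is exactly the dispute in this thread.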
    Last edited by Jimbo75; 22-08-2015 at 06:37 PM.

  5. #101
    Goron goron Kumagoro's Avatar
    Join Date
    Mar 2004
    Posts
    3,147
    Thanks
    37
    Thanked
    170 times in 139 posts

    Re: AMD discrete GPU market share eroded to less than 20 per cent

    Isn't one of the points shown by the benchmarking that AMD didn't optimise the code path in their drivers, so you can see the difference between DX11 and 12, whereas Nvidia had done code path optimisations for DX11?

  6. #102
    Senior Member
    Join Date
    Mar 2009
    Posts
    780
    Thanks
    30
    Thanked
    49 times in 38 posts

    Re: AMD discrete GPU market share eroded to less than 20 per cent

    Quote Originally Posted by Kumagoro View Post
    Isn't one of the points shown by the benchmarking that AMD didn't optimise the code path in their drivers, so you can see the difference between DX11 and 12, whereas Nvidia had done code path optimisations for DX11?
    It's possible (probable even) that AMD didn't bother to optimise for DX11 at all, either due to lack of resources or because they wanted people to look at it and see how much of an improvement DX12 would mean for their cards. Marketers work in mysterious ways; they market around the average person's mentality, not enthusiasts'. It's quite possible that the average tech buyer looks at this and sees AMD making massive gains in DX12 while Nvidia goes backwards; they probably don't even care that AMD's DX11 is terrible because they'll be using DX12.

    I guess Nvidia knew their DX12 performance was going to be bad due to hardware limitations so they've made the best they can with DX11. The problem with DX11 and the benchmarks we've seen is that they don't show minimums yet, and even though Nvidia's DX11 numbers look good I would bet anything that they suffer horrendous drops in minimum framerate. We've seen the difference Mantle made to minimums in plenty of games so far, DX12 will be the same.

  7. #103
    Senior Member
    Join Date
    Dec 2013
    Posts
    3,526
    Thanks
    504
    Thanked
    468 times in 326 posts

    Re: AMD discrete GPU market share eroded to less than 20 per cent

    Quote Originally Posted by Jimbo75 View Post
    That didn't help them get DX12 out before Mantle did it? Ever heard of Brooks' Law?
    Ever heard of a console made by Microsoft called the Xbox?
    Why do you think AMD even bothered with Mantle? Why do you think they gave the code away when Microsoft finally bothered to update their gaming API to better serve more than a single CPU core?

    Quote Originally Posted by Jimbo75 View Post
    I didn't say it WAS Mantle I said it was basically a direct copy of Mantle. Obviously MS has their own policies that they follow however the guts of the APIs are indentical. You actually believe that MS just managed to suddenly throw out DX12 in a few months and somehow by sheer luck made it basically identical to Mantle?
    Well actually, you said DX12 is a rebadged Mantle. The generally accepted meaning of rebadging is taking an already existing product and applying your own brand, logo, or badge, so yes, it does seem you said DX12 was a direct copy of Mantle.

    I don't believe Microsoft suddenly threw together DX12; what I believe is that they built on 20 years of development and took a further year to implement changes to better deal with multi-core CPUs.

    Quote Originally Posted by Jimbo75 View Post
    They went and changed status of not needing a reboot to requiring one. Wow, what an awesome brand-spanking all new shiny API MS and their 10x manpower developed!
    What has rebooting got to do with anything?
    And who said DX12 was a brand-spanking all-new shiny API? In all of this, have you not even learned what DX stands for? Microsoft have been using DirectX in one form or another since Windows 95/98.

    Quote Originally Posted by Jimbo75 View Post
    I'm well aware that the game started off on Mantle, that was what made them consider it in the first place. The point is DX12 has been available for over a year and all Nvidia could do was make stuff worse than on DX11. Explain to me why they got the performance they did on DX11 but not on DX12 if it's all down to Oxide and AMD working close together? Did you even read the previous link I gave you where they clearly explained that they even changed their shaders to suit Nvidia?
    DX12 has been available for over a year? They only announced it a year ago, and only released it last month.
    On what metric are you basing your claim that Nvidia made things worse in DX12 versus their DX11 performance? Because everything (excluding the supposedly buggy AA in AoS) shows they've made a slight improvement. Granted, they haven't improved as much as AMD, but then they were already starting from a fairly optimised code path in the first place.

    Quote Originally Posted by Jimbo75 View Post
    And we're not comparing AMDs DX11 performance are we? I don't recall anyone ever asking to see DX7 benchmarks in DX11 games? Why on Earth would AMD bother to optimise for an outgoing, backward API that doesn't suit their architecture? Well, for the same reason Nvidia *would* funnily enough! Which is exactly what I've been saying all along about Nvidia's architectures. No legs, built for DX11 and no future. That's whatever Kepler/Maxwell card you bought has.
    The reason we're not comparing AMD's DX11 performance is because it sucks, but you seem to be conveniently ignoring that to better fit this mindset you seem to have.
    It also seems a bit early to be saying DX11 has no legs and no future, unless you really think Windows 10 is going to be a roaring success? Even if DX12 does see widespread adoption, it's going to be years before we see any DX12-exclusive games. Until then myself, and probably 98% of the buying public, are going to make decisions based on REAL world performance and what is CURRENTLY available, and not pray to the gods that the planets align just right so their £50 cheaper GPU may perform better.

    Quote Originally Posted by Jimbo75 View Post
    Is Extremetech the same? The found the same results regarding MSAA. Nothing found except Nvidia tears.
    And as I have said many times now, the only way to avoid complaints that the test could be biased is to remove the supposed bug from your testing, not to play the game of he said, she said. But hey, if you're happy to spend the next decade trying to prove the unprovable then go right ahead; personally I choose not to focus on who said what but on what IS provable.

    Quote Originally Posted by Jimbo75 View Post
    Or just test it with the supposed "bug" and find no difference? Now my Czech isn't so good but I think I can figure out what they're doing here - http://diit.cz/clanek/ashes-singular...x-12-benchmark



    Not one website has found any evidence of this so-called bug. Not one. But it's Oxide, MS, WCCFTech who are lying and poor little Nvidia who is being unfairly picked on right?
    Sorry, but since when has a screenshot been evidence of a bug?
    Like I said, if one side says there's a bug in a particular feature, the only way to prevent criticism of your testing methods is to test with that feature off, or wait until the supposed bug is fixed.

    Quote Originally Posted by Jimbo75 View Post
    No there have been plenty of benchmarks and plenty of discussion about AMD's known DX12 advantages. We've seen Titan X get destroyed in draw calls by Fury X



    That was one to ignore too right? Well now there are 2 clear benchmarks showing Nvidia's lack of DX12 suitability.
    And now synthetic benchmarks measuring a single aspect of a GPU are to be taken as a measurement of REAL WORLD performance?

    You can buy whatever card from whomever you like, it's a free country and that's your choice, but 98% of the buying public are going to base their purchasing decision on REAL WORLD prices and performance, not on an alpha-release benchmark that was initially designed to work with a single manufacturer's cards, or synthetic benchmarks that measure a single aspect of a GPU, or any other of this bupkis you keep coming up with to justify your AMD fanboyism.

    Normal people are going to base their purchasing decision on the here and now, on REAL world games, REAL world reviews, REAL world prices, not on ifs, buts, and maybes.

  8. #104
    Senior Member
    Join Date
    Mar 2009
    Posts
    780
    Thanks
    30
    Thanked
    49 times in 38 posts

    Re: AMD discrete GPU market share eroded to less than 20 per cent

    Quote Originally Posted by Corky34 View Post
    Ever heard of a console made by Microsoft called the Xbox?
    Yes I have...what relevance does this supposedly have to Mantle exactly?

    Why do you think AMD even bothered with Mantle? Why do you think they gave the code away when Microsoft finally bothered to update their gaming API to better serve more than a single CPU core?
    AMD created Mantle as a way to equalise multi-core CPUs and more readily take advantage of compute features in GCN, like Async shaders. Why do you think they did it? To help Microsoft?

    Well actually, you said DX12 is a rebadged Mantle. The generally accepted meaning of rebadging is taking an already existing product and applying your own brand, logo, or badge, so yes, it does seem you said DX12 was a direct copy of Mantle.
    That's pretty much what happened yes, minus a few very minor changes to make it fit into MS's ecosystem.

    I don't believe Microsoft suddenly threw together DX12; what I believe is that they built on 20 years of development and took a further year to implement changes to better deal with multi-core CPUs.
    Is this the same MS that has 10x more human resources than AMD?

    What has rebooting got to do with anything?
    Do you have an inability to gather information from a graphic? Let me point it out then.



    On the left we have Mantle, on the right we have DX12. If you read the yellow highlighted parts you can see the identical features of both when it comes to error checking. The only difference is that with Mantle there is no need to reboot the system, while with DX12 there is. That's it. Now what do you think the chances are of that happening twice, independently of each other?

    And who said DX12 was a brand-spanking all-new shiny API? In all of this, have you not even learned what DX stands for? Microsoft have been using DirectX in one form or another since Windows 95/98.
    I quite clearly pointed out that DX12 was the biggest change in the API for 20 years. The reason for that is that DX12 is the first one that hasn't simply been an extension of what went before. It *is* a completely all-new, shiny API, because to all intents and purposes it *is* Mantle.

    DX12 has been available for over a year? They only announced it a year ago, and only released it last month.
    To *you* maybe, not to developers.

    On what metric are you basing your claim that Nvidia made things worse in DX12 versus their DX11 performance? Because everything (excluding the supposedly buggy AA in AoS) shows they've made a slight improvement. Granted, they haven't improved as much as AMD, but then they were already starting from a fairly optimised code path in the first place.
    In which cherry-picked benchmark did you see a "slight improvement"? I'm pretty sure that for every "slight improvement" I can find a less-slight regression.

    And as I have said many times now, the only way to avoid complaints that the test could be biased is to remove the supposed bug from your testing, not to play the game of he said, she said. But hey, if you're happy to spend the next decade trying to prove the unprovable then go right ahead; personally I choose not to focus on who said what but on what IS provable.
    Right, so every time one of the companies complains about their dire performance we should just switch off whatever feature is supposed to be broken? Does that include PhysX and CrapWorks too?

    Sorry, but since when has a screenshot been evidence of a bug?
    Like I said, if one side says there's a bug in a particular feature, the only way to prevent criticism of your testing methods is to test with that feature off, or wait until the supposed bug is fixed.
    There WAS NO bug; it was a lie manufactured by Nvidia. Just how green are you? Did you read what Extremetech said?

    For all the fuss about Oxide’s supposed MSAA bug, we expected to see Nvidia’s performance tank or some other evidence of a problem. Screenshots of DX12 vs. DX11 with 4x MSAA revealed no differences in implementation, as per Dan Baker’s blog post.
    Also...

    In DirectX 12, Nvidia’s 4x MSAA scores were 14.5% lower at 4K and 12% lower at 1080p. AMD’s results were 12% and 8.2% lower respectively. It’s not news to observe that AMD’s GPUs often take less of a performance hit with MSAA enabled than their Nvidia counterparts, so the fact that the DX12 API is marginally slower for Nvidia with 4x MSAA enabled than the highly optimized DX11 path doesn’t explain why Nvidia came out so strongly against Ashes of the Singularity or its MSAA implementation.
    So we have at least two tech sites independently verifying the lack of any such bug, MS's validation, and Oxide themselves telling it how it is. On the other side we have poor little Nvidia, who never tell any tales at all (how much memory in your 970 again?), claiming that the benchmark is rigged against them. I guess they just mean in DX12 though?
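    For anyone parsing the Extremetech percentages quoted above: an "N% lower" MSAA score just means score_msaa = score_no_msaa * (1 - N/100). The baseline of 100 below is arbitrary; only the percentage drops come from the quote.

```python
# Sketch of the "N% lower with 4x MSAA" arithmetic from the quoted article.
# The baseline score is arbitrary; only the percentage drops are quoted.
def msaa_score(no_msaa_score, pct_drop):
    """Score after an 'N% lower' drop is applied."""
    return no_msaa_score * (1.0 - pct_drop / 100.0)

baseline = 100.0            # arbitrary no-MSAA score, not a real result
nvidia_4k = msaa_score(baseline, 14.5)
amd_4k = msaa_score(baseline, 12.0)

# AMD keeps a larger share of its no-MSAA performance at 4K in the quoted
# comparison, which is the article's point about relative MSAA cost.
print(nvidia_4k, amd_4k)
```

    In other words, the quoted figures compare each card against its own no-MSAA score, not the two cards against each other.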

    Normal people are going to base their purchasing decision on the here and now, on REAL world games, REAL world reviews, REAL world prices, not on ifs, buts, and maybes.
    Normal people are sheep who buy on marketing. AMD guys always made the smarter choice because we know who the real tech company is.

  9. #105
    Senior Member
    Join Date
    Dec 2013
    Posts
    3,526
    Thanks
    504
    Thanked
    468 times in 326 posts

    Re: AMD discrete GPU market share eroded to less than 20 per cent

    Quote Originally Posted by Jimbo75 View Post
    Yes I have...what relevance does this supposedly have to Mantle exactly?
    Is your fanboyism affecting your ability to read what you've previously claimed, or is it selective memory maybe?

    You claimed that Microsoft having more staff than AMD didn't help them get DX12 out before Mantle. In case you haven't noticed, Microsoft relegated gaming on the PC to the minor leagues when they got into the console market. If you don't even have a basic understanding of why Mantle was needed in the first place then there's really no hope for you, is there?

    Quote Originally Posted by Jimbo75 View Post
    AMD created Mantle as a way to equalise multi-core CPUs and more readily take advantage of compute features in GCN, like Async shaders. Why do you think they did it? To help Microsoft?
    For real? You really think that's why AMD created Mantle? If that's true, why didn't AMD start development of Mantle a few years before GCN hit the market? Why didn't AMD have Mantle ready for when GCN rolled out some 4 years ago?

    AMD was forced into creating Mantle because Microsoft started to only develop new versions of DirectX in line with new generations of their beloved console; development of DirectX on the PC basically became tied to when Microsoft released a new console from 2002 onwards.

    Quote Originally Posted by Jimbo75 View Post
    That's pretty much what happened yes, minus a few very minor changes to make it fit into MS's ecosystem.
    And yet even your beloved AMD says that's not what happened.

    Quote Originally Posted by Jimbo75 View Post
    Is this the same MS that has 10x more human resources than AMD?
    Yes, and that year in development is certainly quicker than how long it took AMD to develop an API for a microarchitecture that they released over 4 years ago, a microarchitecture that was probably in development a good 5 years before it hit the market.

    Quote Originally Posted by Jimbo75 View Post
    Do you have an inability to gather information from a graphic? Let me point it out then.
    So you're expecting people to take a random image from who knows where as evidence? Have you never heard of Photoshop? Next time try providing a source for your supposed evidence.

    Quote Originally Posted by Jimbo75 View Post
    I quite clearly pointed out that DX12 was the biggest change in the API for 20 years. The reason for that is that DX12 is the first one that hasn't simply been an extension of what went before. It *is* a completely all-new, shiny API because to all intents and purposes it *is* Mantle.
    And how do you know DirectX has been rewritten from the ground up and hasn't just been another extension of what went before? How exactly do you know this?
    Again, provide some evidence, otherwise it just comes across as the ramblings of a fanboi.

    Quote Originally Posted by Jimbo75 View Post
    To *you* maybe, not to developers.
    Yeah, because developers are well renowned for not being able to leak information, ain't they? Even if they could keep it secret, how long do you think it takes to develop a game? If developers had access to the code before the announcement, why don't we have DX12 games being released now? There are only 6 games being developed that support DX12, and not a single game released with DX12 support.

    Quote Originally Posted by Jimbo75 View Post
    On which cherry picked benchmark did you see a "slight improvement"? I'm pretty sure that for every "slight improvement" I can find a "less-slight regression".
    So you can't provide that metric I asked for then?
    That metric you're basing your claim on, that Nvidia made things worse in DX12 versus their DX11 performance?

    Quote Originally Posted by Jimbo75 View Post
    Right so everytime one of the companies complains about their dire performance we should just switch off whatever feature is supposed to be broken? Does that include PhysX and CrapWorks too?
    If there's a claimed bug then yes; if it's a permanent feature then no. You do know the difference between a permanent feature and a bug, right?

    Quote Originally Posted by Jimbo75 View Post
    There WAS NO bug, it was a pure manufactured lie by Nvidia. Just how green are you? Did you read what Extremetech said?
    Sorry, but didn't I say the best way not to get into a he said, she said situation is to avoid getting into that situation in the first place?
    And no, I haven't read what Extremetech said because, after flicking through your previous posts, I can't find any links to Extremetech. Forgive me for not reading everything you said and only skimming to find links, but reading through your previous posts started giving me headaches.

    Quote Originally Posted by Jimbo75 View Post
    So we have at least 2 tech sites independently verifying the lack of such bug, MS verification and Oxide themselves telling it how it is. On the other side we have poor little Nvidia, who never tell any tales at all (how much memory on your 970 again?), claiming that the benchmark is rigged against them. I guess they just mean in DX12 though?
    Seriously, you still want to get into he said, she said? Didn't your mommy teach you anything?

    Even if, as you claim, at least 2 tech sites independently verified the lack of such a bug, you're still basing your opinion on a gaming benchmark that's still in alpha, that for 4/5ths of its life was developed for AMD's API, on AMD hardware, and in partnership with AMD themselves. The fact that AMD only manages to best the equivalent card from their competitor by 10-20% after all that time tells us more about AMD (IMHO) than their competitor; after all, their competitor probably only started to pay attention to AoS when AMD abandoned Mantle 6 months ago, when the developers started porting their game over to DX12.

    Quote Originally Posted by Jimbo75 View Post
    Normal people are sheep who buy on marketing. AMD guys always made the smarter choice because we know who the real tech company is.
    And if that isn't both insulting to everyone and showing your true fanboyism, IDK what is.

  10. #106
    Senior Member
    Join Date
    Mar 2009
    Posts
    780
    Thanks
    30
    Thanked
    49 times in 38 posts

    Re: AMD discrete GPU market share eroded to less than 20 per cent

    Quote Originally Posted by Corky34 View Post
    Is your fanboyism affecting your ability to read what you've previously claimed, or is it selective memory maybe?

    You claimed that Microsoft having more staff than AMD didn't help them get DX12 out before Mantle. In case you haven't noticed, Microsoft relegated gaming on the PC to the minor leagues when they got into the console market. If you don't even have a basic understanding of why Mantle was needed in the first place then there's really no hope for you, is there?
    So why are they doing it now? You're all over the place on this because you are trying to convince yourself of facts that don't exist.

    For real? You really think that's why AMD created Mantle? If that's true, why didn't AMD start development of Mantle a few years before GCN hit the market? Why didn't AMD have Mantle ready for when GCN rolled out some 4 years ago?
    Because contrary to your belief, creating a ground-breaking API like this isn't something that happens in a matter of months. Why doesn't Nvidia have Pascal ready for DX12? Why do they suck at VR when it's due in a couple of months time?

    AMD was forced into creating Mantle because Microsoft started to only develop new versions of DirectX in line with new generations of their beloved console; development of DirectX on the PC basically became tied to when Microsoft released a new console from 2002 onwards.
    So why wasn't Nvidia "forced" to make a new API? Where is theirs? Why did AMD tell devs to code on Mantle as preparation for DX12?

    So you're expecting people to take a random image from who knows where as evidence? Have you never heard of Photoshop? Next time try providing a source for your supposed evidence.
    So now you're claiming that tech websites are photoshopping fake evidence? Have you actually had a look at the crap you're saying in your vehement defence of Nvidia's morals? How much memory on that 970?

    Yeah, because developers are well renowned for not being able to leak information, ain't they? Even if they could keep it secret, how long do you think it takes to develop a game? If developers had access to the code before the announcement, why don't we have DX12 games being released now? There are only 6 games being developed that support DX12, and not a single game released with DX12 support.
    Because contrary to your ludicrous imaginings about software development, game development also takes a very long time.

    So you can't provide that metric I asked for then?
    That metric you're basing your claim on, that Nvidia made things worse in DX12 versus their DX11 performance?
    You can clearly see that Nvidia suffers more regressions than positives.





    http://www.computerbase.de/2015-08/d...ty-2560-x-1440

    Funny how the only place it isn't seen is your favourite site, PcPer? Ever thought about checking out some less biased sources?

    If there's a claimed bug then yes; if it's a permanent feature then no. You do know the difference between a permanent feature and a bug, right?
    Yes, whining when they lose is a permanent feature of Nvidia.

    Even if, as you claim, at least 2 tech sites independently verified the lack of such a bug, you're still basing your opinion on a gaming benchmark that's still in alpha, that for 4/5ths of its life was developed for AMD's API, on AMD hardware, and in partnership with AMD themselves. The fact that AMD only manages to best the equivalent card from their competitor by 10-20% after all that time tells us more about AMD (IMHO) than their competitor; after all, their competitor probably only started to pay attention to AoS when AMD abandoned Mantle 6 months ago, when the developers started porting their game over to DX12.
    Right, it couldn't just maybe be that Nvidia fabricated a terrible excuse for their awful performance in DX12, could it? Check the scores with and without MSAA. You see the common theme, right?

  11. #107
    Senior Member
    Join Date
    Dec 2013
    Posts
    3,526
    Thanks
    504
    Thanked
    468 times in 326 posts

    Re: AMD discrete GPU market share eroded to less than 20 per cent

    Quote Originally Posted by Jimbo75 View Post
    So why are they doing it now? You're all over the place on this because you are trying to convince yourself of facts that don't exist.
    Do you know nothing?
    1. AMD forced their hand with Mantle.
    2. Microsoft recently released a new console.

    Quote Originally Posted by Jimbo75 View Post
    Because contrary to your belief, creating a ground-breaking API like this isn't something that happens in a matter of months. Why doesn't Nvidia have Pascal ready for DX12? Why do they suck at VR when it's due in a couple of months time?
    No one is talking about months. AMD knew at least 5-6 years ahead of time that GCN was a highly parallel GPU, and then it was another 4 years after its release before they announced Mantle; that's almost a decade. They designed GCN anticipating that, because multi-core CPUs had started to become widely adopted, a GPU would perform better working in parallel. When Microsoft dragged its feet by not updating DirectX to work better with multi-core CPUs, AMD was forced to design their own API, and when Microsoft finally announced DX12 they abandoned it.
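    The multi-core argument here comes down to where draw-call submission work lands. A toy sketch of the idea (invented numbers, Python threads standing in for command-list recording; this is not real D3D or Mantle code):

    ```python
    from concurrent.futures import ThreadPoolExecutor

    # Hypothetical workload: each draw call costs one unit of CPU work
    # to validate and encode before it reaches the GPU.
    DRAW_CALLS = 10_000
    CORES = 4

    def record(calls):
        # Stand-in for recording a command list of `calls` draw calls.
        return calls

    # DX11-style: one thread submits everything, so the bottleneck core
    # shoulders the full draw-call count.
    serial_bottleneck = record(DRAW_CALLS)

    # Mantle/DX12-style: command lists are recorded on several threads,
    # so the busiest core only sees total/CORES worth of work.
    with ThreadPoolExecutor(max_workers=CORES) as pool:
        parallel_bottleneck = max(pool.map(record, [DRAW_CALLS // CORES] * CORES))

    print(serial_bottleneck, parallel_bottleneck)  # 10000 2500
    ```

    The per-core cost, not the total, is what gates frame rate on a CPU-limited scene, which is why spreading submission across cores matters.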

    Quote Originally Posted by Jimbo75 View Post
    So why wasn't Nvidia "forced" to make a new API? Where is theirs? Why did AMD tell devs to code on Mantle as preparation for DX12?
    Are you not even paying attention to what's been said in this thread, or are you being deliberately obtuse?

    Nvidia didn't need a new API because they designed their GPU to work best with a serial workload; they didn't need a new API because that's how DirectX has worked since its inception, and probably will for years to come, judging by how popular DX12 is with developers and how Microsoft has restricted its use to Windows 10.

    Quote Originally Posted by Jimbo75 View Post
    So now you're claiming that tech websites are photoshopping fake evidence? Have you actually had a look at the crap you're saying in your vehement defence of Nvidia's morals? How much memory on that 970?
    What website would this be? Because IIRC you haven't provided a source for that image; as far as I can tell it comes from some random Photobucket account, so it could be from anywhere. I'm also not defending Nvidia, I'm questioning the BS you keep claiming are facts. I'm questioning your beliefs, because that's what they are: nothing more than the beliefs of a fanboi who seems incapable of being objective.

    Quote Originally Posted by Jimbo75 View Post
    Because contrary to your ludicrous imaginings about software development, game development also takes a very long time.
    That depends on how many resources you place on the project. A game can take as little as a year, or many, many years, but even for a relatively small studio it can take 2-3 years, just like your much beloved Oxide Games, who took two years to reach alpha and will probably take another year or two before release.

    No matter how long it takes, though, why haven't more developers announced DX12 titles if it's been available to them for over a year, as you claim? Do you really think it takes over a year just to decide and announce that you're going to develop a game based on DX12, or to include DX12 in a game that you're already developing? I mean, it only took Oxide Games 6 months to port their title over to DX12.

    Quote Originally Posted by Jimbo75 View Post
    You can clearly see that Nvidia suffers more regressions than positives.
    On a gaming benchmark that's still in alpha, that for 4/5ths of its life was developed for AMD's API, on AMD hardware, and in partnership with AMD themselves.

    I'll tell you what: you spend your money on ifs, buts, maybes, hopes, and your misplaced beliefs; the rest of us will spend our money on REAL products, REAL results, and how the ecosystem ACTUALLY is when we buy our next graphics card, not how the ecosystem MAY pan out 5-10 years down the line.

  12. #108
    Senior Member
    Join Date
    Mar 2009
    Posts
    780
    Thanks
    30
    Thanked
    49 times in 38 posts

    Re: AMD discrete GPU market share eroded to less than 20 per cent

    Quote Originally Posted by Corky34 View Post
    Do you know nothing?
    1. AMD forced their hand with Mantle.
    2. Microsoft recently released a new console.
    Yes, built on AMD technology. How long do you think the development for that took?

    Nvidia didn't need a new API because they designed their GPU to work best with a serial workload, they didn't need a new API because that's how DirextX has worked since its inception, and probably will for years to come judging by how popular DX12 is with developers and how Microsoft have restricted its use to Windows 10.
    Well, according to what you just said, how many years will it be before Nvidia is DX12-ready? If GCN was 5-6 years in development, how long will it be before Nvidia gets a parallel arch suitable for DX12? 2018? Yet if you listen to them, they're clearly telling people that Pascal will be DX12. Christ, they've even convinced those who are really stupid that they have DX 12.1! How did they do that when they had NO IDEA about Mantle/DX12 until 2 years ago?

    You're just trying to spin a bunch of nonsense but like most of your ilk you tie yourself in knots and your argument gets crushed by logic.

    What website would this be? Because IIRC you haven't provided a source for that image; as far as I can tell it comes from some random Photobucket account, so it could be from anywhere. I'm also not defending Nvidia, I'm questioning the BS you keep claiming are facts. I'm questioning your beliefs, because that's what they are: nothing more than the beliefs of a fanboi who seems incapable of being objective.
    Really? You're so objective that you don't even see links, including when you've quoted them? Maybe look again? The website was clearly linked to with the graphics below. I even mentioned how bad my Czech was. Google Translate it, see what it says. Read Extremetech's article, see what they say about MSAA. Look at the benchmarks showing proof.

    But no, you'd much rather just stick to PcPer right? That completely objective website that just somehow seemed to get completely opposite results from the rest.

    On a gaming benchmark that's still in alpha, that for 4/5ths of its life was developed for AMD's API, on AMD hardware, and in partnership with AMD themselves.

    I'll tell you what: you spend your money on ifs, buts, maybes, hopes, and your misplaced beliefs; the rest of us will spend our money on REAL products, REAL results, and how the ecosystem ACTUALLY is when we buy our next graphics card, not how the ecosystem MAY pan out 5-10 years down the line.
    Oxide pointed it all out, did Nvidia disagree? http://www.oxidegames.com/2015/08/16...-of-a-new-api/

    How useful is the benchmark?

    It should not be considered that because the game is not yet publically out, it’s not a legitimate test. While there are still optimizations to be had, Ashes of the Singularity in its pre-beta stage is as – or more – optimized as most released games.
    No doubt you'll try to spin that as well. Just face it, Kepler and Maxwell aren't fit for DX12. Nvidia had plenty of time to optimise but the game is too complex for their lightweight, serial architecture. When Pascal is released both Kepler and Maxwell will be dropped like a brick.

    Oh and incidentally - Nvidia isn't saying they didn't get time to optimise. Nvidia isn't saying AMD are sponsors of Oxide. Nvidia stopped suggesting that their awful performance is down to the MSAA bug after their lie was rumbled by the press.

    All they are saying is that the benchmark isn't indicative of a typical DX12 game (in other words, we suck at this type of DX12 game). It's *you* who is making the rest of the excuses.
    Last edited by Jimbo75; 23-08-2015 at 09:13 PM.

  13. #109
    Senior Member
    Join Date
    Aug 2003
    Posts
    6,585
    Thanks
    0
    Thanked
    246 times in 208 posts

    Re: AMD discrete GPU market share eroded to less than 20 per cent

    Quote Originally Posted by kalniel View Post
    Yes, as always, new stuff is better than current stuff.
    Remember the first Pentium 4 launch?

    ---------------------------------------------------

    Aside from some of the hostilities I am sensing, this has been a fascinating thread. To be honest, the last time I spent this much time reading up on GPU tech was during the Fermi generation so there was quite a bit of catching up to do.

    Not sure if DirectX 12 is going to be any different, but in the past it usually took a while before developers made good use of new DirectX features, and by then the competition would have caught up. All things being equal, it's better to have a card that supports it than not, but otherwise other things would take precedence (cost, performance in older DX versions, etc.), unless of course there is that one game you absolutely want to play as soon as possible, in all its glory, that uses it. Or you don't upgrade often at all.

    As someone who nowadays only buys games in sales, and who, with the exception of Win7 (due to how much I despised Vista), rarely upgrades to the latest OS until at least SP1, support for a brand new DX is a tie-breaker rather than a high priority for me. YMMV etc.
    Last edited by TooNice; 24-08-2015 at 08:06 AM.

  14. Received thanks from:

    kalniel (24-08-2015)

  15. #110
    Senior Member
    Join Date
    Dec 2013
    Posts
    3,526
    Thanks
    504
    Thanked
    468 times in 326 posts

    Re: AMD discrete GPU market share eroded to less than 20 per cent

    Look Jimbo75, it's obvious to everyone that you're a rabid fanboi, and no matter how many REAL world reviews or benchmarks get put in front of you, you're going to come out with some BS to prove how evil and crap Nvidia are, how wonderful AMD are for saving us from the evil empire, how they apparently developed DirectX for Microsoft, and how if only everyone bought AMD the world would be a better place. And that's fine, you can believe whatever you like.

    In the meantime the rest of us will base whatever opinion we form on how things are in the REAL world: not some imaginary fairy tale, not some form of delusional state of mind, but REAL products, in REAL reviews, on REAL tests.

    TBH I have my suspicions that you're just on these forums astroturfing for either AMD or Oxide Games, as it seems you're incapable of reasoned thought. It's been fun reading how far down the rabbit hole your thought process descended, and it's kept me amused over the weekend. Fortunately it's back to work today so I've got more important things to do now. Good luck in your crusade, and I hope you seek medical help for your delusional state of mind. Peace.

  16. #111
    Senior Member
    Join Date
    Mar 2009
    Posts
    780
    Thanks
    30
    Thanked
    49 times in 38 posts

    Re: AMD discrete GPU market share eroded to less than 20 per cent

    Quote Originally Posted by Corky34 View Post
    Look Jimbo75, it's obvious to everyone that you're a rabid fanboi, and no matter how many REAL world reviews or benchmarks get put in front of you, you're going to come out with some BS to prove how evil and crap Nvidia are, how wonderful AMD are for saving us from the evil empire, how they apparently developed DirectX for Microsoft, and how if only everyone bought AMD the world would be a better place. And that's fine, you can believe whatever you like.
    Look Corky, it's obvious to everyone that you're a rabid fanboi who sees everybody else as the problem when it's clear that Nvidia is just up to their usual tricks. You've gone through laughable excuses, like accusing sites of photoshopping evidence (lol) and pretending that Nvidia had no time to optimise for DX12, even though they've been doing it daily for over a year and clearly spent plenty of time optimising for DX11.

    You've denied the proof of multiple graphs from the majority of sites proving that Nvidia's DX12 performs worse than their DX11. Instead you stuck to the same single website, PcPer, which most people realise was bought out by Nvidia a long time ago. At the same time you accuse WCCFTech of being a bad source.

    In the meantime the rest of us will base whatever opinion we form on how things are in the REAL world: not some imaginary fairy tale, not some form of delusional state of mind, but REAL products, in REAL reviews, on REAL tests.
    AotS is a real game, this is the point you keep trying to make us forget. The REAL results show cratering Nvidia DX12 performance in site after site. The REAL tech press found Nvidia's MSAA excuse to be a fabricated lie.

    TBH I have my suspicions that you're just on these forums astroturfing for either AMD or Oxide Games, as it seems you're incapable of reasoned thought. It's been fun reading how far down the rabbit hole your thought process descended, and it's kept me amused over the weekend. Fortunately it's back to work today so I've got more important things to do now. Good luck in your crusade, and I hope you seek medical help for your delusional state of mind. Peace.
    Done already? I was just getting warmed up as well. I have so many more things to teach you about getting a good DX12 and VR experience.

  17. #112
    Anthropomorphic Personification shaithis's Avatar
    Join Date
    Apr 2004
    Location
    The Last Aerie
    Posts
    10,857
    Thanks
    645
    Thanked
    872 times in 736 posts
    • shaithis's system
      • Motherboard:
      • Asus P8Z77 WS
      • CPU:
      • i7 3770k @ 4.5GHz
      • Memory:
      • 32GB HyperX 1866
      • Storage:
      • Lots!
      • Graphics card(s):
      • Sapphire Fury X
      • PSU:
      • Corsair HX850
      • Case:
      • Corsair 600T (White)
      • Operating System:
      • Windows 10 x64
      • Monitor(s):
      • 2 x Dell 3007
      • Internet:
      • Zen 80Mb Fibre

    Re: AMD discrete GPU market share eroded to less than 20 per cent

    Quote Originally Posted by Jimbo75 View Post
    Well, according to what you just said, how many years will it be before Nvidia is DX12-ready? If GCN was 5-6 years in development, how long will it be before Nvidia gets a parallel arch suitable for DX12? 2018? Yet if you listen to them, they're clearly telling people that Pascal will be DX12. Christ, they've even convinced those who are really stupid that they have DX 12.1! How did they do that when they had NO IDEA about Mantle/DX12 until 2 years ago?
    All that speculation, based on no evidence at all, and the funniest thing is that if we believe half of what you have said, you want us all to buy AMD cards because they didn't bother to optimise their drivers for DX11, which has been out for 6 YEARS.

    You might need a handkerchief for that foam around your mouth......I would hate that to get into your AMD hardware.
    Main PC: Asus Rampage IV Extreme / 3960X@4.5GHz / Antec H1200 Pro / 32GB DDR3-1866 Quad Channel / Sapphire Fury X / Areca 1680 / 850W EVGA SuperNOVA Gold 2 / Corsair 600T / 2x Dell 3007 / 4 x 250GB SSD + 2 x 80GB SSD / 4 x 1TB HDD (RAID 10) / Windows 10 Pro, Yosemite & Ubuntu
    HTPC: AsRock Z77 Pro 4 / 3770K@4.2GHz / 24GB / GTX 1080 / SST-LC20 / Antec TP-550 / Hisense 65k5510 4K TV / HTC Vive / 2 x 240GB SSD + 12TB HDD Space / Race Seat / Logitech G29 / Win 10 Pro
    HTPC2: Asus AM1I-A / 5150 / 4GB / Corsair Force 3 240GB / Silverstone SST-ML05B + ST30SF / Samsung UE60H6200 TV / Windows 10 Pro
    Spare/Loaner: Gigabyte EX58-UD5 / i950 / 12GB / HD7870 / Corsair 300R / Silverpower 700W modular
    NAS 1: HP N40L / 12GB ECC RAM / 2 x 3TB Arrays || NAS 2: Dell PowerEdge T110 II / 24GB ECC RAM / 2 x 3TB Hybrid arrays || Network:Buffalo WZR-1166DHP w/DD-WRT + HP ProCurve 1800-24G
    Laptop: Dell Precision 5510 Printer: HP CP1515n || Phone: Huawei P30 || Other: Samsung Galaxy Tab 4 Pro 10.1 CM14 / Playstation 4 + G29 + 2TB Hybrid drive

