
Thread: AMD discrete GPU market share eroded to less than 20 per cent

  1. #161
    Senior Member
    Join Date
    Dec 2013
    Posts
    3,526
    Thanks
    504
    Thanked
    468 times in 326 posts

    Re: AMD discrete GPU market share eroded to less than 20 per cent

    Quote Originally Posted by watercooled View Post
    But the rest of the thread is irrelevant - I was directly answering the recent posts you made in reply to mine, I haven't said a word about things I haven't read.
Maybe that's the problem here then.
All I see you doing is rehashing what Jimbo75 has already gone over.

  2. #162
    Senior Member watercooled's Avatar
    Join Date
    Jan 2009
    Posts
    11,478
    Thanks
    1,541
    Thanked
    1,029 times in 872 posts

    Re: AMD discrete GPU market share eroded to less than 20 per cent

    Well I'm really not just repeating what anyone else has said, I'm saying things how I see them having rejoined the tail end of the discussion. I see a lot of strange conspiracies and reasoning which makes little objective sense to me, so maybe that's where we agree?

  3. #163
    Senior Member
    Join Date
    Dec 2013
    Posts
    3,526
    Thanks
    504
    Thanked
    468 times in 326 posts

    Re: AMD discrete GPU market share eroded to less than 20 per cent

But you are repeating what's already been said; let's see if I can sum up what the last 11 pages seem to just keep ploughing up in one form or another.

A SINGLE game/benchmark that's still in alpha, that spent the majority of its life being developed on an API used by a single semiconductor company, that the developer said is lacking some optimisations in DX12 that would be present in the DX11 drivers, and that depends massively on the developer's efforts in optimising DX12 code for particular hardware, shows a big increase for the aforementioned semiconductor company when comparing DX11 to DX12, and a decrease in DX12 performance compared to DX11 for a semiconductor company that the developer has only spent 6-12 months optimising for.

Now it can be argued it's because of X, or Y, but in the end, with only a single game, isn't it a little early to draw any conclusions? Far from strange conspiracies and reasoning which makes little objective sense, if you'd spent the time it's taken to argue a point to instead read what's already been said, you would've seen that, far from strange conspiracies or biased conclusions being made, most people have been saying it's too early to call it either way, and that by the time it's known it will probably make little difference.

  4. #164
    Senior Member watercooled's Avatar
    Join Date
    Jan 2009
    Posts
    11,478
    Thanks
    1,541
    Thanked
    1,029 times in 872 posts

    Re: AMD discrete GPU market share eroded to less than 20 per cent

The strange conspiracies I'm referring to are the suggestions that it's somehow sabotaged to run worse on Nvidia hardware, and that well-known review sites would all go out of their way to make it run worse by enabling some option or other. It seems it started as pointing the blame at an MSAA bug - so despite the fact the developers have stated that that is misinformation, sites have still run it with MSAA disabled to prove otherwise and avoid being accused of gaming the results, which it seems has happened anyway.

Strange reasoning would include grouping all AA methods together; if it's not MSAA then they must have enabled some other AA to skew the results, which TBH completely ignores the fact that different AA methods are performed completely differently, so that makes no sense either. And TBH something is drastically wrong if any functioning AA implementation is destroying performance as much as those numbers show, especially popular post-fx methods like FXAA. If it were some option killing performance as much as that, wouldn't some site have discovered it by now, disabled it, and posted the results? It wouldn't be terribly hard to do and they'd probably attract a lot of traffic.
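A toy cost model (made-up numbers, purely illustrative - not real GPU timings) shows why the two families of AA can't be lumped together: MSAA's overhead scales with the sample count because depth/coverage is evaluated per sample, while a post-fx method like FXAA is a single extra full-screen pass with a roughly fixed cost.

```python
# Toy AA cost model - the numbers are invented and purely illustrative.
# MSAA: shading runs once per pixel, but depth/coverage work runs once per
# sample, so its overhead grows with the sample count.
# Post-fx AA (e.g. FXAA): one extra pass over the finished image, a roughly
# fixed cost regardless of scene geometry or sample count.

def msaa_frame_ms(shading_ms, per_sample_ms, samples):
    """Per-pixel shading plus per-sample depth/coverage work."""
    return shading_ms + per_sample_ms * samples

def postfx_frame_ms(no_aa_frame_ms, fullscreen_pass_ms):
    """The base frame plus one fixed-cost full-screen pass."""
    return no_aa_frame_ms + fullscreen_pass_ms

print(msaa_frame_ms(8.0, 0.5, 4))   # 10.0 - grows as sample count increases
print(msaa_frame_ms(8.0, 0.5, 8))   # 12.0
print(postfx_frame_ms(10.0, 1.5))   # 11.5 - same added cost either way
```

The point of the sketch: even a badly tuned post-fx pass only adds a bounded, fixed cost, so it simply can't produce the kind of collapse those numbers show.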

    Just to be clear, I have made no claims that this is some ultimate benchmark of DX11 vs DX12 or AMD vs Nvidia performance - just that many of the theories I've seen in this thread and elsewhere trying to downplay it are deeply flawed.

  5. #165
    Senior Member
    Join Date
    Dec 2013
    Posts
    3,526
    Thanks
    504
    Thanked
    468 times in 326 posts

    Re: AMD discrete GPU market share eroded to less than 20 per cent

Perhaps I'm missing something then, because I can't recall anyone saying it was deliberately made to run worse on Nvidia hardware, other than that the developer has probably spent less time optimising the game/benchmark for Nvidia hardware.

I also don't recall anyone saying review sites have enabled some option to make it run worse, although without knowing what the supposed MSAA bug involves, how would anyone know if they did or didn't? That's why, to rule out any possibility of bias (imho), ALL forms of AA should have been disabled.

The developer first said there was a bug in MSAA and then changed their mind and said there wasn't; under those circumstances I would say the developer is the one spreading misinformation. Without knowing either way, the only way to be sure (imho) would be to run the test with ALL forms of AA disabled. Yes, different AA methods are performed completely differently, but without knowing the details of what the supposed MSAA bug is, how would you know it's not affecting other forms of AA? The simple matter is you don't.

    The only reason people are trying to downplay it is because the fanbois like Jimbo75 HAVE taken this single alpha release of a game/benchmark as the ultimate benchmark of DX11 vs DX12 or AMD vs Nvidia performance by saying things like, and I quote "because AMD hardware (especially GCN) has the highest market share", "we only hear about Nvidia's involvement when they're complaining about something", "It's testament to the sad state of the current tech press when WCCFtech are making articles 100x better than Hexus.", that people "don't want to see evidence", that "GameWorks is designed to gimp GCN *and* Kepler" and that "every game tainted by it so far has been wrecked by it".

We have doozies like the fanbois saying Nvidia has "(under 1/3rd) in market share", that "faux tech "enthusiasts" should be embarrassed", and that the 80% of people that this very article says own discrete Nvidia GPUs are "sheeple being marketed into believing they made the informed choice".

And then we have outright falsehoods like "DX12 is Mantle rebadged" and "If not for AMD [we] wouldn't have DX12".

Then again, if you took the time to read this thread you would have seen the utter drivel being spouted for yourself, and how this alpha release of a game/benchmark has been touted as the ultimate benchmark of DX11 vs DX12 or AMD vs Nvidia performance; that's why, before starting this conversation with you, I said the following..

    Quote Originally Posted by Corky34 View Post
    To be fair I went over all this with Jimbo75 so you'll have to forgive me for not wanting to go over old ground, agree or disagree with what I've said that's up to you, at this point I'm past the point of caring, no offense intended, I just don't want to subject people to yet more arguments over whether a supposed bug in the MSAA code is, or is not, distorting the results some sites got, it just turns into a He said She said situation and the best way not to go down that road is to test without AA (IMO).

  6. #166
    Banhammer in peace PeterB kalniel's Avatar
    Join Date
    Aug 2005
    Posts
    31,025
    Thanks
    1,871
    Thanked
    3,383 times in 2,720 posts
    • kalniel's system
      • Motherboard:
      • Gigabyte Z390 Aorus Ultra
      • CPU:
      • Intel i9 9900k
      • Memory:
      • 32GB DDR4 3200 CL16
      • Storage:
      • 1TB Samsung 970Evo+ NVMe
      • Graphics card(s):
      • nVidia GTX 1060 6GB
      • PSU:
      • Seasonic 600W
      • Case:
      • Cooler Master HAF 912
      • Operating System:
      • Win 10 Pro x64
      • Monitor(s):
      • Dell S2721DGF
      • Internet:
      • rubbish

    Re: AMD discrete GPU market share eroded to less than 20 per cent

    You don't seem to be past the point of caring, just saying

  7. #167
    Senior Member
    Join Date
    Dec 2013
    Posts
    3,526
    Thanks
    504
    Thanked
    468 times in 326 posts

    Re: AMD discrete GPU market share eroded to less than 20 per cent

True, I guess I just expected someone who says they came into the debate fresh, without having read most of the thread, not to go around accusing people of clutching at straws, or saying that people are coming up with strange conspiracies and reasoning which makes little objective sense.

  8. #168
    Senior Member
    Join Date
    Mar 2010
    Posts
    2,567
    Thanks
    39
    Thanked
    179 times in 134 posts

    Re: AMD discrete GPU market share eroded to less than 20 per cent

corky - a reviewer will mention if they are using AA - so when they say `we turned MSAA OFF`, with no `and replaced it with`, it means they are not using ANYTHING ELSE.

  9. #169
    Senior Member
    Join Date
    Aug 2003
    Posts
    6,585
    Thanks
    0
    Thanked
    246 times in 208 posts

    Re: AMD discrete GPU market share eroded to less than 20 per cent

    I have to say that I agree with watercooled and HalloweenJack on that point. Disabling a feature doesn't imply switching it for another.

    Quote Originally Posted by crossy View Post
    [...]You've pretty much nailed it - for all our loudness here, we're only a tiny part of the "buying public" and the overwhelming majority will just take what the manufacturers give them. And lets not kid ourselves AMD have done pretty well with their console offerings and also the very low end APU stuff. Problem is that NVidia have pretty much solid mindshare on the higher end bundled offerings - take a trip to Dell, HP, Lenovo, etc and you'll see that their high end desktops and laptops (those with discrete GPUs) are pretty much exclusively an Intel+NVidia combo.

    [...]

    AMD products are pretty price competitive (always have been) but that's no help if they're seen as inefficient, hard to live with (noisy), low end, poorly supported (drivers and apps) or based on "last years tech". As seems usual with big companies these days the "bean counters" have utterly failed to appreciate that their R&D department is an "asset" not an "overhead". They've culled the "geeks" and now the cupboard is bare.
Before making my last post, I actually decided to take a brief look at the Dell, HP and Lenovo sites. I am actually surprised by the range of products they have, and frankly could not be bothered clicking through all of them. Besides, I don't know which products sell most. I am sure it isn't going to be the high-end/gaming products, but I am less sure whether people tend to go for the cheaper ones or the mid-range ones.

At a glance, there seem to be a lot more nVidia graphics cards, though it is not quite as bad as trying to find an AMD CPU in the past. For instance, in the Alienware range they have a bunch of nVidia options to choose from, and one AMD card. Still, I am surprised to see that occasionally the AMD card is used for the higher-end system of the two. For instance, in the US (but not the UK), the Dell XPS 8700 desktop comes in two flavours: the standard version comes with an nVidia card, but the more expensive "Special Edition" uses the AMD card. And there is another similar example in the HP Envy line (though in that case they didn't call it "Special Edition".. it is just the more expensive one).

I am guessing that the general public, who make up the bulk of the buyers, don't actually have an opinion between AMD and nVidia though. Those who might be a little curious might ask their nerdy friends (that would be us). Most will probably buy the one that fits closest to their budget. My folks used to ask my opinion when they wanted a new laptop, but in recent years they just go with a brand (system, not component) they trust and pay what they feel is reasonable. More often than not it'll be onboard graphics, but every now and then it'll have a low-end nVidia (nVidia pretty much owns laptops).
    Last edited by TooNice; 30-08-2015 at 10:07 AM.

  10. #170
    Senior Member
    Join Date
    Mar 2009
    Posts
    780
    Thanks
    30
    Thanked
    49 times in 38 posts

    Re: AMD discrete GPU market share eroded to less than 20 per cent

    Let's just put all of Corky's nonsense to bed with one post.

    http://www.overclock.net/t/1569897/v...#post_24356995

    Kollock, Oxide: I could see how one might see that we are working closer with one hardware vendor then the other, but the numbers don't really bare that out. Since we've started, I think we've had about 3 site visits from NVidia, 3 from AMD, and 2 from Intel ( and 0 from Microsoft, but they never come visit anyone ;(). Nvidia was actually a far more active collaborator over the summer then AMD was, If you judged from email traffic and code-checkins, you'd draw the conclusion we were working closer with Nvidia rather than AMD
    Kollock, Oxide: Personally, I think one could just as easily make the claim that we were biased toward Nvidia as the only 'vendor' specific code is for Nvidia where we had to shutdown async compute. By vendor specific, I mean a case where we look at the Vendor ID and make changes to our rendering path. Curiously, their driver reported this feature was functional but attempting to use it was an unmitigated disaster in terms of performance and conformance so we shut it down on their hardware. As far as I know, Maxwell doesn't really have Async Compute so I don't know why their driver was trying to expose that. The only other thing that is different between them is that Nvidia does fall into Tier 2 class binding hardware instead of Tier 3 like AMD which requires a little bit more CPU overhead in D3D12, but I don't think it ended up being very significant.
    Kollock, Oxide: The other surprise is that of the min frame times having the 290X beat out the 980 Ti (as reported on Ars Techinica). Unlike DX11, minimum frame times are mostly an application controlled feature so I was expecting it to be close to identical. This would appear to be GPU side variance, rather then software variance.
    Kollock, Oxide: I suspect that one thing that is helping AMD on GPU performance is D3D12 exposes Async Compute, which D3D11 did not. Ashes uses a modest amount of it, which gave us a noticeable perf improvement. It was mostly opportunistic where we just took a few compute tasks we were already doing and made them asynchronous, Ashes really isn't a poster-child for advanced GCN features.
    Kollock, Oxide: Our use of Async Compute, however, pales with comparisons to some of the things which the console guys are starting to do. Most of those haven't made their way to the PC yet, but I've heard of developers getting 30% GPU performance by using Async Compute. Too early to tell, of course, but it could end being pretty disruptive in a year or so as these GCN built and optimized engines start coming to the PC.
    Kollock, Oxide: In the end, I think everyone has to give AMD alot of credit for not objecting to our collaborative effort with Nvidia even though the game had a marketing deal with them. They never once complained about it, and it certainly would have been within their right to do so.
    Kollock, Oxide:--
    P.S. There is no war of words between us and Nvidia. Nvidia made some incorrect statements, and at this point they will not dispute our position if you ask their PR. That is, they are not disputing anything in our blog. I believe the initial confusion was because Nvidia PR was putting pressure on us to disable certain settings in the benchmark, when we refused, I think they took it a little too personally.
You've just been completely and utterly destroyed, Corky. Everything - literally everything - you attempted to spin about this has been proven a lie. Everything I told you has been true. Nvidia, when they lose, first of all try to cheat; when that fails they resort to lies instead.

    You bought a graphics card that is poorly equipped for DX12, from a company that treats you and everybody else like crap. Just accept your error and move on.

  11. #171
    Senior Member
    Join Date
    Dec 2013
    Posts
    3,526
    Thanks
    504
    Thanked
    468 times in 326 posts

    Re: AMD discrete GPU market share eroded to less than 20 per cent

Still peddling the BS that a single game/benchmark is the ultimate benchmark of DX11 vs DX12 or AMD vs Nvidia performance then, Jimbo75.

You or anyone else can pick apart anything or everything I've said, but ultimately everything I've said was with the intention of showing you how flawed your thinking was, and how you had lost all objectivity because your hate for Nvidia and love of AMD had clouded your judgement; I stand by that no matter what you, or a representative from a company that was chosen to be the poster child for the now defunct Mantle, say.

  12. #172
    Senior Member
    Join Date
    Mar 2009
    Posts
    780
    Thanks
    30
    Thanked
    49 times in 38 posts

    Re: AMD discrete GPU market share eroded to less than 20 per cent

    Quote Originally Posted by Corky34 View Post
Still peddling the BS that a single game/benchmark is the ultimate benchmark of DX11 vs DX12 or AMD vs Nvidia performance then, Jimbo75.

    You or anyone else can pick apart anything or everything I've said but ultimately everything I've said was with the intention of showing you how flawed your thinking was, how you had lost all objectivity because your hate for Nvidia and love of AMD had clouded your judgment, and I stand by that no matter what you or a representative from a company that was chosen to be the poster child for the now defunct Mantle say.
    It's not the ultimate benchmark, as the Oxide guy quite clearly says.

    You can expect many more DX12 titles with 30%+ gains on AMD hardware, that's the take-home message.

  13. #173
    Senior Member
    Join Date
    Mar 2009
    Posts
    780
    Thanks
    30
    Thanked
    49 times in 38 posts

    Re: AMD discrete GPU market share eroded to less than 20 per cent

    https://www.reddit.com/r/AdvancedMic...g_dx12/cuklm4j

    nVidia: bruh, just disable stuff on your demo so we come on top and we will make it worthwhile?

    Oxide: wat? No ... hell no wtf is wrong with you

    AMD: Get rekt n00b

    nVidia PR: "This demo has bugs, it's not representing correct figures".

  14. #174
    Senior Member
    Join Date
    Dec 2013
    Posts
    3,526
    Thanks
    504
    Thanked
    468 times in 326 posts

    Re: AMD discrete GPU market share eroded to less than 20 per cent

    Quote Originally Posted by Jimbo75 View Post
    You can expect many more DX12 titles with 30%+ gains on AMD hardware, that's the take-home message.
If you read that without your rose-tinted glasses you would see that's not what he says at all. What he says is that console guys are getting 30% GPU performance by using Async Compute, something that, contrary to what the Oxide representative seems to think, is supported by Maxwell, and this PDF (page 31) explains why they got such a performance hit when using it.
    All our GPUs for the last several years do context switches at draw call boundaries. So when the GPU wants to switch contexts, it has to wait for the current draw call to finish first.
Basically, Kepler and Maxwell have one pipeline which can handle a lot of compute queues or one graphics queue, but they can't do both at the same time without a performance penalty. Such a design works fine for DX11 because, prior to DX12, there was no way for rendering to occur simultaneously with compute, so there was no need for parallel pipelines/engines like AMD built into GCN.
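As a rough sketch (a toy scheduling model with made-up durations, not real driver or GPU behaviour), here's the difference between one serialised pipeline, which has to finish its graphics work before switching to compute, and two independent queues that overlap graphics and compute the way DX12 async compute allows on hardware with parallel engines:

```python
# Toy scheduling model - durations are invented, purely illustrative.
# Serial pipeline (Kepler/Maxwell-style in this sketch): context switches
# happen at draw-call boundaries, so compute work queues up behind graphics.
# Parallel queues (GCN-style async compute): graphics and compute streams
# run concurrently, so total time is bounded by the longer stream.

def serial_ms(graphics_tasks, compute_tasks):
    """One pipeline: compute must wait for all graphics work to finish."""
    return sum(graphics_tasks) + sum(compute_tasks)

def async_ms(graphics_tasks, compute_tasks):
    """Two independent queues: the streams overlap completely (ideal case)."""
    return max(sum(graphics_tasks), sum(compute_tasks))

gfx = [4.0, 3.0, 5.0]   # hypothetical draw-call durations, ms
comp = [2.0, 2.0]       # hypothetical compute-job durations, ms
print(serial_ms(gfx, comp))  # 16.0
print(async_ms(gfx, comp))   # 12.0 - the compute work hides behind graphics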

TBH I would've expected a developer that's working with an API that gives them greater control of how the hardware deals with their code to have known this; it just goes to show how little (imho) Oxide and yourself understand about the subject you're discussing. Rather than being objective, you and Oxide prefer to be sensationalist.

  15. #175
    root Member DanceswithUnix's Avatar
    Join Date
    Jan 2006
    Location
    In the middle of a core dump
    Posts
    12,986
    Thanks
    781
    Thanked
    1,588 times in 1,343 posts
    • DanceswithUnix's system
      • Motherboard:
      • Asus X470-PRO
      • CPU:
      • 5900X
      • Memory:
      • 32GB 3200MHz ECC
      • Storage:
      • 2TB Linux, 2TB Games (Win 10)
      • Graphics card(s):
      • Asus Strix RX Vega 56
      • PSU:
      • 650W Corsair TX
      • Case:
      • Antec 300
      • Operating System:
      • Fedora 39 + Win 10 Pro 64 (yuk)
      • Monitor(s):
      • Benq XL2730Z 1440p + Iiyama 27" 1440p
      • Internet:
      • Zen 900Mb/900Mb (CityFibre FttP)

    Re: AMD discrete GPU market share eroded to less than 20 per cent

    Quote Originally Posted by Corky34 View Post
    Basically Kepler & Maxwell has 1 pipeline which can handle a lot of compute or 1 graphics queues, but it can't do them at the same time without a performance penalty, such a design works fine for DX11 because prior to DX12 there was no way for rendering to occur simultaneously with compute, so there was no need for parallel pipeline/engines like AMD did with GCN.
Wouldn't PhysX want to do compute alongside other rendering tasks?

    Ideally at least, it obviously doesn't have to.

  16. #176
    Senior Member
    Join Date
    Mar 2010
    Posts
    2,567
    Thanks
    39
    Thanked
    179 times in 134 posts

    Re: AMD discrete GPU market share eroded to less than 20 per cent

    Corky - are you in the industry?

