Page 4 of 5 FirstFirst 12345 LastLast
Results 49 to 64 of 79

Thread: AMD announces Radeon VII graphics: Zen 2 on track

  1. #49
    Senior Member spacein_vader's Avatar
    Join Date
    Sep 2014
    Location
    Darkest Northamptonshire
    Posts
    1,330
    Thanks
    69
    Thanked
    338 times in 218 posts
    • spacein_vader's system
      • Motherboard:
      • Asus B85M-G
      • CPU:
      • i5 4460 3.2GHz
      • Memory:
      • 4x4GB Crucial DDR3 1600
      • Storage:
      • 128GB SSD, 256GB SSD
      • Graphics card(s):
      • Asus RX-480 Dual OC 4GB
      • PSU:
      • Corsair HX 520W modular
      • Case:
      • Antec Mini P180
      • Operating System:
      • Windows 10 Pro
      • Monitor(s):
      • BenQ GW2765, Dell Ultrasharp U2412
      • Internet:
      • Origin Fibre Max

    Re: AMD announces Radeon VII graphics: Zen 2 on track

    A few comments on here about the potential abilities of DLSS at 1080p or 1440p, I thought DLSS was only going to be available at 4k?

  2. #50
    Senior Member
    Join Date
    Sep 2016
    Location
    Merseyside
    Posts
    471
    Thanks
    17
    Thanked
    33 times in 27 posts
    • EvilCycle's system
      • Motherboard:
      • ASUS ROG MAXIMUS IX HERO
      • CPU:
      • Intel I7 7700K (OC to 4.8GHz on Corsair H100i V2)
      • Memory:
      • 16GB Corsair Vengeance DDR4 @ 3000MHz
      • Storage:
      • Samsung 840 Evo 120GB SSD + 3 x 500GB 7200rpm HDDs
      • Graphics card(s):
      • Gigabyte AORUS GeForce RTX 2080 XTREME
      • PSU:
      • DEEPCOOL DQ 750st
      • Case:
      • Corsair Obsidian Series 750D Airflow Edition
      • Operating System:
      • Windows 10
      • Internet:
      • Virgin Media 380Mb (Fibre Optic)

    Re: AMD announces Radeon VII graphics: Zen 2 on track

    Quote Originally Posted by spacein_vader View Post
    A few comments on here about the potential abilities of DLSS at 1080p or 1440p, I thought DLSS was only going to be available at 4k?
    It is up to the game developers as to which resolutions DLSS is trained towards (they pay Nvidia to run the game through their deep learning machine). I have a feeling FF was 4K-only because development was halted before it was finished. Apparently BFV will allow DLSS at 1080, 1440 and 4K, but that will be confirmed when we get the patch. My thinking is that DLSS is what is going to save the 2060 from being a useless card for ray tracing, and may allow it to run a 4K monitor without DXR.
    Last edited by EvilCycle; 10-01-2019 at 05:21 PM.

  3. #51
    root Member DanceswithUnix's Avatar
    Join Date
    Jan 2006
    Location
    In the middle of a core dump
    Posts
    10,102
    Thanks
    501
    Thanked
    1,041 times in 885 posts
    • DanceswithUnix's system
      • Motherboard:
      • Asus X470-PRO
      • CPU:
      • 3700X
      • Memory:
      • 16GB 3200MHz
      • Storage:
      • 1TB Linux, 1TB Games (Win 10)
      • Graphics card(s):
      • Asus Strix RX Vega 56
      • PSU:
      • 650W Corsair TX
      • Case:
      • Antec 300
      • Operating System:
      • Fedora 30 + Win 10 Pro 64 (yuk)
      • Monitor(s):
      • Benq XL2730Z 1440p + Samsung 2343BW 2048x1152
      • Internet:
      • Zen 80Mb/20Mb VDSL

    Re: AMD announces Radeon VII graphics: Zen 2 on track

    Quote Originally Posted by Spud1 View Post
    OK, as you asked...
    I'm for one finding it interesting to hear your opinions, given you seem to be the target market

    Quote Originally Posted by Spud1 View Post
    it's hard to get developers engaged in building support for a new feature (in gaming) without the hardware out in the wild.
    That's always been the way though. DX10 required people buying Vista, and DX9 needed people to buy an FX 5800, when people didn't like either. But usually those products were seen as necessary stepping stones to what was obviously the future, whereas people don't seem convinced that the 2080 feature set is where the future lies, or, like the 5800, that it is worth waiting for the next gen to come out.

    Quote Originally Posted by spacein_vader View Post
    A few comments on here about the potential abilities of DLSS at 1080p or 1440p, I thought DLSS was only going to be available at 4k?
    It's a clever technique, but as someone who often goes back and plays older games I have to wonder if it will trip itself up as driver and DirectX changes alter the rendering and make the learnt renderings of the game less useful and more likely to artefact. Time will tell if it works at any resolution for the likes of me.

  4. #52
    Theoretical Element Spud1's Avatar
    Join Date
    Jul 2003
    Location
    North West
    Posts
    6,955
    Thanks
    249
    Thanked
    252 times in 197 posts
    • Spud1's system
      • Motherboard:
      • Mac Pro
      • CPU:
      • 2x 2.8ghz Quad Core Xeons (octo-core)
      • Memory:
      • 4gb DDR2 FB-Dimm
      • Storage:
      • 1x1TB, 1x320gb, 2x500gb, 1x250gb, 120GB SSD
      • Graphics card(s):
      • Nvidia Geforce 560Ti
      • PSU:
      • Mac pro PSU
      • Case:
      • Mac Pro Case
      • Operating System:
      • Windows 8
      • Monitor(s):
      • 1x22" LG 3D TFT 1x 19" ViewSonic
      • Internet:
      • 80mb BT Infinity

    Re: AMD announces Radeon VII graphics: Zen 2 on track

    Quote Originally Posted by DanceswithUnix View Post
    That's always been the way though. DX10 required people buying Vista, and DX9 needed people to buy an FX 5800, when people didn't like either. But usually those products were seen as necessary stepping stones to what was obviously the future, whereas people don't seem convinced that the 2080 feature set is where the future lies, or, like the 5800, that it is worth waiting for the next gen to come out.
    Very true - I have found raytracing really impressive, but it's one of those things that I didn't really appreciate was "missing" until I tried it out. I had never realised, for example, that game engines only reflect what the player sees on screen at any one point (planar reflections excluded), and that things off-screen were just not reflected. Now that I've seen off-screen reflections working in BFV, though, I notice them missing all over the place in other games. Really looking forward to the implementation in Tomb Raider, for example, if they can pull it off in a sensible way. Off-screen reflections are only one part of the magic of course, but it's something that has stood out for me... and how much better it is than the old planar trick.

    The initial badly optimised release of BFV's raytracing didn't really help either (where you *did* see a 50% hit to FPS), even if no one should really have been surprised at an EA game being poorly optimised at launch.


    edit: I am aware that there are many tricks that can be used to simulate off-screen reflections, but most produce poor results or are very expensive to use... and are very different to the results that raytracing can produce.
    Last edited by Spud1; 10-01-2019 at 06:14 PM. Reason: off screen reflections clarification

  5. #53
    Registered+
    Join Date
    Mar 2014
    Location
    Helion Prime
    Posts
    66
    Thanks
    2
    Thanked
    3 times in 3 posts
    • fend_oblivion's system
      • Motherboard:
      • ASUS M5A78L-M/USB3
      • CPU:
      • AMD FX 4100 3.6 GHz Black Edition
      • Memory:
      • G.Skill RipJawsX 8 GB DDR3 @ 1600 MHz
      • Storage:
      • Western Digital Blue 1 TB
      • PSU:
      • Seasonic S1211 520 W
      • Case:
      • Cooler Master K380
      • Operating System:
      • Windows 7 Ultimate Edition 64-bit
      • Monitor(s):
      • Acer P166HQL

    Re: AMD announces Radeon VII graphics: Zen 2 on track

    Wish they'd do something about the power efficiency.

  6. #54
    Senior Member
    Join Date
    May 2009
    Location
    Where you are not
    Posts
    655
    Thanks
    258
    Thanked
    53 times in 42 posts
    • Iota's system
      • Motherboard:
      • Asus Maximus Hero XI
      • CPU:
      • Intel Core i7 9700K
      • Memory:
      • CMD32GX4M2C3200C16
      • Storage:
      • 1 x 250GB / 1 x 1TB Samsung 970 Evo Plus NVMe
      • Graphics card(s):
      • Nvidia RTX 2080 FE
      • PSU:
      • Corsair HXi 850
      • Case:
      • Lian Li PC-X500B
      • Operating System:
      • Windows 10 Pro 64-bit
      • Monitor(s):
      • Dell S2716DG
      • Internet:
      • 40Mbps SKY Fibre

    Re: AMD announces Radeon VII graphics: Zen 2 on track

    Quote Originally Posted by Spud1 View Post
    It seems an odd choice as surely by chopping the memory down to, say, 11GB, they would save enough to be able to retail this for £450-£500, at which point it makes sense and becomes a good option for many.
    HBM2 packages are either 8GB or 4GB. I'm not sure you could mix and match the packages to get 12GB and also doubt it would be cost effective to do so, when you would probably get much better pricing buying the 4GB / 8GB HBM2 packages in bulk. It certainly wouldn't lower the costs by £200, and simply going with 8GB defeats the point of pushing the card as a 4K capable contender when it's likely true 4K textures would probably eat into available memory quite quickly.

    Isn't that why Nvidia went with an 11GB configuration? Purely so the card would be able to cope better at 4K?

    Edit: As mentioned, it looks like they've gone with 4 x 4GB HBM2 stacks for the 1TB/s of bandwidth; otherwise it would be ~600GB/s with 2 x 8GB stacks.
    Last edited by Iota; 10-01-2019 at 08:27 PM.
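    The stack arithmetic above can be made explicit. A minimal sketch, assuming HBM2's 1024-bit interface per stack; the per-pin rates of 2.0 and 2.4 Gb/s are illustrative assumptions, not quoted specs:

```python
# Aggregate HBM2 bandwidth: stacks x 1024-bit interface x per-pin rate / 8 bits.
def hbm2_bandwidth_gbs(stacks: int, gbps_per_pin: float) -> float:
    """Rough aggregate bandwidth in GB/s for a given number of HBM2 stacks."""
    return stacks * 1024 * gbps_per_pin / 8

# Radeon VII-style 4 x 4GB stacks (4096-bit bus): ~1 TB/s
print(hbm2_bandwidth_gbs(4, 2.0))  # 1024.0
# A hypothetical 2 x 8GB layout (2048-bit bus) at a faster assumed pin rate
print(hbm2_bandwidth_gbs(2, 2.4))  # 614.4
```

    Halving the stack count halves the bus width, which is why a 2 x 8GB layout lands around the ~600GB/s figure even with faster pins.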

  7. Received thanks from:

    Tabbykatze (11-01-2019)

  8. #55
    Registered User
    Join Date
    Jan 2017
    Posts
    2
    Thanks
    0
    Thanked
    0 times in 0 posts

    Re: AMD announces Radeon VII graphics: Zen 2 on track

    The Radeon VII looks nice, but I was hoping it would be around the price of the RTX 2070. I guess having 8GB of HBM2 would lower the performance too much.

  9. #56
    root Member DanceswithUnix's Avatar
    Join Date
    Jan 2006
    Location
    In the middle of a core dump
    Posts
    10,102
    Thanks
    501
    Thanked
    1,041 times in 885 posts
    • DanceswithUnix's system
      • Motherboard:
      • Asus X470-PRO
      • CPU:
      • 3700X
      • Memory:
      • 16GB 3200MHz
      • Storage:
      • 1TB Linux, 1TB Games (Win 10)
      • Graphics card(s):
      • Asus Strix RX Vega 56
      • PSU:
      • 650W Corsair TX
      • Case:
      • Antec 300
      • Operating System:
      • Fedora 30 + Win 10 Pro 64 (yuk)
      • Monitor(s):
      • Benq XL2730Z 1440p + Samsung 2343BW 2048x1152
      • Internet:
      • Zen 80Mb/20Mb VDSL

    Re: AMD announces Radeon VII graphics: Zen 2 on track

    Quote Originally Posted by Iota View Post
    Isn't that why Nvidia went with an 11GB configuration? Purely so the card would be able to cope better at 4K?
    It's a marketing decision and nothing more. The technical answer would be to populate all 12 memory channels, but that gives you 12GB, which is the same as the Titan, so they won't want that. You could fill 4 channels with 1GB chips and 8 channels with 0.5GB chips, giving a total of 8GB, but that isn't very special, so it probably limits the selling price regardless of what the performance would be like. So 11GB makes it not a Titan, but gives owners some e-peen over upper-mid-range 8GB card owners. Kind of makes sense.

    As for HBM2 packages, AIUI they come as 1GB dies, so the current AMD range uses parts with a 4-die stack to give 4GB, but parts with only 2 dies stacked are available. However, it seems that HBM2 is no longer the exotic and rare thing it once was, so why skimp on it and produce something where people can look at tables of numbers and say "That only has as much RAM as an old 1080".
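    The channel-population arithmetic in the post can be sketched as follows; the configurations are illustrative, following the post's reasoning rather than any confirmed board layout:

```python
# Capacity is just the sum of chip sizes across populated memory channels.
def capacity_gb(chips):
    return sum(chips)

titan_style = [1.0] * 12             # all 12 channels with 1GB chips -> 12GB
ti_style    = [1.0] * 11             # 11 of 12 channels populated    -> 11GB
mixed_8gb   = [1.0] * 4 + [0.5] * 8  # the 4x1GB + 8x0.5GB mix from the post

print(capacity_gb(titan_style), capacity_gb(ti_style), capacity_gb(mixed_8gb))
# 12.0 11.0 8.0
```

    Dropping one channel from the full 12 is what produces the familiar 11GB figure without matching the Titan.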

  10. #57
    Senior Member
    Join Date
    May 2014
    Posts
    1,459
    Thanks
    89
    Thanked
    196 times in 140 posts

    Re: AMD announces Radeon VII graphics: Zen 2 on track

    Quote Originally Posted by Spud1 View Post
    OK, as you asked...
    Thank you for taking the time to answer my points.

    Quote Originally Posted by Spud1 View Post
    I think that is an opinion - it's hard to get developers engaged in building support for a new feature (in gaming) without the hardware out in the wild. Yes it may have been released 6 months before partners were truly ready (and maybe even Nvidia, given the early hardware issues with the 2080ti line) but I don't think that's strictly relevant to how the cards actually perform.
    The problem is release mentality and what people see at release. If they see no games, no support and difficulties, then they form a negative opinion of the launch. The cards performed better than the 10xx series at launch, which everyone was happy with, but buyers were told by Nvidia to wait and see whether the near-double price was worth the investment for features that couldn't be tested or quantified. That is how human mentality works, sadly.

    Quote Originally Posted by Spud1 View Post
    True enough, but to gamers does it really matter? When people are looking for raw performance and graphical fidelity, which I would suggest that most are, then where the technology has come from isn't of interest.
    Be that as it may, there are other instances where public opinion is soured when someone takes a feature that already exists and calls it "new". Take Apple for instance: so many of their "features" are not new and were implemented by competitors a long time ago, so it is a sticking point. What Nvidia did was make a "new" feature from old tech and hope nobody would point that out. The two features they headlined were ASIC-based light and reflections and checkerboard supersampling. Both are great features when done well, but when someone tells you they are a god and then you watch them bleed (reducing ray counts in BFV to fix perf issues), people start to question whether they made the right choice, and those on the fence fall off it.

    Quote Originally Posted by Spud1 View Post
    OK - but to consumers it feels that way - they are being asked to pay the same price for Vega 7 as they are for a card that offers equivalent performance and more features - with no discernible benefit for taking the AMD option. 16GB of memory isn't of practical use to the vast majority of gamers, whereas Raytracing and DLSS both are.
    Are ray tracing and DLSS a discernible benefit? Is it a quantifiable benefit right now? We have two games that show it off, with a promise of more, but only if developers go ahead with it and lock themselves into the Nvidia ecosystem. Of those two games, BFV has been struggling and implements only one of the features (downgraded), and Final Fantasy's implementation of DLSS is only for the 4K elites (sucks to be you, 2070 and 2080 users). AMD has far more benefits in its DX12 and Vulkan implementations than Nvidia, and the Vega series of GPUs happily smashes DX12 workloads. Is that not a benefit?

    Quote Originally Posted by Spud1 View Post
    That all depends on whether you value raytracing/DLSS or not. Personally I would not have bought an RTX 2080 if raytracing had not been a thing - the cards are stupidly expensive without the added benefit that brings (maybe even with it), and I would still be running an older-generation card for a long time to come if they had not included it. That's what I don't get about this - AMD are not bringing anything to the table to justify the price tag they have on these cards. Putting aside any preference between the two companies and looking at this logically in today's market, I cannot see why you would take the AMD option at the moment unless you specifically needed that sort of memory... which the vast majority of people, even those gaming at 4K, don't need.

    I look at things similarly in the CPU market. If I needed a powerful CPU with lots of cores/threads, then the only sensible option is a Ryzen CPU... Intel don't compete there at the moment. If you want single-thread performance however, then Intel tends to win out in most situations, albeit often at a higher price.

    If you want to buy one, or think they are the better option, then that's fine... each to their own. I can only offer my opinion, and am also pointing out again the obvious bias towards AMD that exists on this forum and has done for many years.
    I don't disagree that the price tag is quite high - I have made my opinion on that known; I think they should have targeted the £550-600 price range - but unfortunately it is a great set of technologies that are expensive to produce. The Vega series is not a bad series of GPUs if we omit the toastiness for a moment; they perform as expected, consistently and happily, in their applications.

    If justifying a price tag requires brand-new headline features, then no one should have been buying a new GPU for the past 5 years, as there has been no real headline feature development. Ray tracing is a tiny element of a game, and I find it quite odd, because games like DayZ on low settings have amazing lighting effects, same with Arma 3, but they don't need specialised hardware to do it. So ray tracing cores (as in, a rushed implementation of tensor cores specialised for one thing) mean nothing to me. Frankly, the whole tensor implementation should have just been one massive tensor system which can be dynamically used for anything (including ray tracing).

    Until benchmarks are released, we don't know the memory utilisation of the Radeon VII, but what we do know is that if a game has 4K texture implementation, it can stress even the 11GB of a 1080 Ti/2080 Ti, so 16GB is an operable figure.

    You say bias; others say Nvidia are fleecing their customers into the ground; others don't care except for what gives the best performance regardless of price. Me, I'm a generalised-compute kind of guy: if I have to spend big bucks for an ASIC to get a small increase then I won't do it. So AMD equipment suits me better.

  11. #58
    Senior Member
    Join Date
    Jun 2013
    Location
    MOMBASA
    Posts
    882
    Thanks
    1
    Thanked
    23 times in 21 posts

    Re: AMD announces Radeon VII graphics: Zen 2 on track

    I think sole-purpose PC gaming is shrinking thanks to modern phones, tablets and consoles. Back in the day, to experience gaming on a different platform than a console/arcade (PS1 etc.) you just had to get a PC, but nowadays you can get good 3D-quality graphics on a phone/tablet. Parents who buy PCs for gaming are extremely few; they would rather save for a console, coz consoles are cheaper and easier to manage: no updates, easy to use, etc. AMD has seen the future, and that's why they don't rush to PC gaming but focus on the mid-to-low-end range, coz that's where the money is. Personally I won't buy any card for gaming purposes above $300, coz I still think gaming graphics are too 'artificial' and not lifelike. The hype train on technology is good, coz why not? The cost of high-end tech does trickle down over the years and does help with innovation.

  12. #59
    Hooning about Hoonigan's Avatar
    Join Date
    Sep 2011
    Posts
    2,039
    Thanks
    138
    Thanked
    372 times in 262 posts
    • Hoonigan's system
      • Motherboard:
      • GIGABYTE X570 AORUS MASTER
      • CPU:
      • AMD Ryzen 7 3700X
      • Memory:
      • 32GB Corsair Dominator Platinum RGB @ 3766MHz - CAS 15
      • Storage:
      • 2TB Gigabyte NVMe 4.0 + 1TB Samsung 970 EVO NVMe + 1TB Corsair MP510 NVMe
      • Graphics card(s):
      • MSI NVIDIA GeForce RTX 2080Ti VENTUS OC
      • PSU:
      • be quiet! Straight Power 11 650W
      • Case:
      • be quiet! Dark Base Pro 900
      • Operating System:
      • Windows 10 x64
      • Monitor(s):
      • Acer Predator Z35P + ASUS ROG PG279Q
      • Internet:
      • Virgin Media Vivid 350

    Re: AMD announces Radeon VII graphics: Zen 2 on track

    Quote Originally Posted by spacein_vader View Post
    A few comments on here about the potential abilities of DLSS at 1080p or 1440p, I thought DLSS was only going to be available at 4k?
    As far as I'm aware, NVIDIA aim to bring this to 1080p, 1440p and 2160p, but we've yet to see anything really. Battlefield V, from what I've heard, is aiming to bring it into the game within this month, so fingers crossed for that! However, I'm now left worrying that I might've messed up.
    I've just gone out and bought a 3440 x 1440 monitor and moved my 1440p to a portrait position. Am I now going to be without DLSS because I've bought an "unpopular" resolution or will I be covered by the 1440p aspect of the resolution? It's a niggling thought and it worries me.

  13. #60
    Senior Member
    Join Date
    Jul 2009
    Location
    West Sussex
    Posts
    1,075
    Thanks
    73
    Thanked
    132 times in 124 posts
    • kompukare's system
      • Motherboard:
      • Asus P8Z77-V LX
      • CPU:
      • Intel i5-3570K
      • Memory:
      • 2 x 8GB Crucial Ballistix Elite PC3-14900
      • Storage:
      • Crucial MX200 | Sandisk Extreme 120GB SSD | WDC 1TB Green | Samsung 1TB Spinpoint
      • Graphics card(s):
      • Sapphire R9 290 VaporX 7950
      • PSU:
      • Antec 650 Gold TruePower (Seasonic) or Seasonic SII-330
      • Case:
      • Aerocool DS 200 (silenced, 53.6 litres)
      • Operating System:
      • Windows 10-64
      • Monitor(s):
      • 2 x Dell P2414H

    Re: AMD announces Radeon VII graphics: Zen 2 on track

    Was surprised to see this getting a consumer release.
    Bit of a mixed bag. Seems too expensive for the likely performance. Still good for those trying to avoid Nvidia where possible: people who don't like the way they operate, or who still remember being stung by their inaction over the millions of faulty solder parts they sold a few years ago.

    Does seem to show that the original Vega had a poor choice of ROPs and was possibly bandwidth-starved (although they might have other reasons for going from a 2048-bit to a 4096-bit HBM2 interface).

    But the other suspicion is that resource-starved AMD, keen to break into the high-end gCompute market, got Raja and the Radeon Group to produce a design heavily focused on gCompute, but had neither the research budget nor the volume to adapt it for gaming with gaming-centric designs the way Nvidia does (GP100 vs GP102, etc.).

    AMD either need a much higher budget so they can do two lines, one aimed at gaming and the other at gCompute, or they need to find a way to apply the Zen/Ryzen chiplet idea to GPUs.

    Navi onwards should have some of these ideas.

    EDIT:
    https://www.anandtech.com/show/13852...e-as-ryzen2000
    Seems that there will not be any monster APUs soon. Maybe later if they do a smallish chiplet for Navi.
    Last edited by kompukare; 11-01-2019 at 09:11 PM.

  14. #61
    Not a good person scaryjim's Avatar
    Join Date
    Jan 2009
    Location
    Manchester
    Posts
    15,043
    Thanks
    1,194
    Thanked
    2,246 times in 1,847 posts
    • scaryjim's system
      • Motherboard:
      • Dell Inspiron
      • CPU:
      • Core i5 8250U
      • Memory:
      • 1x 8GB DDR4 2400
      • Storage:
      • 128GB M.2 SSD + 1TB HDD
      • Graphics card(s):
      • Radeon R5 230
      • PSU:
      • Battery/Dell brick
      • Case:
      • Dell Inspiron 5570
      • Operating System:
      • Windows 10
      • Monitor(s):
      • 15" 1080p laptop panel

    Re: AMD announces Radeon VII graphics: Zen 2 on track

    Quote Originally Posted by kompukare View Post
    … Does seem to show that the original Vega had a poor choice for ROPs ...
    Yes, for all the "meh" responses, a ~25% performance boost with fewer shaders isn't exactly to be sniffed at. Wonder if Navi will see the same rebalancing...

    Quote Originally Posted by kompukare View Post
    EDIT:
    https://www.anandtech.com/show/13852...e-as-ryzen2000
    Seems that there will not be any monster APUs soon. Maybe later if they do a smallish chiplet for Navi.
    "We were told that there will be Zen 2 processors with integrated graphics, presumably coming out much later after the desktop processors, but built in a different design."

    Money on a larger IO chip with IGP fabbed on 14nm and glued to a single chiplet? Could be a relatively easy way to keep GF happy with the WSA and we know that Vega fabs with a *very* nice power curve on 14nm...

  15. #62
    root Member DanceswithUnix's Avatar
    Join Date
    Jan 2006
    Location
    In the middle of a core dump
    Posts
    10,102
    Thanks
    501
    Thanked
    1,041 times in 885 posts
    • DanceswithUnix's system
      • Motherboard:
      • Asus X470-PRO
      • CPU:
      • 3700X
      • Memory:
      • 16GB 3200MHz
      • Storage:
      • 1TB Linux, 1TB Games (Win 10)
      • Graphics card(s):
      • Asus Strix RX Vega 56
      • PSU:
      • 650W Corsair TX
      • Case:
      • Antec 300
      • Operating System:
      • Fedora 30 + Win 10 Pro 64 (yuk)
      • Monitor(s):
      • Benq XL2730Z 1440p + Samsung 2343BW 2048x1152
      • Internet:
      • Zen 80Mb/20Mb VDSL

    Re: AMD announces Radeon VII graphics: Zen 2 on track

    Quote Originally Posted by scaryjim View Post
    Money on a larger IO chip with IGP fabbed on 14nm and glued to a single chiplet? Could be a relatively easy way to keep GF happy with the WSA and we know that Vega fabs with a *very* nice power curve on 14nm...
    That is possible, but graphics is going to be way more sensitive to being away from the memory controllers than a CPU is, so a graphics chiplet seems unlikely. One of my key thoughts on the other thread, where I wondered if the I/O chip would have an IGP in it, was that a single solution for all cases is often a preferable way forwards. Along with a personal hope: I have built too many compute servers that used a cheap £30 graphics card, and having just basic integrated graphics would have been really nice. I also couldn't see how that second location for a CPU chiplet could also take a graphics chiplet unless you create a new package for that purpose.

    TBH having read that link my money now would be on a standard APU just like the 2400g but on 7nm to allow more compute. We know from Radeon VII that Vega can clock way higher on 7nm for the same power so the GPU bit wants to be on 7nm, and because GPUs don't cache well it needs to be with the memory controllers. CPUs need to be on 7nm as well, so now I think it's just going to be a 7nm chip.

    Still find Radeon VII an odd name, but I guess "Vega 60" would sound like a step backwards. Tempted to call it that anyway

  16. #63
    Senior Member
    Join Date
    Dec 2013
    Posts
    2,947
    Thanks
    398
    Thanked
    364 times in 254 posts

    Re: AMD announces Radeon VII graphics: Zen 2 on track

    Quote Originally Posted by DanceswithUnix View Post
    That is possible, graphics is going to be way more sensitive to being away from the memory controllers than CPU.....

    TBH having read that link my money now would be on a standard APU just like the 2400g but on 7nm to allow more compute. We know from Radeon VII that Vega can clock way higher on 7nm for the same power so the GPU bit wants to be on 7nm, and because GPUs don't cache well it needs to be with the memory controllers. CPUs need to be on 7nm as well, so now I think it's just going to be a 7nm chip.
    It may be the way I've read that, but it seems like you're saying GPUs are sensitive to latency, what with the distance and cache comments.

    With GPUs, bandwidth is more important than latency, so being away from the memory controller and a fast cache isn't as important as it would be for a CPU. You may want your memory close to a GPU because of signal strength (more traces means thinner traces, which means lower drive power, which means shorter distances). In an ideal world AMD would use a GPU block, run traces through the substrate to a small block of HBM, and then connect that to an I/O die.

  17. #64
    Senior Member
    Join Date
    May 2015
    Posts
    261
    Thanks
    0
    Thanked
    5 times in 5 posts

    Re: AMD announces Radeon VII graphics: Zen 2 on track

    "It is likely that AMD will use more power for the same performance - 300W vs. 250W - and lack any forward-looking ray tracing support, but the Radeon VII deal is sweetened a little with the knowledge it will ship with codes for upcoming Devil May Cry 5, The Division 2 and Resident Evil 2 titles."

    I prefer more future-proof tech to free games, but maybe that's just me. Fewer watts too: a 50W (likely) difference at 8hrs a day x 365 days of gaming is ~$19 a year at ~$0.12/kWh (many pay far more for electricity, $0.20 in some US states, even $0.24), so do the math on TCO for a card you'll use (as many people will) for ~5-7yrs. Only the rich buy cards yearly; the rest of us probably live a little longer with $700+ cards. Over 5 years that's a free $100 (up to double that at $0.24), so buy the 3 games yourself, and enjoy a better gaming experience on the NV 2080 for ages after you buy it. Then again most don't even ponder TCO, so maybe some will buy this anyway. I can't wait to save $20-40 a year just by buying a new 27-30in monitor vs. my old Dell 2407WFP-HC. It's like G-Sync for free for me, as my Dell 24 is 11yrs old! Hard to believe a new 27in would essentially buy me a 2060 free if the new monitor lasts even 10yrs (summer watts higher here too, ~$0.15 or so). It is easy to hit 40hrs a week gaming if you have a kid and you play too. I can put in 20 on a weekend if a great game hits, and another 20 during the week sometimes if time allows. Kids can dwarf this... LOL. The monitor watts are the same gaming or browsing; the video card's watts only matter for gaming or work (more time idle for most, I'd guess).
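    The running-cost arithmetic above works out as follows; a sketch using the post's own assumed figures (a 50W difference, 8 hours a day, and the quoted electricity rates), not measured power draws:

```python
# Yearly electricity cost of an extra power draw at a given tariff.
def yearly_cost_usd(extra_watts: float, hours_per_day: float, usd_per_kwh: float) -> float:
    kwh_per_year = extra_watts / 1000 * hours_per_day * 365
    return kwh_per_year * usd_per_kwh

print(round(yearly_cost_usd(50, 8, 0.12), 2))  # 17.52 -> roughly the ~$19/yr quoted
print(round(yearly_cost_usd(50, 8, 0.24), 2))  # 35.04 -> about double at $0.24/kWh
```

    Over a 5-7 year ownership span that difference compounds to roughly $90-250 depending on the tariff, which is the post's point about TCO.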

    I'm thinking this is a tough sell at anything above $550, so they should have gone with 8-12GB of GDDR5X to cheapen things up. It's just not a 2080, period. Why they AGAIN chose HBM to screw their top card is beyond me. It does NOTHING for them, and 16GB isn't needed by anything, nor is HBM outside servers. The 2080 Ti proves you don't need this bandwidth either. Again, they just screwed themselves out of much profit, as most will just go NV for features + watts. If they had gone with cheaper memory, I wouldn't be able to post this critique and they'd still make money. Most devs are not even aiming at 11GB yet.

    12GB of GDDR5X is all that was needed to enable $550-600, which would sell MUCH better against a card that will look far better over time with RT + DLSS and even VRS, all of which make for a much better experience for gamers (not to mention watt-cost TCO). 16GB of HBM + no new features won't win at roughly the same price. I don't get why you would buy this as a gamer vs. the 2080. Price drops will be tough for AMD here too, as they definitely don't have much room to drop with expensive 16GB on board. Are they trying to fail on purpose? I couldn't imagine trying to sell this vs. the 2080. I think Jen's response was correct, unfortunately (I own AMD stock... LOL), and I expect nothing to be released in response other than a new driver or something from NV, if anything at all... Nothing scary here for NV net income. Even without monitor support it was a tough sell, but that is over now with new NV drivers coming. Want your adaptive sync? You can have it now. Again, what is the selling point today, or tomorrow? This is a 1440p card at best (who likes turning stuff down? not what devs wanted), so again pointless to castrate your income with HBM.

    I can hear someone saying, but, but, but, it's a 4K card... LOL:
    NOTE: AMD gave a presentation showing Forza Horizon 4 at 1080p at 100fps with this card. So even they are tacitly admitting it isn't even a 1440p card in their own demo. I agree, and want all details on ALWAYS. It will take NV's BIG 7nm core coming to hit 4K for real IMHO (700mm^2+ at 7nm, that is, or two cards?), and 4K is used by <1.5% of 125 million Steam users. Who cares? If you're not hitting 60+ fps, you're going to end up under 30 too much for my taste (forget multiplayer at 30fps). I like AMD's idea: 100fps, or upgrade your card. I won't be turning stuff up or down in every game just to get passable gameplay; I hate that. Just give me a card that does everything with EVERYTHING on, in, well, EVERYTHING... LOL. I grit my teeth the second I have to downgrade settings to get my fun back. That isn't what the dev wanted me to see at that point, right? When 7nm NV hits, all cards will come with RTX features top to bottom, and that will mean 65mil sold next year with RTX stuff from NV. Devs are shooting there now, not later, as they know what is coming. I hope AMD hurries with answers.

    That said, I'm happier about the CPU side of these announcements, as the watts are VERY good IMHO if true. That will sell easily vs. Intel if perf is the same in games (the part that held me up in 2018) and better in many apps (already sold on that). I can't wait for 12 cores, which is what I need for the main PC; 8 cores is good for my HTPCs. I also think they have more room to go above Intel perf here. That's a LOT of watts AMD has left to play with, right? I smell a sandbag here, and think AMD will launch with better-than-equal perf, but I'm just guessing (that's a lot of watts off!). Not surprised though; we're talking a pretty good 7nm process (millions of Apple chips shipped already) vs. Intel 14nm++. That is a lot of ground for Intel to make up, and these numbers show it. I mean, 45W is half the watts of 95W Intel procs. Again, I say that is a freaking lot of watts AMD has to run the score up with. Even if it's half that, it's a lot to play with, for us or for AMD to just clock up out of the box. I suggest THEY do it and CHARGE for it. Put cores on TOP of Intel prices! I think the wccftech guy was right here: it's probably 4.6GHz all-core, and can probably hit 5GHz all-core for even more perf. Intel has a problem IMHO for 12-18 months at least, until their 7nm hits vs. TSMC 5nm (TSMC could screw that up; Intel going much less aggressive for 7nm).

