Page 2 of 2 FirstFirst 12
Results 17 to 27 of 27

Thread: AMD Radeon VII

  1. #17
    Member
    Join Date
    Jun 2008
    Posts
    109
    Thanks
    25
    Thanked
    7 times in 6 posts

    Re: AMD Radeon VII

    Quote Originally Posted by cheesemp View Post
    To try and ensure Nvidia don't own the graphics card market? I deliberately chose an rx480 vs a 1060 for this very reason (and on the suspicion it would age better as AMD card usually do). Gaming on a PC is already stupidly expensive and nvidia having a free hand is something I'd pay to prevent.
    Same here. I bought the RX 580 and now the Vega 64 for exactly that reason. Allowing NVidia to monopolise the market is just asking for trouble in the future. The RTX line already proves that they will gouge whatever they can, while they can.

    Having competition is a good thing, just look at Ryzen CPUs (yup, bought a Ryzen 1 and Threadripper 1920X), and definitely happy with them.

  2. #18
    Senior Member
    Join Date
    May 2015
    Posts
    359
    Thanks
    0
    Thanked
    7 times in 7 posts

    Re: AMD Radeon VII

    "Radeon VII smashes the Vega 64 and, for the first time, beats the RTX 2080 FE at the two important resolutions."
    Uh, 4K + 1440p on the Steam hardware survey (~125M users) is <5% combined. If you narrow that to 4K alone, it is <1.4%. How important is 1.4% of gamers? How important are BOTH at <5% COMBINED? Not important at all. PERIOD. Maybe in a few years you can claim this; people don't upgrade monitors like GPUs/CPUs. E.g., mine are both over 10 years old (and a third is in a box ready to go... LOL - tester for builds now), and my next one won't be 4K unless a 32in+ model comes out with G-Sync (assuming I choose 7nm NV next). I'd really want above 32in for 4K, but that might make fitting two on my desk a problem, so again, likely 1600p/1440p at 32in or less, and I'll probably keep that for a decade too... ROFL. No hope of 4K for me for years, and I can easily afford it (I could buy a $1000 monitor monthly if desired, and a card with it monthly, it's just not worth it to me... LOL). My eye is on the 1600p Dell for $1100, so money isn't the issue - and it's not even G-Sync/FreeSync! WTH, Dell? Get with it! Put G-Sync/FreeSync in that monitor and my money is yours today. Er, G-Sync at that price.
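A quick sanity check on the shares quoted above (the 125M survey base and the <1.4% / <5% figures are the post's own numbers, not fresh survey data, and the helper name is made up for illustration):

```python
# Convert the post's Steam survey percentages into absolute user counts.
# The 125M base and the share figures come from the post above, not from
# a fresh pull of the survey.
STEAM_USERS = 125_000_000

def users_at_share(share_percent: float, base: int = STEAM_USERS) -> int:
    """Turn a survey percentage into an absolute user count."""
    return round(base * share_percent / 100)

print(users_at_share(1.4))  # 4K-only share -> 1,750,000 users
print(users_at_share(5.0))  # combined 1440p + 4K share -> 6,250,000 users
```

Even the combined figure is a few million users out of the whole base, which is the scale the post's "not important" argument rests on.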

    Unfortunately for AMD they went HBM again, making this card NOT a good buy and NOT a money maker for AMD - hence only 1500 made? AMD says otherwise, but I think they lose money on every one, so that's probably close to correct. They require a bundle to make a dime on HBM cards unless they're PRO/server versions. 16GB just made things even worse with already pricey memory.

    If they could have made this card with 8GB of GDDR6/GDDR5X they could have priced it at $500 and probably made more (a LOT more, as MORE would sell at that price vs. the 2080, and they wouldn't need a game bundle to make NET INCOME). Heck, they could have sold two versions, maybe 8GB and 12GB or something. Instead, 16GB is used by no one, and devs are only aiming at 11GB tops (and even that is only coming soon, really), while AGAIN having to lean on 4K claims with only 1.4% of people using it. Too bad AMD wasted another GPU launch, though they likely didn't have much choice given the chip they used (not enough R&D to fight multiple wars with NV/Intel). It is amazing AMD does this well, really. Not saying the card is bad, just that if you don't make money, who cares? My point here is about AMD and their NET income, not how much WE like the cards. It can be the baddest card on the planet, but if you can't make a dime on it, who cares? It won't make a dime for future R&D, so the R&D to make it was wasted.

    AMD shouldn't waste R&D on anything that doesn't make 50%+ margins unless it's to salvage parts to improve profits that would otherwise be lost on new products. E.g., consoles are a waste of R&D, as the quarterly reports show (shorter cycles now too, so again, massive wasted R&D for plucky little AMD). $10-15 income per chip is stupid if it's selling 10-20M total per year (PS4/Xbox One SoCs are ~$95-110 each). It's dumb even at 20-30M per year, and AMD has to make TWO of them (Sony/MSFT). How good would AMD CPUs/GPUs be if this R&D was directed at those instead of console/APU? We would have seen a PROPER answer to the 20xx series, and AMD wouldn't be launching low-end 7nm, I suspect (pointless for net income as the little guy - aim higher for margins!). AMD needs 60% margins (like Intel/NV), not 30-40%. Aim at RICH people, not POOR. Serve the poor only when forced or when you're selling freaking tons of units. Sony just hit 91M in over 5 years (2x faster than PS3/Xbox 360, but it still sucks for AMD), yet that isn't even 20M a year, and MSFT is worse. Even if both sold 20M each yearly, that is only $400M, not counting the R&D sunk into each chip. NV GPUs bring in a BILLION a quarter now, and they PASSED on consoles and even dropped mobile (killed the modem too - they focus and dump dead weight quickly). I'd rather be NV/GPU than AMD+consoles+CPU+GPU+APU. Make kings, not jack-of-all-trades-master-of-nothing stuff. FOCUS, AMD, focus! The Dark Mayor was right in 2011 and AMD fired him for it, and they're now doing what he said on CPU anyway... ROFL... I still own the stock though... LOL. They can't screw up 7nm server pricing that badly, as they have MARGIN even at idiot pricing!
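The console arithmetic in that last paragraph can be sketched out. The per-chip income and unit volumes are the poster's guesses, not disclosed numbers, and `annual_soc_income` is a made-up helper name:

```python
# Rough console-economics sketch using the post's figures: $10-15 income
# per SoC, ~20M consoles per vendor per year, two vendors (Sony + MSFT).
def annual_soc_income(income_per_chip: float, units_per_year: int) -> float:
    """Annual chip income for one console vendor."""
    return income_per_chip * units_per_year

low = annual_soc_income(10, 20_000_000) * 2   # $400M/yr across both vendors
high = annual_soc_income(15, 20_000_000) * 2  # $600M/yr best case
# Either way, well short of the ~$1B per *quarter* the post attributes
# to NV's GPU business.
```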

  3. #19
    Registered+
    Join Date
    May 2017
    Posts
    38
    Thanks
    0
    Thanked
    3 times in 3 posts

    Re: AMD Radeon VII

    Quote Originally Posted by Tunnah View Post
    Loud, hot, and power thirsty might as well be the AMD GPU slogan at this point. I don't get why anyone would pick this over a 2080 or a 1080Ti
    Where can you get a new 1080 Ti for the price of an RVII? The 2080 is around the same price for the bottom-of-the-barrel variants, up to +£200 for the high-end ones. The one point that would make them more than viable is if you had a FreeSync monitor, but thanks to Nvidia's foresight, they've negated that bonus for AMD.

    I suppose it would come down to whether or not you prefer one over the other. On some stuff it'll outperform the 2080, but I'm guessing those will be the few AMD-assisted games - and whether you think DLSS/ray tracing will actually be a thing.


    I honestly don't think it's a bad effort, and if I wasn't waiting for PCIe 4.0 for my next build, I'd really consider picking one up. It is a shame that they literally can't make it any cheaper to manufacture.

  4. #20
    Registered User
    Join Date
    Sep 2018
    Posts
    2
    Thanks
    0
    Thanked
    0 times in 0 posts

    Re: AMD Radeon VII

    Here is something I've picked out from SOTTR: spend £200 on an RX 580 and play at 1080p with an average of 60 FPS, or spend £1100 on an RTX 2080 Ti and play at 4K with an average of 59 FPS.
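One way to read those two SOTTR data points is pixel throughput per pound. The prices and average FPS come from the post above; the resolutions (1920x1080 for "1080p", 3840x2160 for "4K") are the usual assumption, and the helper name is made up:

```python
# Compare the two SOTTR data points as pixels pushed per second per
# pound spent (prices and FPS are the post's; resolutions assumed).
def pixels_per_pound(width: int, height: int, fps: float, price: float) -> float:
    return width * height * fps / price

rx580 = pixels_per_pound(1920, 1080, 60, 200)       # ~622k px/s per GBP
rtx2080ti = pixels_per_pound(3840, 2160, 59, 1100)  # ~445k px/s per GBP
# The 2080 Ti renders ~4x the pixels per frame, but per pound spent the
# RX 580 still comes out ahead on raw pixel throughput in this one title.
```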

  5. #21
    Registered+
    Join Date
    Jan 2019
    Posts
    37
    Thanks
    0
    Thanked
    3 times in 2 posts

    Re: AMD Radeon VII

    For my use case (gaming, no content creation) this is not a great buy. The same performance as a two-year-old 1080 Ti, at the same price as a 2080, whilst sounding like a jet engine under full load? My GTX 780 has been dying and I held off to see if this would be the one to replace it, but no. I've bought an EVGA RTX 2080 XC Gaming for £650 and am moving to 1440p 144Hz.

  6. #22
    Senior Member
    Join Date
    Apr 2016
    Posts
    772
    Thanks
    0
    Thanked
    9 times in 9 posts

    Re: AMD Radeon VII

    No matter though... the 2080 and the other Nvidia cards still struggle to stay above a minimum of 60 FPS in many games at 4K, so we're not really past the 4K myth. I'm still waiting for the day when at least 85 FPS in any game, without any struggle, is standard at 4K - and I'd say that's long past due as well. It's just my personal opinion, that is.

  7. #23
    Member
    Join Date
    Feb 2015
    Posts
    122
    Thanks
    12
    Thanked
    2 times in 2 posts

    Re: AMD Radeon VII

    So, does Vega 20 (used in the VII) have the hardware fixes for the Primitive Shaders issue that Vega 56/64 have, which couldn't be solved via software?

  8. #24
    Member
    Join Date
    Mar 2017
    Posts
    115
    Thanks
    0
    Thanked
    0 times in 0 posts

    Re: AMD Radeon VII

    Radeon VII should have been £50 cheaper, then it would be good value...
    I'd pick the 2080 over the Radeon VII any day....

  9. #25
    Administrator MLyons's Avatar
    Join Date
    Feb 2017
    Posts
    474
    Thanks
    310
    Thanked
    156 times in 92 posts
    • MLyons's system
      • Motherboard:
      • ASUS PRIME X470-PRO
      • CPU:
      • 2700x
      • Memory:
      • 16GB DDR4 Corsair RGB
      • Storage:
      • 500GB MX500 500GB HDD 2TB SSD
      • Graphics card(s):
      • EVGA SC2 1080Ti
      • PSU:
      • Corsair tx650
      • Case:
      • Corsair Air 540
      • Operating System:
      • Windows 10
      • Monitor(s):
      • 2 Asus 1080p

    Re: AMD Radeon VII

    Quote Originally Posted by nobodyspecial View Post
    "Radeon VII smashes the Vega 64 and, for the first time, beats the RTX 2080 FE at the two important resolutions."
    Uh, 4k+1400p on steam hardware survey (125mil users) is <5% total. If you take that to 4K, it is <1.4%. How important is 1.4% of gamers? How important are BOTH if <5% COMBINED? Not important at all. PERIOD. Maybe in a few years you can claim this and people don't upgrade monitors like gpus/cpus. IE, mine are both over 10yrs old (and a 3rd in a box ready to go...LOL - tester for builds now) and my next one won't be 4K unless a 32in+ model comes out with Gsync (assuming I choose 7nm NV next) and I'd really want above 32in for 4k, but that might make fitting 2 on my desk a problem, so again, likely 1600p/1440p 32 or less and I'll probably keep that for a decade too...ROFL. No hope for 4k for me for years, and I can easily afford it (I could buy a $1000 monitor monthly if desired, just not worth it for me, and a card with it monthly...LOL). My eye is on the 1600p dell for $1100, so money isn't the issue & it's not even gsync/freesync! WTH dell? Get with it! Put Gsync/Freesync in that monitor and my money is yours today Er, Gsync at that price

    Unfortunately for AMD they went HBM again, thus making this cart NOT a good buy and NOT a money maker for AMD thus 1500 made? AMD says otherwise, but I think they lose money on every one, so close to correct here. They require a bundle to make a dime on HBM cards unless PRO/server versions. 16GB just made things even worse with already pricey mem.

    If they could have made this card with 8GB GDDR6/GDDR5x they could have priced it at $500 and made more probably (a LOT more as MORE would sell at that price vs. 2080 and wouldn't need a game bundle to make NET INCOME). Heck they could have sold two versions, maybe a 8GB and 12GB or something. Instead 16GB not used by anyone and devs only aiming at 11GB tops (and that only coming soon really), while AGAIN having to claim 4K crap with only 1.4% of people using it. Too bad AMD wasted another gpu launch, though they likely didn't have much choice given the chip they used I guess (not enough R&D to fight multiple wars with NV/Intel). It is amazing AMD does this well really. Not saying the card is bad, just that if you don't make money, who cares. My point here is about AMD and their NET income, not how much WE like the cards. It can be the baddest card on the planet, but if you can't make a dime on it, who cares? It won't make a dime for future R&D so wasted R&D to make it.

    AMD shouldn't waste R&D on anything that doesn't make 50%+ margins unless it's to salvage parts to improve profits that would be lost for new products. IE, consoles are a waste of R&D as shown by Q reports (shorter cycles now too, so again, massive wasted R&D for plucky little AMD). $10-15 income per chip is stupid if it's selling 10-20mil total per year (ps4/xbox1 socs are ~$95-110 each). It's dumb at 20-30mil per year and AMD has to make TWO of them (sony/msft). How good would AMD cpu/gpu be if this R&D was directed at those instead of console/apu? We would have seen a PROPER answer to 20xx series and AMD wouldn't be launching low end 7nm I suspect (pointless for net income as the little guy, aim higher for margins!). AMD needs 60% margins (like Intel/NV), not 30-40%. Aim at RICH people, not POOR. Serve the poor only when forced or you are selling freaking tons of units Sony just hit 91mil in over 5yrs (2x faster than ps3/xbox360 but still sucks for AMD), but that isn't even 20mil a year and MSFT is worse. Even if both sold 20mil each yearly that is only 400mil and not including R&D lost on each chip. NV gpus bring in a BILLION a quarter now, and they PASSED on consoles and even dropped mobile (killed modem too, they focus and dump dead weight quick). I'd rather be NV/GPU than AMD+consoles+cpu+gpu+apu. Make kings, not jack of all trade good at nothing stuff. FOCUS AMD, focus! The Dark Mayor was right in 2011 and AMD fired him for it & are now doing what he said on cpu anyway...ROFL... I still own the stock though...LOL. They can't screw up 7nm server pricing that bad as they have MARGIN even at idiot pricing!
    Because it's a high-end card. If I'm spending £700 on a card, I'm going to be using it at 4K or, at the bare minimum, 1440p. It's also not a rich vs. poor thing - that's a stupid argument. It's about what people want to spend on their machine and what they want out of it.
    Half dev, Half doge. Some say DevDoge

    Feel free to message me if you find any bugs or have any suggestions.
    If you need me urgently, PM me
    If something is/was broke it was probably me. ¯\_(ツ)_/¯

  10. #26
    Registered User
    Join Date
    Feb 2019
    Posts
    9
    Thanks
    0
    Thanked
    0 times in 0 posts

    Re: AMD Radeon VII

    Guess I'll stick with NVIDIA for my upcoming PC build then.

  11. #27
    Registered+
    Join Date
    Oct 2017
    Posts
    28
    Thanks
    0
    Thanked
    0 times in 0 posts
    • LyntonB's system
      • Motherboard:
      • x470 Taichi
      • CPU:
      • 5800x
      • Memory:
      • Corsair Dominator Platinum RGB 32GB (2x16GB) DDR4 3200MHz CL14 CMT32GX4M2C3200C14
      • Graphics card(s):
      • Nvidia 3080
      • PSU:
      • Corsair HX1200i
      • Operating System:
      • Win10 64bit
      • Monitor(s):
      • Samsung Odyssey G7 32"

    Re: AMD Radeon VII

    Quote Originally Posted by Tunnah View Post
    Loud, hot, and power thirsty might as well be the AMD GPU slogan at this point. I don't get why anyone would pick this over a 2080 or a 1080Ti
    Look up 1440p gaming in The Division 2 on the Radeon VII on YouTube - 12-13GB of VRAM utilisation. Also, overclocking is now enabled for the Radeon VII in Wattman, so I'm awaiting a new round of benchmarks and reviews. Oh yeah, and the Sapphire Radeon VII is £100 cheaper than the 2080 FE..
