
Thread: AMD Radeon RX 590 turns up in 3DMark database

  1. #17
    Registered+
    Join Date
    Mar 2017
    Posts
    48
    Thanks
    0
    Thanked
    2 times in 2 posts

    Re: AMD Radeon RX 590 turns up in 3DMark database

    Quote Originally Posted by edmundhonda View Post
    Quote Originally Posted by Zak33 View Post
    if it's going to be akin to a 1060, it needs to be power efficient too...
    The RX580 is a juicy old thing and a die shrink isn't going to perform miracles.

    Little surprised to see AMD still working on Polaris in the midrange a year after Vega launched.
    Vega on 14nm turned out to be too much of a power hog so AMD never pushed it into the mid-range as expected. (There wasn't much point if it couldn't get better power and speed numbers than Polaris.) So now they're doing a stop-gap to have something available to sell to that market until Navi is ready.

    As far as I can tell, Vega on 7nm is never going to be a standalone mainstream product, though AMD is doing it as a machine learning card, probably mostly to get some experience working with the 7nm process. It might happen as part of an APU if next year's replacement for the 2200G and 2400G goes that way rather than using Navi graphics.

  2. #18
    Gentoo Ricer
    Join Date
    Jan 2005
    Location
    Galway
    Posts
    11,048
    Thanks
    1,016
    Thanked
    944 times in 704 posts
    • aidanjt's system
      • Motherboard: Asus Strix Z370-G
      • CPU: Intel i7-8700K
      • Memory: 2x8GB Corsair LPX 3000C15
      • Storage: 500GB Samsung 960 EVO
      • Graphics card(s): EVGA GTX 970 SC ACX 2.0
      • PSU: EVGA G3 750W
      • Case: Fractal Design Define C Mini
      • Operating System: Windows 10 Pro
      • Monitor(s): Asus MG279Q
      • Internet: 240Mbps Virgin Cable

    Re: AMD Radeon RX 590 turns up in 3DMark database

    Quote Originally Posted by Shirley Dulcey View Post
    Vega on 14nm turned out to be too much of a power hog
    The actual architecture isn't really a power hog; downclocked and undervolted it's actually pretty chill. Nvidia just pulled the Pascal rabbit out of its hat while AMD were still developing Vega, and it didn't perform as well as they hoped, so they had to clock the bejesus out of it to keep up with the GTX 1080. That's why there's hardly any overclocking headroom.
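    For anyone wondering what "downclocked and undervolted" looks like in practice, here's a rough sketch of poking the amdgpu overdrive interface on Linux. The sysfs files are the ones the amdgpu driver actually exposes, but the card path, state index and clock/voltage numbers below are placeholders rather than recommendations - every Vega bins differently, so treat this purely as an illustration.

```python
# Rough sketch: undervolting a Vega card through the Linux amdgpu
# "overdrive" sysfs interface.  Needs root, a kernel with overdrive
# enabled (amdgpu.ppfeaturemask), and values picked for YOUR card -
# the numbers below are placeholders, not recommendations.
from pathlib import Path

CARD = Path("/sys/class/drm/card0/device")  # assumes the GPU is card0

def write(name: str, value: str) -> None:
    """Write a value to one of the card's sysfs control files."""
    (CARD / name).write_text(value + "\n")

# Take manual control of clock/power state management.
write("power_dpm_force_performance_level", "manual")

# Print the current clock/voltage table so the stock values are known.
print((CARD / "pp_od_clk_voltage").read_text())

# Set the top sclk state (index 7 on Vega 10) to a lower clock (MHz) and
# voltage (mV), then commit with "c".  Placeholder numbers only.
write("pp_od_clk_voltage", "s 7 1500 1000")
write("pp_od_clk_voltage", "c")
```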

    Quote Originally Posted by Shirley Dulcey View Post
    so AMD never pushed it into the mid-range as expected.
    Who expected that? HBM2 was and is far too expensive to ever make it into a mid-range card.
    Quote Originally Posted by Agent View Post
    ...every time Creative bring out a new card range their advertising makes it sound like they have discovered a way to insert a thousand Chuck Norris super dwarfs in your ears...

  3. #19
    Senior Member
    Join Date
    May 2009
    Location
    Where you are not
    Posts
    1,330
    Thanks
    606
    Thanked
    103 times in 90 posts
    • Iota's system
      • Motherboard:
      • Asus Maximus Hero XI
      • CPU:
      • Intel Core i9 9900KF
      • Memory:
      • CMD32GX4M2C3200C16
      • Storage:
      • 1 x 1TB / 3 x 2TB Samsung 970 Evo Plus NVMe
      • Graphics card(s):
      • Nvidia RTX 3090 Founders Edition
      • PSU:
      • Corsair HX1200i
      • Case:
      • Corsair Obsidian 500D
      • Operating System:
      • Windows 10 Pro 64-bit
      • Monitor(s):
      • Samsung Odyssey G9
      • Internet:
      • 500Mbps BT FTTH

    Re: AMD Radeon RX 590 turns up in 3DMark database

    Quote Originally Posted by CAT-THE-FIFTH View Post
    For me I hope AMD stop going after the high-end consumer market now. If they are to make big GPUs, make them specialised for the commercial markets first, since they can be integrated as part of servers as a package and AMD can work with the vendors. When it comes to gaming I hope they stick to the midrange, and engineer something that can be sold profitably at that level and can help them keep the consoles.

    In some ways we are seeing this already - the first 7nm GPU is a Vega part made for AI. Navi will apparently be a more value-orientated GPU.

    Now, AMD are doing the keynote at CES 2019 though, so it's quite possible they have something up their sleeves.
    Nvidia has done a LOT of groundwork though in terms of commercial applications for their GPU designs; they've gone to developers and actually asked them what they want, and that's gone on for years to get to the stage they're at now. Unless AMD is prepared to sit down with developers (not talking gaming here), ask what they want, and build from the ground up for that over numerous iterations, they'll be chasing Nvidia for a long time to come.

    I honestly don't see them getting to that position in the next 5 years at least; Nvidia has been seriously laying groundwork since, what, prior to Fermi? Turing is just the latest iteration of this work, and as you mention, they've found a way to have one die cover more than one aspect. Isn't that just going "jack of all trades" though? Like AMD?

  4. #20
    Moosing about! CAT-THE-FIFTH's Avatar
    Join Date
    Aug 2006
    Location
    Not here
    Posts
    32,039
    Thanks
    3,910
    Thanked
    5,224 times in 4,015 posts
    • CAT-THE-FIFTH's system
      • Motherboard: Less E-PEEN
      • CPU: Massive E-PEEN
      • Memory: RGB E-PEEN
      • Storage: Not in any order
      • Graphics card(s): EVEN BIGGER E-PEEN
      • PSU: OVERSIZED
      • Case: UNDERSIZED
      • Operating System: DOS 6.22
      • Monitor(s): NOT USUALLY ON....WHEN I POST
      • Internet: FUNCTIONAL

    Re: AMD Radeon RX 590 turns up in 3DMark database

    Quote Originally Posted by Iota View Post
    Nvidia has done a LOT of groundwork though in terms of commercial applications for their GPU designs; they've gone to developers and actually asked them what they want, and that's gone on for years to get to the stage they're at now. Unless AMD is prepared to sit down with developers (not talking gaming here), ask what they want, and build from the ground up for that over numerous iterations, they'll be chasing Nvidia for a long time to come.
    **Just a warning: my comment isn't in any particular order, it's more some of my thoughts on things.**

    Yes and no - for example, have you ever wondered why AMD was so strong at OpenCL for years and Nvidia wasn't? Apple. A whole lot of GCN-based parts almost seem like they were developed for Apple. Big Vega was first revealed as an AI-focused product, not a gaming one either. I think the AMD move away from gaming per se has been happening for a while now. It fits with what Raja Koduri said, and the lack of R&D money due to Zen development made things worse.

    But funding Zen actually has made much more sense - even if AMD had something competitive with Turing right now, it wouldn't save the company. Why? People would just use AMD to make their Nvidia cards cheaper.

    Now look at APIs - who pioneered Mantle, which formed the basis of Vulkan? AMD. Guess what they worked on? Consoles, and consoles use low-level APIs. We haven't seen low-level APIs take off since Nvidia had no vested interest in them, due to the way they changed things with Kepler. The fact is AMD pushed to new nodes before Nvidia did - the HD4000, HD5000 and HD7000 series were examples. All had very competitive performance, but the fact is people bought Nvidia for whatever reasons. The same happened with ATI.

    This all cost money, and ultimately AMD has reorganised itself around where it makes money - things like consoles and embedded computing. Did you know that many of the cockpit displays of modern Airbus and Boeing airliners and tactical airlifters are powered by AMD GPUs? But again it's in concert with another company.

    This is the whole semi-custom strategy. So instead of just making solutions on their own, they work with third parties, who might help co-fund part of the R&D and help with the software stack. Baidu, MS, etc. are all examples of this:

    https://www.pcgamesn.com/microsoft-p...eaming-service

    See the whole noise about AMD "working" with MS and Baidu on Zen - these companies are large enough to help push forward the software side of things, which indirectly helps AMD.

    Many people look at Lisa Su, but Rory Read had an impact too - he aligned AMD more towards providing solutions to end customers, which no doubt has also helped spread some of the R&D costs and the software work. It's how they managed to stay afloat. So that is where they are headed.

    It might not be so great for us though.

    The fact that a company of its size, with its R&D spend, is still afloat fighting two incumbents is nothing short of amazing. AMD is only slightly larger than Nvidia - even at its height AMD had a seventh as many employees as Intel.


    Quote Originally Posted by Iota View Post
    I honestly don't see them getting to that position in the next 5 years at least, Nvidia has been seriously laying some groundwork since what, prior to Fermi? Turing is just the latest iteration of this work, as you mention, they've found a way to go with one die area covering more than one aspect. Isn't that just going "jack of all trades" though? Like AMD?
    The AMD designs did win them the consoles though, as GCN is a very compute-focused, general-purpose design (like Fermi was), and unlike on PC, console devs are fully exploiting this.

    That is also the problem - Fermi was poorly received in many ways. Kepler stripped out a lot of stuff like hardware scheduling, etc. to cut down on power usage and manufacturing costs, and by extension that led to the new generation of APIs being held back on PC. Maxwell pushed this further - don't you think it is utterly weird that the R9 290/390 are still competitive cards, especially with newer APIs?

    AMD made "jack of all trades" designs due to cost. Now Nvidia is obviously hitting the same issue,so now they are trying to re-unify everything. Its not surprising considering how much it costs to tape out new chips TBH.

    Fermi was in some ways similar - lots of functionality which didn't do much for gamers. Even the vaunted tessellation throughput was more a kludge to try and use some of that functionality.

    ATI had very focused designs suited for gaming workloads. So the whole Maxwell/Pascal thing against the GCN cards was not surprising, as it was a reversal - Nvidia had another line which gamers never used.

    Now, look what has happened with Turing - huge dies, high cost and people not happy about it. Look at this logically - imagine if Nvidia had made another FP32-focused line of cards on 12nm with similar chip sizes. The performance bump would be so massive I doubt many would be complaining as much about pricing, since every tier would see much bigger than normal gains. AMD would have no chance even on 7nm.

    But instead we have lots of the GPU not being used yet, and the software stack is hardly there with regard to games, certainly at launch. It makes no sense if Nvidia had developed these for gaming first, as the devs would already have the cards in hand and there would have been at least one playable game demo at launch. For instance, if Nvidia were that worried about RT performance now, why sell cards to consumers with hardly any RT support at launch? Why not wait until 7nm? Even devs seem to be only getting the hardware now, and have hardly had a chance to use it. It's all last minute, which is unlike Nvidia TBH.

    It obviously makes more sense that they want to sell this for VFX and AI first, but they will make sure gamers subsidise mass production of these chips, and people will pay the prices. All the RT and DLSS stuff is a kludge to find some way to use large parts of the chips for games, to "sell value".

    The difference is that with Fermi, Nvidia was willing to eat margins (even AMD did the same), as they knew computer buyers were more critical 10 years ago. Now, thanks to the "modern consumers" who justified all the stupid price increases over the last few years, they are willing to see if they can keep their historically record-high margins. It will work. Nvidia's margins are higher than Intel's, and Intel's margins have been at record highs for the last 6 years AFAIK. They have also realised that AMD has pretty much decided they might as well not bother now, so why not?

    Have you noticed that every time Nvidia has tried to move back to more general-purpose GPUs it has led to massive chips, pricing issues, etc.? Every time they did that it was not due to gamers' needs, but other areas.

    Intel did the same thing - they milked prices for years, yet spent billions on subsidising Atom for normal people. Nvidia did the same with Tegra - all those high prices were funding their forays elsewhere (Tegra lost hundreds of millions of dollars). Desktop users, gamers, etc. were a cash cow.

    I also expect that as time progresses they will try their best to move away from using FP32 for games, and try to kludge the tensor and RT cores into somehow doing most of this work. This will have the effect of making your older FP32-focused cards age even quicker. Soon it won't be optional, it will be what is required. For commercial RT and AI stuff, OFC, increasing the number of cores dedicated to that will help a lot as it's easy performance gains, so I see future Nvidia "gaming" GPUs being increasingly driven by factors outside gaming.
    Last edited by CAT-THE-FIFTH; 16-10-2018 at 12:29 AM.

  5. #21
    Not a good person scaryjim's Avatar
    Join Date
    Jan 2009
    Location
    Gateshead
    Posts
    15,196
    Thanks
    1,231
    Thanked
    2,291 times in 1,874 posts
    • scaryjim's system
      • Motherboard: Dell Inspiron
      • CPU: Core i5 8250U
      • Memory: 2x 4GB DDR4 2666
      • Storage: 128GB M.2 SSD + 1TB HDD
      • Graphics card(s): Radeon R5 230
      • PSU: Battery/Dell brick
      • Case: Dell Inspiron 5570
      • Operating System: Windows 10
      • Monitor(s): 15" 1080p laptop panel

    Re: AMD Radeon RX 590 turns up in 3DMark database

    Quote Originally Posted by the article
    ... According to various sources, the RX 670 was due sometime this weekend (but could still be launched today) ...
    Given that Hexus know (potentially weeks) in advance (through NDAs and advance testing) when new hardware is due to launch, I do have to wonder how often the writers have to pause to pick themselves up off the floor when writing things like this...

  6. #22
    Senior Member
    Join Date
    Mar 2005
    Posts
    4,932
    Thanks
    171
    Thanked
    383 times in 310 posts
    • badass's system
      • Motherboard: ASUS P8Z77-M Pro
      • CPU: Core i5 3570K
      • Memory: 32GB
      • Storage: 1TB Samsung 850 EVO, 2TB WD Green
      • Graphics card(s): Radeon RX 580
      • PSU: Corsair HX520W
      • Case: Silverstone SG02-F
      • Operating System: Windows 10 x64
      • Monitor(s): Dell U2311, LG 226WTQ
      • Internet: 80/20 FTTC

    Re: AMD Radeon RX 590 turns up in 3DMark database

    Quote Originally Posted by aidanjt View Post
    Who expected that? HBM2 was and is far too expensive to ever make it into a mid-range card.
    Indeed. IIRC every gaming Vega was sold at a loss by AMD. Entirely due to the HBM.
    "In a perfect world... spammers would get caught, go to jail, and share a cell with many men who have enlarged their penises, taken Viagra and are looking for a new relationship."

  7. #23
    Long member
    Join Date
    Apr 2008
    Posts
    2,427
    Thanks
    70
    Thanked
    404 times in 291 posts
    • philehidiot's system
      • Motherboard: Father's bored
      • CPU: Cockroach brain V0.1
      • Memory: Inebriated, unwritten
      • Storage: Big Yellow Self Storage
      • Graphics card(s): Semi chewed Crayola Mega Pack
      • PSU: 20KW single phase direct grid supply
      • Case: Closed, Open, Cold
      • Operating System: Cockroach
      • Monitor(s): The mental health nurses
      • Internet: Please.

    Re: AMD Radeon RX 590 turns up in 3DMark database

    Quote Originally Posted by badass View Post
    Indeed. IIRC every gaming Vega was sold at a loss to by AMD. Entirely due to the HBM.
    What's annoying is that my 8GB of HBM is getting very close to full in COD WW2. Makes me wonder if they'd have been better off using GDDR5 and putting more on. There's also the possibility that the game uses as much as it can regardless and it just looks like it's getting close to the limit when it isn't.

  8. #24
    Member
    Join Date
    Dec 2017
    Location
    london
    Posts
    134
    Thanks
    0
    Thanked
    2 times in 2 posts
    • persimmon's system
      • CPU: n3455 8600k
      • Memory: 8gb 16gb
      • Storage: 12tb 2.5tb
      • Graphics card(s): uhd500 gtx1070
      • PSU: DC 750w thorium

    Re: AMD Radeon RX 590 turns up in 3DMark database

    AMD have produced the FASTEST dang steam train available on the tracks. (steam = games, geddit?)
    Has anyone heard any rumors about a Vega 32, a cut-down Vega part? Shame about AMD and power, as they would be my go-to, but a single six-pin limits me to the 1060...

  9. #25
    Orbiting The Hand's Avatar
    Join Date
    Mar 2004
    Location
    Lincoln, UK
    Posts
    1,580
    Thanks
    170
    Thanked
    96 times in 73 posts
    • The Hand's system
      • Motherboard: Gigabyte AB350 Gaming-3
      • CPU: AMD Ryzen 5 2400G
      • Memory: 16GB Patriot Viper DDR4 3200MHz (8GBx2)
      • Storage: 2TB Kingston SSD
      • Graphics card(s): Asus GeForce RTX 2060 Super 8GB Dual Series
      • PSU: Corsair HX 520 Modular
      • Case: Coolermaster Praetorian
      • Operating System: Windows 10 Pro
      • Monitor(s): Sony 32 inch HD TV
      • Internet: 20Mbps Fibre

    Re: AMD Radeon RX 590 turns up in 3DMark database

    There could be a bigger market for this RX 590 card if Nvidia keep releasing drivers that actually slow down their 1060 cards.

    Last edited by The Hand; 18-10-2018 at 02:15 PM.

  10. #26
    root Member DanceswithUnix's Avatar
    Join Date
    Jan 2006
    Location
    In the middle of a core dump
    Posts
    12,978
    Thanks
    778
    Thanked
    1,586 times in 1,341 posts
    • DanceswithUnix's system
      • Motherboard: Asus X470-PRO
      • CPU: 5900X
      • Memory: 32GB 3200MHz ECC
      • Storage: 2TB Linux, 2TB Games (Win 10)
      • Graphics card(s): Asus Strix RX Vega 56
      • PSU: 650W Corsair TX
      • Case: Antec 300
      • Operating System: Fedora 39 + Win 10 Pro 64 (yuk)
      • Monitor(s): Benq XL2730Z 1440p + Iiyama 27" 1440p
      • Internet: Zen 900Mb/900Mb (CityFibre FttP)

    Re: AMD Radeon RX 590 turns up in 3DMark database

    Quote Originally Posted by philehidiot View Post
    There's also the possibility that the game uses as much as it can regardless and it just looks like it's getting close to the limit when it isn't.
    I believe that is what is actually happening. It sounds like only a small amount of your 8GB is wasted, so they put the right amount on.
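    For what it's worth, on Linux with the amdgpu driver you can check how much VRAM the driver reports as actually in use, which helps separate "the game grabbed everything it was offered" from "it genuinely needs all 8GB". A quick sketch, assuming the GPU is card0 (the mem_info_* files are the ones amdgpu exposes):

```python
# Quick sketch: report how much VRAM the amdgpu driver says is in use
# versus the total on the card.  Assumes the GPU is card0.
from pathlib import Path

DEVICE = Path("/sys/class/drm/card0/device")

def read_bytes(name: str) -> int:
    """Read one of the amdgpu mem_info_* files (values are in bytes)."""
    return int((DEVICE / name).read_text())

total = read_bytes("mem_info_vram_total")
used = read_bytes("mem_info_vram_used")

print(f"VRAM used: {used / 2**30:.2f} GiB of {total / 2**30:.2f} GiB "
      f"({100 * used / total:.1f}%)")
```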

    Quote Originally Posted by persimmon View Post
    Shame about amd and power , as they would be my goto , but single six pin limits me to 1060 ....
    There are RX 570 cards out there with a single 6 pin, if that is good enough.

    https://www.scan.co.uk/products/4gb-...r5-dp-hdmi-dvi

    Or can you use a SATA power to 6 or 8 pin adapter? That's what my son has to use in his Dell workstation, which only has a single 6-pin power connector yet plenty of grunt in the PSU.
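    To put some numbers on the single six-pin limit: the PCIe slot is specced for 75W and a 6-pin connector for another 75W, so a one-connector card has roughly a 150W board power budget, which is why the RX 570 / GTX 1060 class is about the ceiling there. A trivial sketch of that arithmetic (the wattage figures are the PCIe CEM spec limits; the card examples are just ballpark):

```python
# Back-of-envelope PCIe power budget, using the PCIe CEM spec limits:
# the x16 slot can supply up to 75 W, a 6-pin connector 75 W, and an
# 8-pin connector 150 W.
SLOT_W = 75
CONNECTOR_W = {"6-pin": 75, "8-pin": 150}

def board_power_budget(connectors: list[str]) -> int:
    """Maximum in-spec board power (watts) for a card with these connectors."""
    return SLOT_W + sum(CONNECTOR_W[c] for c in connectors)

print(board_power_budget(["6-pin"]))           # 150 W - single 6-pin cards
print(board_power_budget(["8-pin"]))           # 225 W
print(board_power_budget(["8-pin", "8-pin"]))  # 375 W - big Vega territory
```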
