Re: AMD Radeon RX 590 turns up in 3DMark database
Quote:
Originally Posted by
edmundhonda
Quote:
Originally Posted by
Zak33
if it's going to be akin to a 1060, it needs to be power efficient too...
The RX580 is a juicy old thing and a die shrink isn't going to perform miracles.
Little surprised to see AMD still working on Polaris in the midrange a year after Vega launched.
Vega on 14nm turned out to be too much of a power hog so AMD never pushed it into the mid-range as expected. (There wasn't much point if it couldn't get better power and speed numbers than Polaris.) So now they're doing a stop-gap to have something available to sell to that market until Navi is ready.
So far as I can tell Vega on 7nm is never going to be a standalone mainstream product, though AMD is doing it as a machine learning card, probably mostly to get some experience with working with the 7nm process. It might happen as part of an APU if next year's replacement for the 2200G and 2400G goes that way rather than using Navi graphics.
Re: AMD Radeon RX 590 turns up in 3DMark database
Quote:
Originally Posted by
Shirley Dulcey
Vega on 14nm turned out to be too much of a power hog
The actual architecture isn't really a power hog; downclocked and undervolted it's actually pretty chill. Nvidia just pulled the Pascal rabbit out of its hat while AMD was developing it, and Vega didn't perform as well as they hoped, so they had to clock the bejesus out of it to keep up with the GTX 1080. That's why there's hardly any overclocking headroom.
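For what it's worth, the reason a downclock-plus-undervolt helps so much follows from the textbook CMOS approximation that dynamic power scales roughly with frequency times voltage squared. A quick sketch (the 10% figures are made up for illustration, not real Vega tuning numbers):

```python
# Rough sketch of why undervolting helps so much: dynamic power
# scales roughly as P ~ f * V^2 (classic CMOS approximation).
# The scaling factors below are illustrative, not real Vega figures.

def relative_power(freq_scale: float, volt_scale: float) -> float:
    """Dynamic power relative to stock, given frequency and voltage scales."""
    return freq_scale * volt_scale ** 2

# Stock card vs. a 10% downclock combined with a 10% undervolt:
stock = relative_power(1.0, 1.0)
tuned = relative_power(0.9, 0.9)

print(f"tuned power vs stock: {tuned / stock:.0%}")  # ~73% of stock power
```

So giving up ~10% of clocks can shed roughly a quarter of the dynamic power, which is why Vega looks so much better off the factory voltage/frequency point.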
Quote:
Originally Posted by
Shirley Dulcey
so AMD never pushed it into the mid-range as expected.
Who expected that? HBM2 was and is far too expensive to ever make it into a mid-range card.
Re: AMD Radeon RX 590 turns up in 3DMark database
Quote:
Originally Posted by
CAT-THE-FIFTH
For me, I hope AMD stop going after the high-end consumer market now. If they are to make big GPUs, make them specialised for the commercial markets first, since they can be integrated as part of the servers as a package and AMD can work with the vendors. When it comes to gaming I hope they stick to the midrange, and engineer something that can be sold profitably at that level and can help them keep the consoles.
In some ways we are seeing this already - the first 7nm GPU is a Vega one made for AI. Navi will apparently be more of a value-oriented GPU.
Now, AMD is doing the keynote at CES 2019 though, so it's quite possible they have something up their sleeve.
Nvidia has done a LOT of groundwork though in terms of commercial applications for their GPU designs: they've gone to developers and actually asked them what they want, and that's gone on for years to even get to the stage they are at now. Unless AMD is prepared to sit down with developers (not talking gaming here), ask what they want, and build from the ground up for that over numerous iterations, they'll be chasing Nvidia for a long time to come.
I honestly don't see them getting to that position in the next 5 years at least; Nvidia has been seriously laying groundwork since what, prior to Fermi? Turing is just the latest iteration of this work: as you mention, they've found a way to have one die cover more than one aspect. Isn't that just going "jack of all trades" though? Like AMD?
Re: AMD Radeon RX 590 turns up in 3DMark database
Quote:
Originally Posted by
Iota
Nvidia has done a LOT of groundwork though in terms of commercial applications for their GPU designs: they've gone to developers and actually asked them what they want, and that's gone on for years to even get to the stage they are at now. Unless AMD is prepared to sit down with developers (not talking gaming here), ask what they want, and build from the ground up for that over numerous iterations, they'll be chasing Nvidia for a long time to come.
**Just a warning: my comments aren't in any particular order, they're more some of my thoughts on things.**
Yes and no - for example, have you ever wondered why AMD was so strong at OpenCL for years and Nvidia wasn't?? Apple. A whole lot of GCN-based parts almost seem like they were developed for Apple. Big Vega was first revealed as an AI-focused product, not a gaming one either. I think the AMD move away from gaming per se has been happening for a while now. It fits with what Raja Koduri said, and the lack of R&D money due to Zen development made things worse.
But funding Zen actually has made much more sense - even if AMD had something competitive with Turing right now, it wouldn't save the company. Why? People would just use AMD to make their Nvidia cards cheaper.
Now look at APIs - who pioneered Mantle, which formed the basis of Vulkan? AMD. Guess what they worked on?? Consoles, and consoles use low-level APIs. We haven't seen low-level APIs take off because Nvidia had no vested interest in them, due to the way they changed things with Kepler. The fact is AMD pushed to new nodes before Nvidia did - the HD4000, HD5000 and HD7000 series were examples. All had very competitive performance, but the fact is people bought Nvidia for whatever reasons. The same happened with ATI.
This all cost money, and ultimately AMD has reorganised itself around where it makes money - things like consoles and embedded computing. Did you know that many of the cockpit displays of modern Airbus and Boeing commercial and tactical airlifters and airliners are powered by AMD GPUs?? But again, it's in concert with another company.
This is the whole semi-custom strategy. So instead of just making solutions on their own, they work with third parties, who might help co-fund part of the R&D and help with the software stack. Baidu, MS, etc. are all examples of this:
https://www.pcgamesn.com/microsoft-p...eaming-service
See all the noise about AMD "working" with MS and Baidu on Zen - these companies are large enough to help push forward the software side of things, which indirectly helps AMD.
Many people look at Lisa Su, but Rory Read had an impact too - he made AMD more aligned towards providing solutions to end customers, which has no doubt also helped spread some of the R&D costs, and the software side of things. It's how they managed to stay afloat. So that is where they are headed.
It might not be so great for us though.
For a company of its size and its R&D spend, the fact it's still afloat fighting two incumbents is nothing short of amazing. AMD is only slightly larger than Nvidia - even at its height, AMD had a seventh as many employees as Intel.
Quote:
Originally Posted by
Iota
I honestly don't see them getting to that position in the next 5 years at least; Nvidia has been seriously laying groundwork since what, prior to Fermi? Turing is just the latest iteration of this work: as you mention, they've found a way to have one die cover more than one aspect. Isn't that just going "jack of all trades" though? Like AMD?
The AMD designs did win them the consoles though, as GCN is a very compute-focused, general-purpose design (like Fermi was), and unlike on PC, console devs are fully exploiting this.
That is also the problem - Fermi was poorly received in many ways. Kepler stripped out a lot of stuff like hardware scheduling, etc. to cut down on power usage and manufacturing costs, and by extension that led to the new-generation APIs being held back on PC. Maxwell pushed this further - don't you think it is utterly weird that the R9 290/390 are still competitive cards, especially with newer APIs??
AMD made "jack of all trades" designs due to cost. Now Nvidia is obviously hitting the same issue, so now they are trying to re-unify everything. It's not surprising considering how much it costs to tape out new chips TBH.
Fermi was in some ways similar - lots of functionality which didn't do much for gamers. Even the vaunted tessellation throughput was more a kludge to try and use some of that functionality.
ATI had very focused designs suited to gaming workloads. So the whole Maxwell/Pascal thing against the GCN cards was not surprising, as it was a reversal. Nvidia had another line which gamers never used.
Now, look what has happened with Turing - huge dies, high cost and people not happy about it. Look at this logically - imagine if Nvidia had made another FP32-focused line of cards on 12nm with similar chip sizes?? The performance bump would be so massive, I doubt many would be complaining as much about pricing, since every tier would see much bigger than normal gains. AMD would have no chance even on 7nm.
But instead we have lots of the GPU not being used yet, and the software stack is hardly there with regards to games, certainly at launch. It makes no sense if Nvidia had developed these for gaming first, as the devs would already have had the cards in hand, and there would have at least been one playable game demo at launch. For instance, if Nvidia were that worried about RT performance now, why sell cards with hardly any RT support at launch for consumers?? Why not wait until 7nm?? Even devs seem to be only getting the hardware now, and have hardly had a chance to use it. It's all last minute, which is unlike Nvidia TBH.
It obviously makes more sense that they want to sell this for VFX and AI first, but they will make sure gamers subsidise mass production of these chips, and people will pay the prices. All the RT and DLSS stuff is a kludge to find some way to use large parts of the chips for games to "sell value".
The difference is that with Fermi, Nvidia was willing to eat margins (even AMD did the same), as they knew computer buyers were more critical 10 years ago; now, thanks to "modern consumers" who justified all the stupid price increases over the last few years, they are willing to see if they can keep their historically record-high margins. It will work. Nvidia's margins are higher than Intel's, and Intel's margins have been record high for the last 6 years AFAIK. They have also realised that AMD has pretty much decided they might as well not bother now, so why not??
Have you noticed that every time Nvidia has tried to move back to more general-purpose GPUs, it has led to massive chips, pricing issues, etc.?? Every time they did that, it was not due to gamers' needs, but other areas.
Intel did the same thing - they milked prices for years, yet spent billions on subsidising Atom for normal people. Nvidia did the same with Tegra - all those expensive prices were funding their forays elsewhere (Tegra lost $100s of millions). Desktop users, gamers, etc. were a cash cow.
I also expect that as time progresses, they will try their best to move away from using FP32 for games, and try to kludge the tensor and RT cores into somehow doing most of this work. This will have the effect of making your older FP32-focused cards age even quicker. Soon it won't be optional, it will be what is required. For commercial RT and AI stuff, OFC increasing the amount of cores dedicated to that will help a lot as it's easy performance gains, so I see future Nvidia "gaming" GPUs being increasingly driven by factors outside gaming.
Re: AMD Radeon RX 590 turns up in 3DMark database
Quote:
Originally Posted by The Article"
... According to various sources, the RX 670 was due sometime this weekend (but could still be launched today) ...
Given that Hexus know (potentially weeks) in advance (through NDAs and advance testing) when new hardware is due to launch, I do have to wonder how often the writers have to pause to pick themselves up off the floor when writing things like this...
Re: AMD Radeon RX 590 turns up in 3DMark database
Quote:
Originally Posted by
aidanjt
Who expected that? HBM2 was and is far too expensive to ever make it into a mid-range card.
Indeed. IIRC every gaming Vega was sold at a loss by AMD. Entirely due to the HBM.
Re: AMD Radeon RX 590 turns up in 3DMark database
Quote:
Originally Posted by
badass
Indeed. IIRC every gaming Vega was sold at a loss by AMD. Entirely due to the HBM.
What's annoying is that my 8GB of HBM is getting very close to full in COD WW2. Makes me wonder if they'd have been better off using GDDR5 and putting more on. There's also the possibility that the game uses as much as it can regardless and it just looks like it's getting close to the limit when it isn't.
Re: AMD Radeon RX 590 turns up in 3DMark database
AMD have produced the FASTEST dang steam train available on the tracks. (Steam = games, geddit?)
Has anyone heard any rumours about a Vega 32, a cut-down Vega part? Shame about AMD and power, as they would be my go-to, but a single six-pin limits me to the 1060...
Re: AMD Radeon RX 590 turns up in 3DMark database
There could be a bigger market for this RX 590 card, if Nvidia keep releasing drivers that actually slow down their 1060 cards :O_o1:
https://www.youtube.com/watch?v=mFSLro_OXLQ
Re: AMD Radeon RX 590 turns up in 3DMark database
Quote:
Originally Posted by
philehidiot
There's also the possibility that the game uses as much as it can regardless and it just looks like it's getting close to the limit when it isn't.
I believe that is what is actually happening. It sounds like only a small amount of your 8GB is wasted, so they put the right amount on.
Quote:
Originally Posted by
persimmon
Shame about AMD and power, as they would be my go-to, but a single six-pin limits me to the 1060...
There are RX 570 cards out there with a single 6 pin, if that is good enough.
https://www.scan.co.uk/products/4gb-...r5-dp-hdmi-dvi
Or can you use a SATA power to 6- or 8-pin adapter? That's what my son has to use in his Dell workstation, which only has a single 6-pin power connector yet plenty of grunt in the PSU.
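For anyone weighing up that kind of adapter, the caution usually comes down to simple connector-rating arithmetic: a SATA power connector's 12V rail is three pins rated at 1.5 A each, while a 6-pin PCIe connector is specified for up to 75 W. A quick back-of-envelope sketch (ratings are the commonly cited spec figures; how much a given card actually pulls through the connector will vary):

```python
# Back-of-envelope power budget for a SATA -> 6-pin PCIe adapter.
# A SATA power connector's 12V rail is 3 pins at 1.5 A each;
# a 6-pin PCIe connector is specified for up to 75 W.
SATA_12V_PINS = 3
AMPS_PER_PIN = 1.5
VOLTS = 12.0

sata_12v_watts = SATA_12V_PINS * AMPS_PER_PIN * VOLTS   # 54.0 W
pcie_6pin_watts = 75.0                                  # PCIe spec limit

shortfall = pcie_6pin_watts - sata_12v_watts
print(f"SATA 12V budget: {sata_12v_watts:.0f} W, "
      f"6-pin spec: {pcie_6pin_watts:.0f} W, "
      f"shortfall if the card pulls the full 75 W: {shortfall:.0f} W")
```

So a single SATA tail can be ~21 W short of what a 6-pin is allowed to draw, which is fine for a low-power card but risky for anything that really leans on the connector.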