Principal member of AMD's technical staff has since 'corrected' his LinkedIn profile.
I think it's safe to say AMD want to move away from HBM as quickly as possible.
It's been a complete disaster for their GPUs.
Good technology, but it just wasn't ready for the mainstream in the quantity needed at the price required for the home GPU market, and ultimately it's not much better than the latest GDDR offerings.
Last time I briefly looked at GPUs, I saw a lot of posters expecting big things from HBM2, some even saying that they were holding back for it. No longer the case then?
The problem with HBM is not supply or fabrication tech but demand; the demand just isn't strong enough.
Vega has the same die size and memory bandwidth as a 1080 Ti, but not the same performance. The bandwidth is there, though, so on the surface the HBM2 seems to be doing its bit, but it isn't a golden ticket.
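To put rough numbers on that bandwidth parity, here's a quick back-of-the-envelope sketch in Python, using the commonly quoted reference specs (exact clocks vary slightly between listings); purely illustrative:

def peak_bandwidth_gbs(bus_width_bits, pin_rate_gbps):
    # Peak memory bandwidth in GB/s = bus width (bits) / 8 * per-pin data rate (Gbps)
    return bus_width_bits / 8 * pin_rate_gbps

cards = {
    "RX Vega 64 (HBM2, 2 stacks)": (2048, 1.89),  # 945 MHz DDR -> 1.89 Gbps per pin
    "GTX 1080 Ti (GDDR5X)":        (352, 11.0),   # 11 Gbps per pin
}

for name, (width, rate) in cards.items():
    print(f"{name}: {peak_bandwidth_gbs(width, rate):.0f} GB/s")
# Both work out to roughly 484 GB/s, despite very different bus widths and clocks.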
I quite fancy a Vega 56, but can't find one at a sensible price. Sounds like price inflation through lack of supply to me. Am hoping custom cooler boards turn up soon.
Old puter - still good enuff till I save some pennies!
Nvidia sabotaged HBM by not implementing the chips on the high-end GTX 10-series. Samsung developed lower-bandwidth, cheaper versions of HBM2, but to date AMD, Intel and Nvidia have shown no interest.
I still say it's down to pricing, due to a combination of demand and the number of different production lines currently required.
When the industry is only producing 1 or 2 types of RAM, prices are low. At the moment we have a ton of different chips, fragmenting a market that is being bogged down by mobile devices.
Main PC: Asus Rampage IV Extreme / 3960X@4.5GHz / Antec H1200 Pro / 32GB DDR3-1866 Quad Channel / Sapphire Fury X / Areca 1680 / 850W EVGA SuperNOVA Gold 2 / Corsair 600T / 2x Dell 3007 / 4 x 250GB SSD + 2 x 80GB SSD / 4 x 1TB HDD (RAID 10) / Windows 10 Pro, Yosemite & Ubuntu
HTPC: AsRock Z77 Pro 4 / 3770K@4.2GHz / 24GB / GTX 1080 / SST-LC20 / Antec TP-550 / Hisense 65k5510 4K TV / HTC Vive / 2 x 240GB SSD + 12TB HDD Space / Race Seat / Logitech G29 / Win 10 Pro
HTPC2: Asus AM1I-A / 5150 / 4GB / Corsair Force 3 240GB / Silverstone SST-ML05B + ST30SF / Samsung UE60H6200 TV / Windows 10 Pro
Spare/Loaner: Gigabyte EX58-UD5 / i950 / 12GB / HD7870 / Corsair 300R / Silverpower 700W modular
NAS 1: HP N40L / 12GB ECC RAM / 2 x 3TB Arrays || NAS 2: Dell PowerEdge T110 II / 24GB ECC RAM / 2 x 3TB Hybrid arrays || Network:Buffalo WZR-1166DHP w/DD-WRT + HP ProCurve 1800-24G
Laptop: Dell Precision 5510 Printer: HP CP1515n || Phone: Huawei P30 || Other: Samsung Galaxy Tab 4 Pro 10.1 CM14 / Playstation 4 + G29 + 2TB Hybrid drive
That is hilarious to say, seeing as it was AMD that brought GDDR3, GDDR4, GDDR5, HBM and HBM2 to market (to the masses).
Why would they move away from HBM2 (the most recent version), which allowed them to keep power and temperatures in check (much more so than GDDR5 or 5X would have allowed)?
The only reason Ngreedia went the GDDR5 and 5X route was that they were ABLE to do so while keeping power and thermals in check (much less complex chip design).
Either way, to "claim" they want to get away from HBM as fast as possible because it has been a complete disaster for their GPUs is asinine, to say the least. They had to "clock it down" to keep power and thermals in check; it's not that they could not run it as fast as the spec allowed.
It was Raja that made the call to build Vega the way it was built. I'm pretty sure AMD has to work its arse off to keep performance where they want it, fighting every step of the way against devs who automatically give their direct support to the "present looking" designs that Nvidia throws on the shelf and that fanboys buy hand over fist.
Anyway, HBM is a very smart design choice, as it delivers much higher bandwidth at much lower power than GDDR is capable of. There are kinks to work out, that much has been proven, of course. Along the same line, you might as well say that Ngreedia should be doing everything they possibly can to distance themselves from shoddy component selection for their primary and secondary capacitors (85°C and 105°C parts instead of the 110°C and 125°C parts they should be using). But hey, what do I know? I just LOOOOVVVEEE pissing my hard-earned money away on a massively overvalued company (Ngreedia... Nvidia).