Mike Rayfield is SVP and general manager, David Wang becomes SVP of graphics engineering.
Saw this elsewhere yesterday and wondered why it wasn't on Hexus yet...
Let's hope their first actions aren't to start making mining cards...
Main PC: Asus Rampage IV Extreme / 3960X@4.5GHz / Antec H1200 Pro / 32GB DDR3-1866 Quad Channel / Sapphire Fury X / Areca 1680 / 850W EVGA SuperNOVA Gold 2 / Corsair 600T / 2x Dell 3007 / 4 x 250GB SSD + 2 x 80GB SSD / 4 x 1TB HDD (RAID 10) / Windows 10 Pro, Yosemite & Ubuntu
HTPC: AsRock Z77 Pro 4 / 3770K@4.2GHz / 24GB / GTX 1080 / SST-LC20 / Antec TP-550 / Hisense 65k5510 4K TV / HTC Vive / 2 x 240GB SSD + 12TB HDD Space / Race Seat / Logitech G29 / Win 10 Pro
HTPC2: Asus AM1I-A / 5150 / 4GB / Corsair Force 3 240GB / Silverstone SST-ML05B + ST30SF / Samsung UE60H6200 TV / Windows 10 Pro
Spare/Loaner: Gigabyte EX58-UD5 / i950 / 12GB / HD7870 / Corsair 300R / Silverpower 700W modular
NAS 1: HP N40L / 12GB ECC RAM / 2 x 3TB Arrays || NAS 2: Dell PowerEdge T110 II / 24GB ECC RAM / 2 x 3TB Hybrid arrays || Network:Buffalo WZR-1166DHP w/DD-WRT + HP ProCurve 1800-24G
Laptop: Dell Precision 5510 Printer: HP CP1515n || Phone: Huawei P30 || Other: Samsung Galaxy Tab 4 Pro 10.1 CM14 / Playstation 4 + G29 + 2TB Hybrid drive
2 > 1?
1 > Raja?
There's a reasonable argument to be made that heading up RTG needed more than one person, given the circumstances. Splitting the general management from the technical and engineering management should allow a better focus in both areas. My only real concern is that both are responsible for 'strategy', and you really need the business strategy and technical strategy to be fully aligned if you're going to make successful products.
Of course, the next couple of years is just going to be a case of delivering the existing roadmap as smoothly as possible - it'll be a while before we see the direction the new management will take RTG in...
In the AT article about this, it seems semi-custom is now part of RTG, and RTG will also get more funding:
https://www.anandtech.com/show/12363...new-leadership
I don't see why. The thing that makes a card good for gaming is the same thing that makes it good for mining: being able to process lots of parallel vector arithmetic. That's why pretty much all GPUs have been hit by mining demand.
I'd much rather see AMD just make the best GPUs they can than split their already stretched engineering and financial resources just to cater to the mining community. It's [b]expensive[/b] to produce dedicated silicon...
Half dev, Half doge. Some say DevDoge
Feel free to message me if you find any bugs or have any suggestions.
If you need me urgently, PM me
If something is/was broke it was probably me. ¯\_(ツ)_/¯
Why would AMD consider "gimping" (for lack of a better word) their GPU technology so that a specific use case that generates them a lot of revenue can no longer be served?
Frankly, they did the best thing by using lower-binned silicon, removing the output components, and classing the results as "mining cards".
The news story about PowerVR's new GPU design did trigger one thought: since perf/watt is potentially the key factor, what if you could make a GPU that produces 1/4 of the hash rate of an RX 550 but at 1/10th the power? No real interest to gamers, but you could slap four of them and a PCIe switch on an x1 card to sell to miners.
That's only true to the extent that the mining cards are literally just rehashed full GPU designs. If you could provide the same hashing performance at either a fraction of the price or a fraction of the power draw (and ideally, both), the resale value would become less important.

For many of the popular mining cards, running them 24/7 for 2 years costs as much in electricity as buying the card up front (not to mention the cost of running the underlying server). It's almost unheard of to get much more than half your initial investment back on a second hand card. If an alternative card comes along, at the same price, but costs less than half as much in electricity over its lifespan, it's going to be attractive. If a card comes along that uses the same power but costs half as much up-front, it's going to be attractive.

That's why people eventually moved to ASICs for Bitcoin - they offered a better performance:cost ratio than slogging away with GPUs. All it needs is for someone to hit that market spot. Cheaper upfront, or much lower power. Either will do.
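The electricity-vs-upfront-cost argument above can be sketched with some back-of-the-envelope arithmetic. All the figures here (card price, power draw, electricity rate) are made-up illustrative assumptions, not quotes from the thread:

```python
# Rough sketch of the mining total-cost-of-ownership argument.
# All numbers below are hypothetical, chosen only to illustrate
# the shape of the trade-off.

def lifetime_electricity_cost(watts, hours, price_per_kwh):
    """Cost of running a card continuously at the given draw."""
    return watts / 1000 * hours * price_per_kwh

HOURS_2_YEARS = 24 * 365 * 2  # 17,520 hours of 24/7 mining

card_price = 400.0   # hypothetical upfront cost
card_watts = 150.0   # hypothetical board power while mining
rate = 0.15          # hypothetical price per kWh

running_cost = lifetime_electricity_cost(card_watts, HOURS_2_YEARS, rate)
# 150 W for two years at 0.15/kWh is roughly 394 - about the card price,
# matching the "electricity costs as much as the card" point above.

# A mining-specific design with half the power draw halves that,
# so total cost of ownership falls even if resale value is lower.
half_power_cost = lifetime_electricity_cost(card_watts / 2, HOURS_2_YEARS, rate)

print(f"full-power running cost: {running_cost:.2f}")
print(f"half-power running cost: {half_power_cost:.2f}")
```

The same function makes the other branch of the argument easy to check too: a card at the same power but half the upfront price simply shifts the saving from the running-cost column to the purchase column.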