Next DX11 iteration to be released before year's end, says AMD chief.
16 million DX11 GPUs sold and hardly any profit made.
This tells us that the real money is in the professional card market - something which nVidia still dominates.
God help us if AMD goes under, because nVidia would just stop bothering with gamer cards at all. Actually they already did that with all their rebranding of G80, which is why they are so far behind ATI now in this area.
Wow, have you got the wrong end of the stick.
There's almost zero money in the professional card market. True, there's a larger profit margin per unit, but there are tens of thousands of gaming and consumer cards sold for every workstation card, and that's direct from our nVidia rep here at the university. There is only a minute margin to be had in the professional sector, and it looks good for their green credentials, hence nVidia are in that segment. The ECC memory and L2 cache development in their architecture was hard graft, but it really does help sell parallel processing to the workstation market. That, and the fact that CUDA is a lot easier to learn than Stream, which I can testify to myself.
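To give a flavour of why I say CUDA is easier to pick up, here's a toy vector-add program (my own minimal sketch for illustration, not anything from nVidia's course material): one kernel, each thread handling one element, plus the usual allocate/copy/launch/copy-back steps around it.

#include <cstdio>
#include <cuda_runtime.h>

// Each thread computes one element of c = a + b.
__global__ void vecAdd(const float *a, const float *b, float *c, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        c[i] = a[i] + b[i];
}

int main()
{
    const int n = 1024;
    const size_t bytes = n * sizeof(float);

    float ha[n], hb[n], hc[n];
    for (int i = 0; i < n; ++i) { ha[i] = 1.0f; hb[i] = 2.0f; }

    // Allocate device buffers and copy the inputs over.
    float *da, *db, *dc;
    cudaMalloc((void **)&da, bytes);
    cudaMalloc((void **)&db, bytes);
    cudaMalloc((void **)&dc, bytes);
    cudaMemcpy(da, ha, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(db, hb, bytes, cudaMemcpyHostToDevice);

    // Launch: 256 threads per block, enough blocks to cover all n elements.
    vecAdd<<<(n + 255) / 256, 256>>>(da, db, dc, n);

    // cudaMemcpy blocks until the kernel has finished.
    cudaMemcpy(hc, dc, bytes, cudaMemcpyDeviceToHost);
    printf("hc[0] = %f\n", hc[0]); // expect 3.0

    cudaFree(da); cudaFree(db); cudaFree(dc);
    return 0;
}

Compile that with nvcc and you have a working GPU program; getting the same thing going in Stream/CAL took considerably more setup code, at least in my experience.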
Workstation cards were born out of the consumer graphics industry, and that's what pushes the margins forward so they can develop both sides. "No GPU company would ever think about solely putting all their resources into the professional and workstation market and ignore the gamers" - yet another quote from our NVIDIA rep. Why do we have an NVIDIA rep? Parallel programming; NVIDIA are trying to advertise their hardware to the big universities, in departments that hadn't considered using GPUs before.
You have to be joking, surely.
ATI have blown nVidia away in the gamer market for 10 months and barely made a profit. nVidia still posted small profits while losing tons of market share.
The real GPU cash is in the professional market where margins are huge, and that's a fact. Where else is nVidia making $1bn revenues? Tegra?
I mean really, how did nVidia make $1bn last quarter with no DX11 cards while ATI made $400m on a full DX11 lineup? It surely wasn't because of Fermi.
Core 2 Quad Q9450 3.2GHz | Asus P5Q PRO Turbo | 4GB DDR2-800 | (2) 1TB Samsung F1 | Radeon HD 5850 | Windows 7 x64
Core 2 Duo E6400 | Abit IP35-E | 4GB DDR2-800 | 320GB Seagate 7200.10 | Radeon HD 4830 | Linux Mint 9
Well,
AMD seemingly invests way too much in R&D, hence why their profits are lower.
So essentially AMD are building for way in the future and nVidia just for the near future.
But they spend far less than Intel, who are more profitable. Nope, R&D costs aren't why AMD don't make money.
GlobalFoundries is still losing a lot of money, and AMD have a 28% stake, so that's a drain. In addition they don't have a mature 32nm process for CPUs, so their CPUs cost more to make than Intel's while generally being slower, meaning they have to cut the average selling price to remain marketable, which cuts into their margins heavily.
Basically, desktop CPU margins suck, but AMD knows it needs to keep them going to be able to fund R&D for server chips, where margins are higher. AMD traditionally punches above its weight in that area, but they have lost a LOT of share to Intel recently, with Intel's Bloomfield and Lynnfield chips proving very popular. Magny Cours is helping AMD claw some of that back, and they are really hoping for success with their Opteron 4100 chips - *if* cloud really takes off, they will be very well placed to take advantage of it.
AMD's graphics division, on the other hand, has been fairly profitable, but supply constraints have effectively nullified their lead in this area. Whether that situation has been engineered, or at least encouraged, by competitors is another matter for speculation. But one thing AMD have been good at in the past is engineering around manufacturing limitations; their 6000 series may well prove that again.