@CampGareth: It's not necessarily as straightforward as that for Intel, though: they have a fairly big IGP now which they're not going to just drop, and there's not much room to double cores and cache while keeping the die size reasonable for a consumer part.
AMD have 8 integer cores, but that's because they're significantly smaller than Intel's. A SNB core is ~20 mm², Haswell about 15 mm², while a BD *module* is about 20 mm². The big Xeon dies are upwards of 500 mm² and go for thousands. Even without adding an IGP, something that large isn't entering the consumer space any time soon. Yield goes down (and hence cost goes up) with roughly the square of die size; i.e. twice as large means far more than twice the cost.
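To see why cost scales worse than linearly, here's a back-of-the-envelope sketch using a simple Poisson defect-yield model (a common first-order approximation; the wafer cost, wafer area, and defect density below are invented for illustration, not real fab data):

```python
import math

def good_die_cost(die_area_mm2, wafer_cost=5000.0, wafer_area_mm2=70000.0,
                  defect_density_per_mm2=0.001):
    """Rough cost per *working* die under a Poisson yield model.

    yield = exp(-D * A): a larger die is exponentially less likely to
    be defect-free. All numbers here are illustrative assumptions.
    """
    dies_per_wafer = wafer_area_mm2 / die_area_mm2  # ignores edge losses
    yield_fraction = math.exp(-defect_density_per_mm2 * die_area_mm2)
    good_dies_per_wafer = dies_per_wafer * yield_fraction
    return wafer_cost / good_dies_per_wafer

small = good_die_cost(160)   # e.g. a mainstream quad-core sized die
large = good_die_cost(320)   # double the area
# Doubling the area halves the die count AND drops the yield,
# so the cost per good die more than doubles.
print(small, large, large / small)
```

With these made-up numbers the 320 mm² die costs roughly 2.3x the 160 mm² one, and the exponential term only gets worse as you head toward 500 mm² Xeon-class dies.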
If what AMD are saying about Mantle holds any weight, Intel will not want to be left behind.
People will be wanting (cheap) lower-clocked, higher-core-count CPUs.
http://www.hardware.fr/medias/photos...G0043521_1.jpg
So 2014 will see nothing new for the AM3+ users?
Your new motherboard will easily take one. The other ones I mentioned to you have VRMs rated to 276W. To put that in context, the FX9370 and FX9590 are pre-overclocked CPUs, and I suspect their TDP might be no different from someone overclocking their FX8320, for example; it could possibly even be lower.
I'm not really sure that's something we can solve, but we can at least solve heat density. A bigger die is bad for costs, sure (it makes you wonder why high-end CPUs have held at roughly the same price while die sizes go down; hopefully yield drops didn't make enough of a difference to balance it out), but it's also great for heat, since you get more surface area to spread it over. It's worth having a look at something like a Q6600 with the IHS removed: to my untrained eye there's at least another 2x, maybe 3x, more die area there than in an Ivy Bridge CPU (which is perhaps closer to an Atom from ye olden days). If it worked in the past, why wouldn't it work now?
AFAIK, the price of fabbing a wafer has increased significantly over the last few nodes. So the smaller dice are only offsetting the increase. Start making the die larger, and not only do you start hitting yield problems but you also end up with a more expensive product. NVidia really suffered for that with both GF100 (GTX480) and GK110 (Titan/780/780 Ti). It took them a long time on both of those chips to get sufficiently good yield to release a fully-enabled part (in fact, I don't think they ever released a full GF100), and the products were very highly priced.
So yeah, increasing die size simply isn't an option, because you'd have to price the chips too high.
Also, die size doesn't affect heat, per se - it affects temperature. A 100W TDP chip will generate up to 100W of heat regardless of die size, and a 100W-capable cooler will be able to hold that chip at a steady temperature. That temperature might be higher than we've come to expect, but generally that's not going to be a problem as long as the parts are engineered to operate at that higher temperature.
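That heat-vs-temperature distinction can be put in numbers with the standard steady-state relation T_junction = T_ambient + P x theta, where theta is the total junction-to-ambient thermal resistance (the theta values below are invented for illustration; real ones come from package and cooler datasheets):

```python
def junction_temp(power_w, theta_c_per_w, ambient_c=25.0):
    """Steady-state junction temperature for a given dissipated power.

    theta_c_per_w is the total junction-to-ambient thermal resistance.
    A smaller die concentrates the same power into less area, which
    raises the effective resistance - the heat output (100 W) doesn't
    change, only the temperature it settles at.
    """
    return ambient_c + power_w * theta_c_per_w

# Same 100 W chip under the same cooler, two hypothetical die sizes:
big_die = junction_temp(100, 0.35)    # about 60 C
small_die = junction_temp(100, 0.55)  # about 80 C
```

Both chips dump the same 100 W into the cooler; the smaller die just runs hotter doing it, which is fine if it's rated for that temperature.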
Die size rises and falls in line with uarch modifications and die shrinks respectively. Kentsfield (Q6600) also wasn't a monolithic quad core; it was essentially two dual-core dies on one package, welded together by the FSB. Overall, die sizes are not going down if you look at the whole picture. The first processor after a die shrink will be relatively small, but it's also made on a cutting-edge (and therefore expensive) process.
Heat density isn't in dire need of solving TBH - the temperature problem with IVB/Haswell is exacerbated by using TIM instead of the solder which had been used for years.
Oh well then :/ *sells a kidney to afford a couple of intel 10 or 12 core CPUs and 512GB of RAM, waits a decade for it to be obsolete*
Would seem AMD are looking at offloading FPU work to the on-die IGP.
... which would seem - to me at least - to be a pretty sensible way to approach it. Remember that modern graphics hardware has floating point processing capability that would have been in the realm of supercomputers 20-30 years ago (maybe even less - it's been a while since I had "hands on" with a Cray, alas). After all, isn't that exactly what CUDA and OpenCL are trying to achieve?
The problem I have is that - on the basis of this article - it would seem that AMD's theory is that everyone wants an APU rather than a "proper" (no insult intended - perhaps "conventional" would have been a better description?) configuration consisting of a separate CPU and separate GPU.
APUs start to become interesting to me if, and only if, they can match the processing power of a discrete CPU of a similar age. Yes, AMD's APUs are undoubtedly powerful, but IMHO they're still the "cost conscious" option.
but
if AMD are now *only* making APUs - that's the way forward for them - AM3 a dead socket, FM3 being next gen and cross platform?
They can still play with the mix on FM2, they already have Athlon and A series parts on there, and it sounds like they almost went for a 6 core part for Kaveri so maybe next time.
If they can get OpenCL properly adopted then the on-die GPU isn't wasted even with an external GPU. Get the on-die shaders to calculate which way your TressFX-animated hair moves and get the external GPU to actually render it - a nice pipeline there.
Alas I have pretty much deduced the same from all the roadmaps.
It does look increasingly as though AMD may well sacrifice the FX line to the gods, since AMD is going out of its way not to mention it at all and hasn't denied anything about it either.
I sometimes find myself thinking, "Maybe the new APUs will punch above their weight and at least provide some competition to Intel's i5 line. Maybe they'll even start to infringe upon the current FX chips." When I think calmly and rationally about this however, my doubts start to build.
I'd like to be optimistic about the situation, but let's face it: this is AMD we're talking about here. AMD is amazing at selling dreams and delivering something rather more moderate.
Don't get me wrong: I'm not an Intel fanboy. Far from it. My current rig is based on an ageing Phenom II X4 940. For the time, it was great. Alas AMD then went from AM2+ to AM3 and then AM3+ right afterwards, leaving me with no upgrade path at all. My motherboard can't support the Phenom II X6 range.
I remember the days of the Athlon 64. Those were good days: AMD punching Intel where it hurt and doing it at a fair price. But then Intel redoubled its efforts and AMD has never quite recovered in the CPU field. The Phenom II line did manage to give a decent push, and my own processor still allows me to play games at 1200p, coupled with my pipeline-unlocked 6950.
The problem is that there are a lot of people like me who need to know their upgrade options. Right now, the only line that seems to have an upgrade path is the FM2+ socket; AM3+ certainly seems dead in the water. Intel, for all its extra costs, at least offers the certainty of one processor upgrade with Socket LGA1150, so those with Haswell can at least look forward to Broadwell.
I wish AMD would just come out and be honest about the situation, rather than sitting back and not saying anything. It pains me that I might have to recommend an Intel-based system to my girlfriend when she builds a new PC in 2014 if AMD can't pull itself together.