der8auer measured cores 6 or 7 degrees cooler but warned not to expect many more MHz.
Intel have had to improve thermals as they've thrown loads of power at the chip to get extra performance. I expect most people to experience poor OC headroom.
I would like to know why CPUs were given heatspreaders. I remember back in the day we didn't have them atop CPUs, and on a certain model of AMD chip you could use a pencil to draw on a connection to unlock the multiplier. It makes sense that, done well, a heatspreader increases the area available for removing heat from the CPU, but done poorly it reduces thermal conductivity.
Is it just because most coolers are cheap and in those cases a heatspreader is better?
Also remember it improves processor durability: as the process gets smaller, the actual chips get smaller and the contact area is reduced. Look at the contact patches for the above processors - putting a cooler on badly could easily damage them, and the heatspreader gives you an extra layer of protection.
Well, the clue is in the name really: a heatspreader spreads heat. Provided there's a half-decent thermal interface between the die and the heatspreader, it makes effective cooling far more straightforward. It reduces the risk of hot-spots on the die, reduces the risk where the heatsink base isn't completely flat, and in most real-world cases will simply give better thermal results than a bare die touching a heatsink base. The heatspreader will typically be copper (very conductive) and effectively takes heat from a very small surface area (high power density) and spreads it over a much larger surface area (lower power density), making the thermal interface between it and the cooler much less sensitive. Even in this video he replaces the stock thermal interface material with another one; the heatspreader stays right where it is to do its job.
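To put a rough number on that power-density point, here's a back-of-the-envelope Python sketch. The die area, heatspreader area and wattage are made-up illustrative values, not measurements of any particular CPU:

```python
# Rough illustration of why spreading heat over a larger area helps.
# All numbers are illustrative assumptions, not measurements of a real CPU.

die_area_cm2 = 1.5      # assumed bare-die contact area
ihs_area_cm2 = 12.0     # assumed heatspreader contact area with the cooler
power_w = 150.0         # assumed package power

print(f"Power density at the die: {power_w / die_area_cm2:.0f} W/cm^2")
print(f"Power density at the IHS: {power_w / ihs_area_cm2:.1f} W/cm^2")
# The cooler sees roughly 8x lower power density at its contact face, so an
# imperfect mount or a slightly uneven heatsink base costs far less.
```

With those assumed figures the cooler-facing interface only has to deal with about 12 W/cm² instead of 100 W/cm², which is why the mount becomes so much more forgiving.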
'Back in the day', CPUs didn't have anywhere close to the power density seen on modern processors so were much easier to cool effectively.
The heatspreader also adds a great deal of mechanical strength to the CPU and amongst other things prevents the risk of cracked dies.
In a handful of edge cases a bare die might provide marginally better results, and people who want to achieve that are free to delid. Silver is only marginally better than copper in terms of thermal conductivity and you'd need more than a plating for it to be worthwhile anyway.
Edit: Some figures to back up what I'm saying about thermal conductivity.
Copper: 401 W/(m·K)
Silver: 419 W/(m·K)
At room temperature. Source: https://neutrium.net/heat_transfer/t...ls-and-alloys/
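To show how little that difference matters in practice, here's a quick sketch of the conductive resistance of a flat plate, R = t / (k·A), using the conductivity figures above; the plate thickness and area are assumed illustrative values:

```python
# Conductive thermal resistance of a flat plate: R = t / (k * A).
# Thickness and area are illustrative assumptions; k values are the ones quoted above.

def plate_resistance(k_w_per_mk, thickness_m, area_m2):
    """Thermal resistance (K/W) of a plate conducting through its thickness."""
    return thickness_m / (k_w_per_mk * area_m2)

thickness = 2e-3   # assumed 2 mm thick heatspreader
area = 12e-4       # assumed 12 cm^2 contact area, in m^2

r_copper = plate_resistance(401, thickness, area)
r_silver = plate_resistance(419, thickness, area)

print(f"Copper: {r_copper * 1000:.2f} mK/W, Silver: {r_silver * 1000:.2f} mK/W")
print(f"Relative improvement from silver: {(r_copper - r_silver) / r_copper:.1%}")
# ~4% lower resistance for solid silver - a fraction of a degree at CPU power
# levels, and a thin silver plating would contribute far less than that.
```

At 150 W the difference between the two works out to well under a tenth of a degree, which is why solid silver (let alone a plating) isn't worth the cost.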
So it's one of those things where, if you're obsessive and you know what you're doing, removing the heatspreader can produce better results. But in most situations it improves thermal performance, as idiots like me just splat on some paste and whack on a cooler, going "LOOK MUM! I BUILT A PEE CEE!"
Makes sense.
Pretty much. For the majority of people I'd say it's beneficial and avoids all sorts of problems, not least because it means PC integrators or builders can just slap on a cheap cooler and not have to worry about it.
There are definitely times where the interface between die and heatspreader has been poor, for instance some of Intel's older CPUs where the die could be upwards of 90°C but the heatsink barely lukewarm, leading to people thinking it necessary to get some sort of expensive AIO to cool a stock quad core CPU. The thermal capacity wasn't the problem; the transfer was just incredibly inefficient because of rubbish TIM instead of solder. It's like trying to fix a slipping clutch with more revs - you might end up brute-forcing through the problem, but it wasn't the engine's fault the wheels were barely turning!
Yeh, the TIM issue was one reason I was wondering "why bother" - delidding seems to be more and more of a thing, and enthusiasts use decent paste and coolers, so why stick another point of failure in there, as well as another manufacturing process to pay for?
But there are real enthusiasts and wannabe enthusiasts. I'm the latter, and a handful of extra MHz on my OC or a few degrees cooler running wouldn't be worth the hassle. I expect the approach they've taken is best, as the people who want to really go at a tough overclock will probably relish the extra challenge of delidding.
Besides the mechanical advantages, using a mediocre thermal paste (or an excessive gap) between die and heatspreader is no better than using the same TIM between the die and heatsink directly. Many high performance parts now use soldered heatspreaders, so there's a direct metallic path between them. Note how the 9900K with a soldered IHS had better temperature readings than the previous products with thermal paste.
Edit: Some measurements of that: https://www.tomshardware.com/uk/revi...u,5847-12.html
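For a feel of why the die-to-IHS joint dominates, here's a hedged sketch treating it as a simple series conduction problem; the bond-line thickness and the paste/solder conductivities are assumed generic values, not anything Intel publishes:

```python
# Thermal resistance of the die -> IHS joint, modelled as R = t / (k * A).
# Conductivities and bond-line thickness are assumed generic values.

die_area_m2 = 1.5e-4   # assumed ~1.5 cm^2 die

def joint_resistance(k_w_per_mk, thickness_m, area_m2=die_area_m2):
    """Thermal resistance (K/W) of the interface layer between die and IHS."""
    return thickness_m / (k_w_per_mk * area_m2)

r_paste  = joint_resistance(k_w_per_mk=5,  thickness_m=100e-6)   # mediocre paste, ~100 um gap
r_solder = joint_resistance(k_w_per_mk=50, thickness_m=100e-6)   # solder, same gap

power_w = 120.0   # assumed die power
print(f"Paste joint:  {power_w * r_paste:.1f} C dropped across the interface")
print(f"Solder joint: {power_w * r_solder:.1f} C dropped across the interface")
# With the same geometry the solder joint drops roughly a tenth of the
# temperature the paste does - the heatsink's capacity was never the problem.
```

Under those assumptions the paste joint alone eats around 16°C while the solder joint eats under 2°C, which lines up with the "hot die, lukewarm heatsink" behaviour described above.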
A bare die also increases the chance of the CPU die being damaged during assembly.
It's a bit of a misnomer IMO: while it technically spreads the heat, it doesn't do so any more than a heatsink in direct contact with the die would. A more apt name would be loadspreader, as that's their primary purpose - as dies got smaller, the chances of chipping or cracking the die increased. Balancing a big heavy thing on a small bit of what's essentially glass, and worse yet applying pressure to it with clips or screws, can cause all sorts of problems, especially if you've got little experience in building computers.
You're also right in what you say later that direct die contact doesn't really buy you much more thermal headroom (maybe 5-10°C), so it doesn't really affect maximum clock speeds a great deal. I guess, because they were seeing slowly increasing RMA rates and customer dissatisfaction as dies got smaller, they wanted to address that issue more than what at the time was pretty much a non-issue.