When you are overclocking, do any of you take into account thermal paste 'burn-in', whereby the thermal paste improves in efficiency over time?
A bit over a year ago I decided to overclock my CPU, bought a new cooler, read several guides, and successfully managed a mildly respectable 3.2GHz on my E6750 (2.66GHz at stock).
Before I started, I stress tested the CPU at stock with the stock cooler, and freaked out a bit when it sailed past 75°C after about 45 mins!
After putting the new cooler on with the shiny new thermal paste, the temps dropped to just under 65°C at stock speeds after a couple of hours of stress testing.
I then overclocked the processor (with a slight drop in voltage to try to keep the temps down), and ended up with temps just under 65°C after an overnight stress test.
Fast forward a year or so, and I'm getting ready to switch out my E6750 for my ebay-tastic Q9400 to tide me over until Sandy Bridge has matured (possibly even until Ivy Bridge is out). I decided to baseline the temps and found, to my amazement, that after several hours of stress testing the temps are staying south of 50°C. In fact, the total increase from idle to load is <10°C.
Ambient temps are roughly the same (the heating is on in the room, so it's about the same temperature as when I first overclocked), so the only thing I can think of is that this 15-20°C drop is down to the thermal paste burning in.
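If anyone wants to track this for themselves over a few months, a rough temperature logger is easy to knock together. The sketch below is just an illustration, not what I actually ran: it assumes Linux with the psutil Python package installed, and the "coretemp" sensor label is only an example (it varies by board).

# Minimal temp logger: samples the hottest core once a minute so you can
# compare an idle baseline against a stress-test run, and rerun it later
# to see how the numbers drift as the paste settles in.
import time
import psutil  # assumption: psutil installed; sensors_temperatures() is Linux-only

def max_core_temp():
    readings = psutil.sensors_temperatures().get("coretemp", [])  # label varies by board
    return max((r.current for r in readings), default=None)

while True:
    temp = max_core_temp()
    print(time.strftime("%H:%M:%S"), f"{temp:.1f}C" if temp is not None else "n/a")
    time.sleep(60)

Run it alongside whatever stress tester you use, then diff the log against one taken at idle.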
I had no idea there could be such a large effect over time. If I had known, I might have revisited the overclock after a little while and shot for a higher clock speed.
So, do any of you take this into account when you stress test a new build?