Document reveals the 16C/32T chip can consistently OC to 4.3GHz on all cores.
While I'm not against overclocking (looks at my ageing 3570K with a decent OC), I'm wondering if an all-core OC is a good idea for gamers. Wouldn't an all-core OC just increase the chip temps and limit boost, so you no longer hit the max single-core speed? I know games are getting better at threading, but I'm sure most games still benefit more from boosting, say, 2 cores higher than from a smaller boost across all 16. If you need raw multithreaded CPU grunt this doesn't apply of course (rendering etc.), but for today's games I think it does...
To be fair, I wouldn't necessarily say gamers are the main target for the 3950X; I'd say it's aimed more at 'home' content creators, 3D work and so on.
Would I overclock all the cores? Probably not, as it will reduce the life expectancy at least a little if you're working it heavily with rendering, encoding and the like.
I would think that, at least in part, it *is* aimed at gamers, since it will have the highest advertised clocks of any of their CPUs, and we all know gamers want every little edge they can muster to pull the most FPS. (Or at least that's the logic behind halo products.)
It does seem like having differentiated cores on higher core count CPUs would make sense, especially with Windows getting better and better support for core affinities. Being able to clock higher on, say, 4 cores while the other 12 are "high enough" seems like it would be the best of both worlds (a rough sketch of pinning a process to specific cores is below).
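For what it's worth, here's a minimal sketch of that idea in C using the Win32 SetProcessAffinityMask call, assuming purely for illustration that logical processors 0-3 sit on the highest-boosting cores; a real tool would query which cores the platform marks as preferred rather than hard-coding the mask.

/* Sketch: pin the current process to the first four logical processors on
 * Windows. Assumption for illustration only: cores 0-3 are the fastest. */
#include <windows.h>
#include <stdio.h>

int main(void)
{
    /* Bits 0-3 set: logical processors 0, 1, 2 and 3 */
    DWORD_PTR mask = 0xF;

    if (!SetProcessAffinityMask(GetCurrentProcess(), mask)) {
        fprintf(stderr, "SetProcessAffinityMask failed: %lu\n", GetLastError());
        return 1;
    }

    printf("Process pinned to logical processors 0-3\n");
    return 0;
}

In practice you'd do this to the game's process (or via Task Manager / start /affinity) rather than your own, but the principle is the same.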
Indeed, for gamers I suspect the higher clocks on fewer cores would be optimal. I wonder if it boosts based on workload? If you're doing some mighty rendering work and loading all the cores, it may decide that clocking every core to a lower level is better than boosting a couple to the max.
I also wonder how much of this is to stay within the TDP envelope? We all know that Intel cores do a lot of their boost behaviour to stay within certain TDP specifications rather than due to real thermal limits.
One thing I would say (which I think you were also suggesting) is that gamers wanting "the edge" is often just wasted money. Anandtech showed that for gaming an i3 paired with a better GPU is a far better spend. I think once you've maxed out the GPU, throwing monies at the CPU often results in spending a fortune for little appreciable gain.
I look at something like this and go "yeh, I could absolutely afford it but am I ever going to come close to taxing it? Isn't something like this wasted on someone like me?" I just don't render videos whilst I'm gaming and playing HDR 4K on a different monitor next to me.
I think for all but the wealthiest gamers, this kind of stuff is best left alone (even if the budget allows it), with the money saved put towards the next upgrade or towards upgrading things which are often overlooked. Like sound. Or an HD floppy drive rather than that DD drive most people are sporting.
AMD is best for gaming, but Intel is more suitable for work.