That's a >40 per cent pin increase for what is rumoured to be the first hybrid desktop design.
With 1,700 pins, the actual socket cost and CPU cost are going to be forced up simply by component count.
If this doesn't bring a huge increase in performance, it will not be received well.
Old puter - still good enuff till I save some pennies!
Ever since the release of the 75 MHz Pentiums with errors, it's been a wave of errors, bugs, and poor performance for Intel. The last 3 years, they totally sunk. It will take some time, if they ever manage to come up with a competing product again. And if they do, do we want to go back?
I don't need an Intel product to slow down my machine again.
They've been on 1150-1200 pins for all the dual channel CPUs since Nehalem in 2008 and even the short-lived triple channel CPUs were only 1366 pins.
500 extra pins is not something Intel would do lightly though, so why now?
Quad channel? Triple channel? More power planes?
This might mean new CPU coolers too.
Seems a bit weird that it's DDR5; it's not even minimally available yet. Normally servers use it for 2 years or so before we see it on desktop.
With the addition of the new AMX instructions, they are planning to do a lot of tiled memory processing for AI. Convolution parameter requirements keep doubling every few months, we are told, so it would be a big win to have separate external memory buses for parameters, program, and data ... like the DSP chips have provided for a long time. That's my guess for the extra pins.
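The tiled-processing idea above can be sketched in plain Python. AMX-style tile units work on small fixed-size sub-blocks of matrices, streaming tiles of parameters and data in from memory, which is why extra memory bandwidth would pay off. This is a purely illustrative software sketch of the tiled access pattern, not Intel's actual AMX instruction set, and the tile size is an arbitrary choice for the example:

```python
# Illustrative sketch of tiled matrix multiplication, the access
# pattern that AMX-style tile engines accelerate in hardware.
# TILE is an arbitrary example value; real AMX tile shapes are
# configured via tile registers, which are not modelled here.

TILE = 2  # tiny tile size so the example is easy to check by hand

def tiled_matmul(a, b, n):
    """Multiply two n x n matrices (lists of lists) tile by tile."""
    c = [[0] * n for _ in range(n)]
    for i0 in range(0, n, TILE):          # tile row of C
        for j0 in range(0, n, TILE):      # tile column of C
            for k0 in range(0, n, TILE):  # stream tiles of A and B in
                # Inner kernel: one TILE x TILE block update -- the
                # part a tile engine would execute as a single op.
                for i in range(i0, min(i0 + TILE, n)):
                    for j in range(j0, min(j0 + TILE, n)):
                        for k in range(k0, min(k0 + TILE, n)):
                            c[i][j] += a[i][k] * b[k][j]
    return c

# Multiplying by the 2x2 identity leaves the matrix unchanged:
print(tiled_matmul([[1, 0], [0, 1]], [[3, 4], [5, 6]], 2))
```

The point of the tiling is that each tile of parameters is loaded once and reused across a whole block of output, so the traffic per pin goes down even as the model size goes up.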
They also need to beef up the PCIe 4 pin count so they have enough lanes for both a PCIe 4 SSD and a PCIe 4 GPU.
btw, since their Sapphire Rapids CPUs and Ponte Vecchio GPUs will use PCIe 5/CXL in late 2021, why would we not expect desktop cores and GPUs with PCIe 5 during the same timeframe? I don't recall seeing a roadmap showing Alder Lake with PCIe 4 or PCIe 5.
Yes and no. Bear in mind the recent debacle over motherboard support. CPU sockets =/= guaranteed compatibility. It depends on how often you plan on swapping CPUs, but an expensive, high end mobo might only support new CPUs for 3 years from birth. So, if you buy a mobo towards the end of its run, you might find in 2 years it ceases to be compatible with new CPUs even though the socket matches.
The physical socket is only a set of pins and there are a lot of technical deeliemabobs to support each CPU. You don't necessarily know what those deeliemabobs are going to be in 3 or 4 years' time, and the last thing a company like AMD needs is to restrict the performance of a new generation of CPUs and possibly lose out to Intel, just for the small number of people who have a 4-year-old mobo and want to upgrade.
There are some advantages to Intel's approach in this regard. For people who are buying expensive, high end CPUs, you're pretty certain to need to replace the mobo when you upgrade (assuming the longer upgrade cycle). The people who benefit from this are those who buy a motherboard and CPU but can't afford / commit to an expensive CPU. So they'll buy cheap to get the build going and then upgrade in a year or two as finances permit. They just need to be careful that the mobo they have chosen is going to have vendor support as well as AMD support. AMD might support the socket, but the mobo manufacturer needs to release BIOS updates for your specific product.
EDIT: Intel's approach certainly removes uncertainty and complexity around support. It would be very easy for someone learning their way around the PC world to look at a motherboard for their first build, see "AM4" and not realise their chosen CPU just won't work. AMD's approach is more consumer friendly, but opens up some extra pitfalls for the uninitiated. Problem there is that if you make something like the custom PC market have a higher and higher barrier to entry, it will eventually become unprofitable. Then we're back to the idea of CPUs soldered onto motherboards along with RAM and it all sold as a bundle.
Last edited by philehidiot; 29-06-2020 at 02:29 PM.
Please Intel, who is running your QA?! And who designs this?!
It is so much less efficient having to replace the motherboard every time you make something new... who at Intel did not get the memo, and why haven't the people responsible been fired yet for costing the company money and, worse, much much worse, hurting the potential customers?
Could be integrating one of those Xe graphics tiles.
big.LITTLE on desktop sounds like the least exciting new USP they could have implemented.
No lessons learnt. It's a shame.