TSMC has presented some research on microchannel liquid cooling in its 3D chips.
I bet we'll see chips converging on the same setup as modern waterblocks - i.e. trenches with a jet plate directing the water through them.
While I love the sound of things like this, and I don't think their big brains would be considering it if it weren't possible, I do have questions. Firstly, isn't there a limit to the diameter that water will pass through (without insane pressure)? And how would the water come and go from the chip, if at all (tiny little barbs? Like a heat-pipe or vapour chamber? Or something else)?
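On the diameter question, the limit mostly comes from the d^4 term in laminar pipe flow: halve the channel diameter and the pressure needed goes up 16x. Here's a quick back-of-envelope sketch using the Hagen-Poiseuille relation; the channel length and per-channel flow rate are my own guesses, not figures from the research:

```python
import math

# Rough laminar pressure drop (Hagen-Poiseuille) for water in one microchannel.
# All numbers below are illustrative assumptions, not from the actual paper.
mu = 1.0e-3      # dynamic viscosity of water at ~20 C, Pa*s
length = 0.010   # channel length across the die, m (assumed 10 mm)
flow = 1.0e-9    # flow rate per channel, m^3/s (assumed 1 uL/s)

for d_um in (200, 100, 50):
    d = d_um * 1e-6                                   # diameter in metres
    dp = 128 * mu * length * flow / (math.pi * d**4)  # Hagen-Poiseuille
    print(f"{d_um:>3} um channel: {dp / 1000:7.2f} kPa per channel")
```

That works out to roughly 0.25, 4 and 65 kPa for 200, 100 and 50 micron channels. So even quite small channels stay within what a small pump can push, because hundreds of channels in parallel each only carry a tiny flow; getting the water in and out of the package cleanly is probably the harder problem.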
Are we then moving into a realm where processors may end up requiring watercooling, giving manufacturers an excuse to solder the CPU to the board and fit their own cooling system?
So how much is this going to cost? How is this practical? Water cooling is on a minority of current systems, it has longer-term reliability problems, and it needs more maintenance.
These companies need to think of cost-effective ways to do things, not pie-in-the-sky stuff that's useless for 99% of the market.
I would expect this to act similarly to heat-pipe-esque cooling, where convective currents (the liquid expanding as it heats, then being cooled and circulated) do the work; something like that is probably what they're looking at.
I highly doubt microchannel water cooling like this would be comparable to the large-scale water cooling of a whole system. I'd strongly suspect it would be part of an integrated solution, using either a specific liquid designed for the system or something akin to purified water (so no chance of gunking it up, etc.).
This sounds more complex than heatpipes, because the channels need to be tiny, and then you would need to hermetically seal the whole chip. There are additional packaging costs involved here.
It still sounds expensive to implement due to the added manufacturing complexity, will probably take up more space, and has more chance of long-term failure. That is on top of the extra costs of the new nodes, which need to be amortised too. Unless this can be made very cheaply, I can't see it being of much use for the majority of systems sold out there; whether smartphones or x86 PCs, those tend to be entry-level/mainstream systems where cost is a factor. OEMs don't want added costs either!
The problem with these companies is that they think consumers have an endless amount of funds/debt with which to fund all these things.
I imagine they have done, and to increase efficiency and capability further they have to look at these kinds of technical innovation. If they can make a consumer CPU with double the efficiency and capability (say, like the Ryzen 5950X with the stacked cache die, but looking further afield at things like 3D-stacked cores, cache and graphics for laptops), then the cost savings from cheaper manufacturing (interconnects on substrate and all that rubbish) can absorb the cost of exotic cooling methods like this.
In this cooling method's case, something similar already exists on that massive wafer-scale engine, IIRC.
That hits the nail on the head: cooling only one side of a very tall stack is going to lead to cooling efficiency drops and an inferior product, so they need to look at cooling in more ways than the traditional one-sided method. As with everything, it'll be expensive to start with, but as methods improve and manufacturing scales up and gets more efficient, the cost felt by the consumer will eventually wash out, like with any technical innovation.
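To put rough numbers on the one-sided problem, here's a toy 1D resistance-stack sketch in Python; the die thickness, bond-layer properties and per-die powers are all my own assumptions, just to show why the buried die runs hotter:

```python
# Toy 1D thermal model of a two-die stack cooled from the top side only.
# All thicknesses, conductivities and powers are assumed for illustration.
area = 1e-4        # die area, m^2 (1 cm^2)
r_cooler = 0.15    # cooler + TIM thermal resistance, K/W (assumed)
r_si = 100e-6 / (130.0 * area)   # 100 um thinned silicon die, k_Si ~ 130 W/(m*K)
r_bond = 25e-6 / (1.0 * area)    # 25 um microbump/underfill layer, k ~ 1 W/(m*K)

p_top, p_bottom = 50.0, 50.0     # W per die (assumed)

dt_top = (p_top + p_bottom) * r_cooler            # both dies' heat crosses the cooler
dt_bottom = dt_top + p_bottom * (r_bond + r_si)   # buried die conducts up through the stack
print(f"top die:    {dt_top:.1f} K above coolant")
print(f"bottom die: {dt_bottom:.1f} K above coolant")
```

With these guesses the buried die sits about 13 K hotter than the top one, and it's the bump/underfill bond layer, not the silicon, that dominates the penalty. Add more layers and it compounds, which is exactly why running coolant between the dies is attractive.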
Even AMD is implementing its additional cache layer on CPUs which cost over £500. It's great (like a Bugatti Veyron is great), but it's only a very niche market for the average consumer. Things such as HBM2 also only appeared in some niche Intel CPUs, and Intel Lakefield hardly appeared in many systems. The issue is the complexity and added cost. Hence, it's probably only expensive items which will see this.
Also, WRT cost reductions: they have been used to increase margins, so you are seeing prices rise faster than inflation. We forget we are on an enthusiast forum, where we tend to throw more money at tech than most, but all I am seeing is lots of non-techies keeping their stuff longer and longer. Even smartphones were doing the same, with mainstream/entry-level models getting relatively worse and worse hardware, until Chinese companies with their lower margins re-applied pressure.
It's now happening with PCs, where companies are just jacking up prices more and more, with big multi-year improvements in margins and revenue. I don't know how this is going to be sustainable, especially with real inflation added on top (due to the money printing) and the increased economic uncertainties in many sectors. ATM this is being propped up by record levels of consumer debt.
The thing is, while it's all great on a technical level, the reality (just like with HBM, Optane, etc.) is that the most cost-effective technologies win out; VHS won over Betamax, and so on. This is why I am saying it has to be cost-effective, because there's no point if it adds more cost for the consumer. If it does, then it's going to stay niche until the costs are more palatable, and the vast majority won't see any real benefits for a very long time.
This is why I can't see "3D" stacked chips (all chips are 3D, technically) being that mainstream for a while; they have existed for a few years already.
This strikes me as more of an HPC/server thing; it'll be a while before it filters down to desktops.
Channels are easy: just use the same tool you use to cut the dies apart to scribe them. It's already available in fabs, it's very precise, and it doesn't need expensive EUV machines for the perf/mm2 increase this gives.
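For a sense of scale, some quick feasibility sums; the kerf, pitch and depth here are guessed at typical dicing-saw numbers, not anything from an actual process:

```python
# Rough count of saw-scribed cooling trenches across a die backside.
# Kerf, pitch and depth are assumed typical values, not process specs.
die_width_mm = 20.0   # assumed die edge length
kerf_um = 40.0        # assumed blade kerf (trench width)
pitch_um = 150.0      # assumed trench-to-trench pitch
depth_um = 100.0      # assumed trench depth

n = int(die_width_mm * 1000 / pitch_um)
flow_area_mm2 = n * (kerf_um / 1000.0) * (depth_um / 1000.0)
print(f"{n} trenches, {flow_area_mm2:.2f} mm^2 total flow cross-section")
```

That's ~130 parallel trenches on one die edge, and it's that parallelism that keeps the per-channel flow, and therefore the pump pressure, sane.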
Wouldn't that also make the dies themselves larger?
You still need to seal it hermetically, have a reservoir, have a way to monitor the liquid level, etc. Then, will it integrate into existing fan/heatsink combinations, or need a specialist system? That sounds like added complexity, which means more cost.
I do agree it sounds more like an enterprise thing, where custom solutions and costs are less of an issue.
Costs have fluctuated dramatically in the past couple of years. We saw a dramatic downturn in cost per core after AMD brought back competition; then Intel stopped being wholly competitive, and AMD's prices stayed high because demand was high and supply was consumed almost as soon as it was on a truck to distributors. Intel's prices have come down dramatically, and when Intel's competitiveness comes back we'll likely see a downturn again. But I would like to point out that, unlike the Sandy Bridge to somewhat-post-Skylake era, where the actual improvement per core was limited, AMD (with the exception of Zen+) has kept up a strong improvement per core per cost. Zen 3 was a somewhat poor offering on cost per core versus previous generations, but then again, what isn't being jacked up in the crypto/pandemic era?
From what I can read, household debt in the UK has only increased 11% (compared with something like 90% in the US). That's no small number to scoff at, but our glorious overlords have promised utter glory and money falling from the sky, so it should hopefully start to turn downward. That said, I don't think the two are unrelated: corporate pricing of consumer goods has gone up, I just can't see any evidence that consumer debt is the key cause of it (more likely a contributing factor).
Isn't that the same with any new revolutionary technology? If we were to reject these technologies because they're expensive initially, then we would make very little progress in anything; there would be no reason to spend much on R&D, just keep iterating over the same ol' same ol' (see Intel 2011-2018).
Your usage of the term "3D" is quite liberal, and although you're not wrong, you're using the right term in the wrong way. Firstly, the area of 3D being discussed (Lakefield, Foveros, the prototype 5950X and other TSMC/Samsung-based 2.5D/3D technologies) concerns the stacking of logic dies. Secondly, you're right that 3D stacking has been around for a while; it's one of the principal ways NAND storage has managed to get so dense, through wafer layering. There are some other use cases in smaller FPGAs and other light-logic areas, but true general-purpose logic engines have not really seen any form of 3D stacking, due to the inherent problems with thermals and inter-layer communication. And lastly, yes, the circuitry within a processor or semiconductor can have a "3D" element, but that's at the wafer level; the terms I've noted refer to the addition of distinct extra dies. I'm sure you know this and don't need it explained too much, though!
It's like the argument that path/ray tracing has been around for years: yes it has, but this is a different evolution of it.
Why not use a better material than silicon?
Sure, HOT-running AMD chips will need this type of cooling.