This is huge - it's like a new startup claiming to match the performance of a top-end desktop CPU with a chip that only needs 0.1 W. Convective heat transfer is really hard, so the only way to deal with the heat flux from a modern chip is to massively increase the surface area (by ~3 orders of magnitude for a typical fin stack). With this product offering a convective heat transfer coefficient 3 orders of magnitude greater than air's, you can skip the whole thing, and a lot of heat transfer textbooks will have to be rewritten.
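Rough numbers to show why fin stacks need that much area - a sketch with assumed typical values (h for moderate forced air, a ~2 cm^2 die), not figures from any datasheet:

```python
# Back-of-envelope: surface area needed to shed a GPU-class heat load
# into air by forced convection, Q = h * A * dT  =>  A = Q / (h * dT).
# All values below are assumed typical figures, not vendor numbers.
h_air = 25.0      # W/m^2/K, moderate forced-air convection coefficient
dT = 60.0         # K, allowable chip-to-air temperature difference
power = 300.0     # W, GPU-class heat load
die_area = 2e-4   # m^2, ~2 cm^2 die footprint

needed_area = power / (h_air * dT)   # m^2 of convecting surface required
ratio = needed_area / die_area       # area multiplication the fins provide
print(f"area needed: {needed_area * 1e4:.0f} cm^2, ~{ratio:.0f}x the die area")
```

With these inputs you get ~0.2 m^2 of fin surface, i.e. roughly three orders of magnitude more than the die itself - which is the multiplier a big fin stack exists to provide.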
Normally, the thermal resistance of the heatspreader and such is negligible when you're cooling a CPU/GPU - it's so hard to get the heat into the fins that the bulk of your temperature difference sits there, so you only see a degree or two across the packaging and the like (for a typical GPU heat flux, ~1 C per mm of copper heatspreader thickness out of the ~60 C between the GPU and the air). With this design you can shed GPU-level heat fluxes (~300 kW/m^2) across only ~2 C of temperature difference between the chip and the coolant, which is insane and explains why they're complaining about the thermal resistance of the heatspreader.
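You can sanity-check both numbers with Fourier's law - assumed values for copper conductivity and the heat flux, nothing from the vendor:

```python
# Conduction drop across a copper heatspreader, dT = q'' * t / k,
# and the convection coefficient implied by a ~2 C chip-to-coolant drop.
# Assumed values: ~GPU-level heat flux, standard copper conductivity.
q_flux = 300e3   # W/m^2, GPU-level heat flux
k_cu = 400.0     # W/m/K, thermal conductivity of copper
t = 1e-3         # m, 1 mm of copper

dT_per_mm = q_flux * t / k_cu       # conduction drop per mm of copper
h_implied = q_flux / 2.0            # h needed to shed q_flux across 2 C
print(f"{dT_per_mm:.2f} C per mm of copper")
print(f"implied h: {h_implied / 1e3:.0f} kW/m^2/K")
```

That comes out to ~0.75 C per mm (hence "~1 C/mm"), and an implied coefficient of ~150 kW/m^2/K - versus tens of W/m^2/K for forced air, which is where the "3 orders of magnitude" comes from.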
Whichever GPU manufacturer jumps on this first will have a field day until the patent runs out. Cooling like this covers a multitude of sins in the silicon design - you'll be able to shed silly amounts of heat, as the negligible temperature difference between the coolant and the chip means you can run the coolant hotter, which gets you more heat shed from a given radiator size for a given airflow. It should also massively reduce hotspots in the chip - if a section of the chip gets 10 C hotter than the rest, it'll be subjected to ~1 MW/m^2 of cooling locally, so it won't sustain that temperature difference for long.
Those use abrasive grit in the water