Thin and light Core i7 laptop can be boosted by Thunderbolt 3 GPU enclosure.
Price: from $999 (up to $1599)
Ridiculous that they have not included any kind of dedicated GPU built in. I can't exactly take that GPU case on a plane.
Went and had a look at the product page for the smartwatch thing as I was intrigued to see what they had done to make a smartwatch that was specifically 'for gamers'.
Turns out: nothing
"I want to be young and wild, then I want to be middle aged and rich, then I want to be old and annoy people by pretending that I'm deaf..."
I think this is excellent. Lightweight with good battery life on the go, and more grunt at home when you're plugged in. I wish Dell had done this with the XPS 15.
If the screen has good colour range and calibration ability and no stupid DCR (unlike Dell's XPS13) then this could be where my money goes.
Don't see why it would be 4x? The bandwidths stack up as: Thunderbolt 3 = 40 Gb/s (5 GB/s), while PCIe 3.0 x16 = ~16 GB/s per direction (32 GB/s bidirectional).
"The enclosure provides USB3 ports x4"
No, that is not what it uses for the graphics connection. The spec list states it will run one card at PCIe 16x and connects via Thunderbolt 3.
Can Hexus run a review of this? Better still a group test comparison of this and similar portable high end stuff like the XPS, Alienware 14, Aorus, Asus, MSI and the like.
Last edited by ik9000; 07-01-2016 at 01:44 PM.
I'm not exactly sure what you want from a laptop. If it's gaming on the go, get a bigger heavier laptop that has next to no battery life and makes a lot of noise. If it's portability, get one of many Ultrabooks.
This is for everyone else who wants the best of both: desktop-class GPU power at home, with a laptop they can unplug and carry around all day without running out of battery or getting tired.
My main concern would be that the CPU is going to bottleneck the GPU with anything high end.
How I wish that this was how it worked, but unfortunately, no. It's a single Thunderbolt 3 port at 40 Gb/s that hosts four USB 3.0 ports (5 Gb/s each). That leaves 20 Gb/s of remaining bandwidth if all four ports on the Core are in use. If only a GPU is being used, this would optimally translate to 5 GB/s, if we don't count lane-width differences, clock-translation differences, and the queue/cache setup that smooths those differences out. Since USB 3.0 was modelled after PCIe 1.0, I assume there is enough similarity to get close to an optimal translation between the two.
Either way, the end result is much closer to PCIe 1.0 x16 (4 GB/s) than it is to PCIe 2.0 x16 (8 GB/s). For any games (the majority) up until now that avoid heavy GPU memory thrashing (see Wikipedia), the resulting bandwidth decrease should be relatively unnoticeable at 1080p or 1440p (no idea about 2160p). However, with the number of future games implementing various voxelisation techniques for GI, physics, ray-traced shadows, distance fields, etc., that thrashing increases significantly, and the result will be VERY noticeable.
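The arithmetic in the post above can be sketched in a few lines. This is a back-of-the-envelope script using raw line rates only; real-world throughput will be lower once protocol overhead is counted:

```python
# Back-of-the-envelope bandwidth budget for a GPU on Thunderbolt 3
# (raw line rates only; protocol overhead is ignored).

def gbit_to_gbyte(gbit):
    """Convert Gbit/s to GB/s (8 bits per byte)."""
    return gbit / 8.0

TB3_LINK = 40.0                # Thunderbolt 3 line rate, Gbit/s
USB3_PORT = 5.0                # one USB 3.0 port, Gbit/s

usb_load = 4 * USB3_PORT       # all four USB ports on the Core saturated
gpu_only = gbit_to_gbyte(TB3_LINK)               # GPU gets the whole link
gpu_shared = gbit_to_gbyte(TB3_LINK - usb_load)  # GPU shares with USB

print(gpu_only)    # 5.0 GB/s: between PCIe 1.0 x16 (4 GB/s)
print(gpu_shared)  # 2.5 GB/s: and well below PCIe 2.0 x16 (8 GB/s)
```

So even in the best case, the GPU link sits between PCIe 1.0 x16 and PCIe 2.0 x16, which matches the conclusion above.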
This is what gamers do: CONSOLES... none. LAPTOPS... a heavy 15"-17" one with something like a 960M. DESKTOP... i5 + 16GB DDR4 + GTX 970 + 27" monitor + 5TB of spinning hard drives.
Also, according to Wikipedia, "Intel will offer two versions of the controller: one that uses a PCI Express 4x link to provide two Thunderbolt 3 ports", so not sure where the 40 Gb/s comes from if it only uses a 4x link?
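One plausible answer to that question, assuming the standard PCIe 3.0 figures (8 GT/s per lane, 128b/130b encoding; neither number is stated in the thread):

```python
# Why a PCIe 3.0 x4 uplink can be consistent with a 40 Gbit/s port
# (assumed figures: 8 GT/s per lane, 128b/130b encoding, per PCIe 3.0).

lanes = 4
gt_per_lane = 8.0          # PCIe 3.0 transfer rate per lane, GT/s
encoding = 128 / 130       # 128b/130b line-encoding efficiency

pcie3_x4 = lanes * gt_per_lane * encoding
print(round(pcie3_x4, 1))  # 31.5 Gbit/s of usable PCIe bandwidth

# Thunderbolt 3's 40 Gbit/s is the total link rate, shared between
# tunnelled PCIe and DisplayPort traffic, so the PCIe side alone
# never needs the full 40 Gbit/s.
```

On that reading, the 40 Gb/s headline figure describes the whole Thunderbolt link, not the PCIe data it can carry.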
Looks very good!
Even with the potential bottlenecks, still the most robust external GPU solution we've seen so far.