G-series with Radeon on-package. Read more.
This is really interesting. Can't wait to see what type of designs these show up in. I would love some relatively thin-and-light 13" designs with some serious gaming chops, though the battery life would need to be decent too. Would absolutely love this in a not-too-tiny form factor for an HTPC upgrade too. STX or ITX, so that it fits a decent cooler? Yes please.
One thing though:
"In the first, comparing a Core i7-8705G chip carrying the lesser RX Vega GL graphics, Intel's testing found it to be comfortably faster than a laptop featuring its own Core i5-8550U chip mated to a discrete GeForce GTX 1050 4GB graphics card. The comparison is somewhat unjust as while the 8550U is also a 4C/8T part, its meagre 15W TDP and all-core speed is much lower than a H-series chip that this should really be compared against. That said, in gaming at 1080p, the 65W Core-i7-8705G appears to be the better bet."
This seems like a misreading of what Intel is showing. The comparison appears to be based on similar total power draw (15W 8550U + ~50W GTX 1050 vs. a 65W G-series package). Using a 45W H-series CPU in this comparison would either limit them to an MX150 (~25W, but less than half the performance) or push things into 80-90W territory. Isn't the whole point of this integration to do more in a smaller space for less power (thanks to the integrated power delivery shenanigans)? And it's not like regular H-series CPUs drop into a ~25W cTDP-down mode while gaming, which would make your proposed comparison more of a level playing field.
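To make that concrete, here's a rough back-of-envelope tally of the nominal TDPs in play (the wattages are the approximate figures quoted above, not measured power draw):

```python
# Rough TDP sums for the laptop configurations being discussed.
# All wattages are nominal/approximate TDPs from the thread, not measurements.
configs = {
    "8550U (15W) + GTX 1050 (~50W)":    15 + 50,  # Intel's comparison laptop
    "Core i7-8705G (65W package)":      65,       # the G-series part
    "H-series (45W) + MX150 (~25W)":    45 + 25,  # caps you at a much slower GPU
    "H-series (45W) + GTX 1050 (~50W)": 45 + 50,  # the 80-90W territory mentioned above
}

for name, watts in configs.items():
    print(f"{name:36s} ~{watts}W")
```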
Disregard, I got the orders of magnitude mixed up
The first laptop announced with the new Intel SKU:
https://www.anandtech.com/show/12221...deon-rx-vega-m
This was exactly my thought - the whole point here is to demonstrate that you can flexibly shift the power budget across the package. So you don't have to choose between a fast CPU or a fast GPU - you can have both, and the processor's power management will assign power as needed for the situation. Running a CPU-intensive non-graphical task? You get the benefit of a 45W CPU. Running games? You get the benefit of a 45W GPU. But you get it all in a smaller, lighter chassis, as you never have to dissipate as much as 90W.
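To sketch the idea (this is just a toy illustration of shared-budget power steering, not Intel's actual Dynamic Tuning logic, and all the wattages are assumptions):

```python
# Toy model of a shared package power budget steered between CPU and GPU.
# Numbers are illustrative assumptions, not real firmware behaviour.
PACKAGE_BUDGET_W = 65              # hypothetical package TDP
CPU_MAX_W, GPU_MAX_W = 45.0, 45.0  # assumed per-block ceilings

def split_budget(cpu_demand, gpu_demand):
    """Split the package budget in proportion to demand (0.0-1.0 each),
    capped at each block's own ceiling."""
    total = (cpu_demand + gpu_demand) or 1.0
    cpu_w = min(CPU_MAX_W, PACKAGE_BUDGET_W * cpu_demand / total)
    gpu_w = min(GPU_MAX_W, PACKAGE_BUDGET_W * gpu_demand / total)
    return cpu_w, gpu_w

print(split_budget(1.0, 0.1))  # CPU-heavy task: CPU gets its full 45W
print(split_budget(0.4, 1.0))  # gaming: GPU hits its 45W ceiling, CPU gets its share
```

Either way, the total never exceeds the 65W package budget, which is the point.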
EDIT!
Perfect example right here - the Spectre x360 that article talks about comes with either a G-series processor, or an i7-8550U + MX150. So in that chassis you can have an H-class processor and GTX 1050-equivalent (or better) graphics, or a U-class processor and GT 1030-class graphics. That's really not a difficult choice...
One of the mods on OcUK forums pointed this out in the AMD CES thread:
http://www.eurogamer.net/articles/di...-spec-analysis
It looks like the top SKU has 64 ROPs, which is highly unusual for a part with only 1536 shaders, and might explain why it apparently can compete with a GTX 1060 Max-Q. The lower-end SKU with 1280 shaders only has 32 ROPs and looks to sit in between a GTX 1050 and a GTX 1050 Ti.
It makes me wonder how ROP-limited AMD cards are, and whether that is holding back performance - even Vega 64 only has 64 ROPs.
Interestingly, the terminology on the slide (the second one on page 2 of the Hexus story) is "64 pix/clock" and "32 pix/clock", rather than just a ROP count. I wonder if AMD have found a way to make one ROP dispatch two pixels per clock?! Alternatively, perhaps AMD's ROPs are horribly inefficient?
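If the "pix/clock" figure is taken at face value, theoretical peak fill rate is just pixels-per-clock times clock speed. Quick sketch below - note the clock speeds are my assumed boost figures, not anything from the slide:

```python
# Theoretical peak pixel fill rate = pixels per clock x clock speed.
# The clock speeds here are assumed boost clocks, purely for illustration.
def fillrate_gpix_per_s(pix_per_clock, clock_mhz):
    return pix_per_clock * clock_mhz / 1000.0

print(fillrate_gpix_per_s(64, 1190))  # top SKU at an assumed ~1190MHz   -> ~76 Gpix/s
print(fillrate_gpix_per_s(32, 1011))  # lower SKU at an assumed ~1011MHz -> ~32 Gpix/s
print(fillrate_gpix_per_s(64, 1546))  # Vega 64 at an assumed ~1546MHz   -> ~99 Gpix/s
```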
I notice Nvidia took the brute-force approach with their new SoC: a 315mm² die.
https://wccftech.com/nvidia-drive-xavier-soc-detailed/
Looks like their Denver CPU is back in an 8-core incarnation. Shame there isn't a mass-market platform that can make use of it, though I suppose it would make a stonking, if expensive, Nintendo Switch.
Chase 1050s all you want, Intel. Same story as before: NV doesn't make their money on the low end. The vast majority of their profit is from the high end, period. If I have to turn anything down, you've already lost me. A move in the right direction for the otherwise-useless GPUs included on CPUs, but nothing for NV to worry about. How much did the bottom end cost NV when AMD went low and NV went 1070/1080 etc. only? Oh right, NV set records... LOL. You don't get rich off poor people, unless you're a dictator running a country into the ground, and that usually only lasts until they figure out how to kill you (or someone does it for them... cough**USA**cough).
Intel is making records on the rich too. HEDT, anyone?
Not sure why you think it's underhand; it's not like they're trying to hide anything. A highly-integrated package will make device designs easier. It's hard to imagine they didn't approach nvidia about this at some point, but AFAIK nvidia don't have a dedicated semi-custom hardware group, so working with AMD was probably just easier.
Theoretically, with so much of the hardware being tightly integrated, these should be smaller and consume less power than a similarly performing device with separate CPU and GPU packages. It might not be a big saving, but in laptops a percent or two can make a surprising difference.
Intel are still producing standard CPUs, and OEMs will still be free to create laptops with an Intel CPU and an nvidia GPU. BUT if they go for these chips, more of that spend will go into Intel's pockets, which is obviously good for Intel.
Probably went AMD as they got more for their money. They were paying (still are, for a while maybe?) nVidia for some of the IP they use in their GPU cores. The license was ending, so maybe AMD offered to cover those patents and threw in some extra incentives to make it a no-brainer for them...
I also wonder if there is still some bad blood after Intel stopped nVidia from making Intel-compatible chipsets...
Nvidia took Intel to court over their chipset business, and the IP licensing deal was part of the out-of-court settlement. Another aspect is that Nvidia makes a massive amount of their revenue from consumer GPUs, and that funds their AI efforts - an area Intel is investing in massively, and where Nvidia is now a major player. So I think it's more a case of Intel using AMD to fight NV.
Edit!!
It's also good short-term for AMD, as they hardly have a presence in higher-end laptop graphics. Try looking at how many mobile GTX 1060 laptops there are, and then look for mobile RX 570/RX 580 equipped laptops.