I was talking about Samsung's custom core. That talk about Apple 'buying' a fab could be exaggerated - more recently it's been linked to this GloFo deal.
Oops, I read that wrong - a moment of dullness on my part.
It does seem a lot of effort to go to, but then Samsung are the leading phone vendor (and probably, before long, the leading tablet vendor too) atm, so they probably feel they have to offer more than some off-the-shelf Rockchip/Mediatek part to keep a price premium. If they can sell to others, that might be a bonus, but I don't think they need to in order to justify their direction.
Custom ARM core or custom AMD core? I know Samsung somehow got away with rebranding the SoC in the Ativ Lite as a Samsung APU when it was clearly an A6-1450 (AFAIK they didn't make any customisations to it), but it does potentially show Samsung's interest in the AMD semi-custom business...? It'd be very interesting to see what Sammy could come up with if they did start commissioning semi-custom designs for their laptops...
I'm just wondering because I haven't found anything, and what I have found usually pertains to laptops. I was thinking of a mITX build (hence the FM part) with the A10-7850K and maybe an R9 270X or 280X (probably the 270X - more budget friendly and most likely to fit in my current case of choice, the Silverstone SG05/06). That should do me fine for 1080p gaming, and Hybrid Crossfire support isn't clear at this point.
The question is: do AMD desktop APUs engage in graphics switching if you have a dGPU alongside?
I don't think so, no. Years back Nvidia had power management software that would switch between certain 9000 series cards if you had an appropriate Nvidia IGP motherboard, but AMD never produced their equivalent. I think nowadays it'd be pretty pointless, because idle power management is so much better now than it was even 5 years ago: entire PCs with top-end graphics cards will idle at around 50W now. The complexity of turning off the dGPU (although I guess AMD could use ZeroCore) and switching to IGP output, and the complexity of routing outputs in such a situation, outweighs any potential benefit on the desktop. Of course, in laptops it's a slightly different matter because every watt does matter - you're often talking about *peak* power draws around 60W, with idle in the teens, and in that situation saving a couple of watts could mean an extra hour's battery life...
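Just to put rough numbers on that last point - a quick Python sketch with made-up battery capacity and idle draws, not measurements from any particular laptop:
[code]
# Rough battery-life arithmetic - every number here is an illustrative assumption
battery_wh = 60.0       # assumed battery capacity in watt-hours
idle_dgpu_w = 12.0      # assumed idle draw with the dGPU still powered ("in the teens")
idle_igp_w = 10.0       # assumed idle draw after switching to the IGP

print(battery_wh / idle_dgpu_w)  # ~5.0 hours of idle runtime
print(battery_wh / idle_igp_w)   # ~6.0 hours - roughly an extra hour from saving ~2W
[/code]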
ARM. I can't seem to find the original slide ATM, but IIRC it suggested the next Exynos would be an ARMv8 ARM IP core (i.e. A57/A53), then beyond that a custom core like Apple/Qualcomm.
Edit: Not sure if this was the page I read it on, but it's the same slide: http://news.cnet.com/8301-1001_3-576...t-mobile-chip/
Makes sense. I was curious since sometimes my laptop doesn't properly switch over and it's a pain to complete loading a game only to be greeted with 3 FPS because you're stuck on the IGP. And with the possibility of tacking the word 'green' onto any new consumer product giving marketers stress incontinence, I thought they may have had a go.
I'm not that bothered since, as you say with ZeroCore, AMD cards generally idle well. It just felt a waste to have that iGPU and not do something with it. Maybe HSA optimised applications or TrueAudio will help me sleep easier.
Well, a lot of the slides for Kaveri seem to imply that the CPU can dispatch threads to the GPU and vice versa, and HSA should make it easier for programmers to use the FP capabilities of the IGP, so it might well start being used for physics or other calcs via OpenCL etc. But that will require software support, so don't hold your breath.
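For anyone wondering what that sort of offload looks like in practice, here's a minimal sketch using the pyopencl bindings: it hands a bulk float calculation to whichever OpenCL device is available (which on a Kaveri box could be the iGPU). The toy scale-and-add kernel is purely illustrative - it's not anything from AMD's slides.
[code]
# Minimal OpenCL offload sketch (pyopencl) - device choice and workload are illustrative
import numpy as np
import pyopencl as cl

ctx = cl.create_some_context()    # picks an available OpenCL device - could be the iGPU
queue = cl.CommandQueue(ctx)

a = np.random.rand(1000000).astype(np.float32)
b = np.random.rand(1000000).astype(np.float32)
out = np.empty_like(a)

mf = cl.mem_flags
a_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=a)
b_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=b)
out_buf = cl.Buffer(ctx, mf.WRITE_ONLY, out.nbytes)

# Toy "physics-ish" kernel: scale-and-add over two big arrays
prg = cl.Program(ctx, """
__kernel void saxpy(__global const float *a,
                    __global const float *b,
                    __global float *out)
{
    int i = get_global_id(0);
    out[i] = 2.0f * a[i] + b[i];
}
""").build()

prg.saxpy(queue, a.shape, None, a_buf, b_buf, out_buf)  # one work-item per element
cl.enqueue_copy(queue, out, out_buf)                    # copy the result back to the host
print(out[:5])
[/code]
The point of HSA, of course, is to make the buffer-copying boilerplate above largely unnecessary because the CPU and iGPU share the same memory - but the software still has to be written to take advantage of it.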
The thing that gets me most excited about Kaveri is that it suggests I made an excellent call picking up a low profile 7750 a while back, because I'll have the enviable option of upgrading my SFF desktop to a top-end Kaveri with dual graphics. Although I might hold off and see what happens with the 65W parts - that'd be even tastier, IMNSHO.
http://www.techspot.com/news/54763-r...-included.html
According to the source, the Broadwell-K parts will have 2MB less L3 cache than their predecessors. I see they're planning to use Crystalwell, but that's still far slower than even L3 cache; I wonder if its size, and its ~50% access latency vs main memory, will offset the smaller L3? I can see it depending on workload.
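A very rough way to think about that trade-off, with completely made-up hit rates and cycle counts (none of these are published Broadwell/Crystalwell figures):
[code]
# Back-of-envelope average memory access time (AMAT) - all numbers are illustrative guesses
def amat(hit_rate, hit_cycles, miss_cycles):
    """Expected access cost for one cache level with a given hit rate."""
    return hit_rate * hit_cycles + (1 - hit_rate) * miss_cycles

dram_cycles = 200                # assumed main memory latency
l4_cycles = dram_cycles * 0.5    # the "~50% of main memory" figure for Crystalwell
l3_cycles = 40                   # assumed L3 latency

# Smaller 6MB L3 backed by the eDRAM L4...
with_crystalwell = amat(0.90, l3_cycles, amat(0.60, l4_cycles, dram_cycles))
# ...versus a bigger 8MB L3 that goes straight to DRAM on a miss
without_crystalwell = amat(0.92, l3_cycles, dram_cycles)

print(with_crystalwell, without_crystalwell)
[/code]
With those guesses the two come out pretty close, which is really the point: whether the eDRAM makes up for the lost L3 will swing either way depending on the workload's hit rates.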
Either way, Crystalwell still seems like a strange choice for an enthusiast K part considering the IGP probably won't be used. Maybe they're just not planning to have an 8MB cache die (smaller die, better for yield?), and are basically just selling a binned mobile part as a K model. I wonder if Crystalwell means it'll be sold for extortionate prices?
It's a sign of cost cutting. First the use of Core i5 dies for Core i3 CPUs, and now what is the chance that ALL the Core i5 and Core i7 CPUs (and maybe the Core i3s) will use the GT3e die??
BTW, it seems one or two of the posts on the Anandtech forums were trying to negate what Hans de Vries said, but they seem to lack any technical knowledge (like he has) and are just spouting numbers without understanding what they are. Plus the "unbiased" mods had to try and infract him for a later post. This is the same chap who has published die analyses, core size charts, etc., and he has a technical background. They should feel privileged a person of that calibre would post on that forum. It also just shows you that the Anandtech forums are basically a load of Intel fanbois and employees (with some normal people), egged on by mods who are fanbois and employees too.
That is one of his charts.
He also discovered that the production SB Core i5 CPUs were larger than the published die sizes:
http://www.chip-architect.com/news/2...es_Sandys.html
This is the type of chap we are talking about:
http://www.theinquirer.net/inquirer/...-chip-analysed
http://www.theinquirer.net/inquirer/...in-of-amds-k8l
Maybe ask Hans to wander over here and post?
Mostly the tables are marketing - Crossfire is actually surprisingly flexible about what you can match with what. I've seen benchmark figures of an A10-6800K crossfired with a 7750, and that's two completely different architectures working together. A quick google for "dual graphics 5800k 7750" turns up plenty of examples of it working. If they can do Crossfire between a 384-shader VLIW4 iGPU and a 512-shader GCN dGPU, something's gone badly wrong somewhere if a 512-shader GCN iGPU doesn't crossfire with a 512-shader GCN dGPU.
EDIT: just for fun, here's Tom's Hardware review, including 6800k + 7750 figures.
EDIT2: Hmm, interesting, I'd not really read the article fully before. It seems, currently at least, that dual graphics doesn't benefit from the frame-pacing enhancements brought in by recent Catalyst driver updates. Will have to keep an eye on that before I consider upgrading...
Crossfire with Kaveri will be between cards with the same shader type, so it should give better results IMHO.
Perhaps, but if they're not going to add frame pacing fixes to Dual Graphics setups then they'll never get smooth performance out of them. TBH I don't really understand why AMD haven't rolled it into their frame pacing fixes for normal Crossfire set-ups: if they've got a fix for frame pacing, why hasn't it been rolled out across all affected systems? With Kaveri's iGPU getting up to a performance level where crossfiring it actually makes sense (i.e. it should take performance from acceptable at 1080p to smooth at 1080p), you'd think it'd be a higher priority...
Might see, but he doesn't post on forums all that often.
However, the Anandtech forums are having a nerdgasm about the latest 14nm information released today from Intel. The funny thing is that at every new node and CPU release you can see the same nerdgasm and cries of how Intel will destroy all CPU competition and competing fabs, and how having them as the only company making chips and owning fabs will usher in a new glorious era.
So far:
1.) Intel will destroy all CPU competitors in EVERY area this year, and they will soon be out of business. If not this year, then next year. If not next year, the year after.
2.) Intel will destroy GPU competitors in EVERY area this year, and they will soon be out of business. If not this year, then next year. If not next year, the year after.
3.) Intel will destroy fab competitors in EVERY area this year, and they will soon be out of business. If not this year, then next year. If not next year, the year after.
4.) Intel will destroy compute card competitors in EVERY area this year, and they will soon be out of business. If not this year, then next year. If not next year, the year after.
5.) x86 will destroy ARM and MIPS in EVERY area this year, and they will soon be out of business. If not this year, then next year. If not next year, the year after.
6.) Intel will rule the tech world this year. If not this year, then next year. If not next year, the year after.
7.) Prices will be low and affordable for everyone once all competition is destroyed.