Yeah, it did admittedly look very suspect; everything from the node to the TDP, voltage and socket doesn't seem right. It's a good effort though! xD
I don't suppose I posted that link after it already said 'debunked' at the top?
If this is true, the $2B is a lot of money to spend to fail to meet your sales targets!
http://www.fudzilla.com/home/item/35...let-soc-target
Maybe the OEMs aren't as short-sighted and manipulable as they'd hoped, and realised that endangering their long-term plans for the sake of some freebies in the very short term wasn't too sensible a move?
With the big OEMs - Asus, Samsung, Acer, Lenovo, etc. - your competitors getting a marginally cheaper deal on a single component probably isn't that frightening a prospect. If it was just low cost they were after, and for the smaller OEMs, there are companies making ~$5 SoCs, which have the bonus of staying that cheap for the foreseeable future.
Adopting the Intel SoCs, they'd potentially have to put money into designing or redesigning the platform, both hardware and software, in the first place, then hope doing so results in enough of a profit difference to recoup that cost versus just sticking with something they already use and are familiar with. Then they have to weigh that up long-term: if it makes no real difference to their profits in the short term and they then have to switch back again, it's counter-productive. Also, if they choose to stay with the Intel parts, they have to consider either lower profits or higher prices on future devices to compensate for the higher BoM.
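Just to put rough numbers on that recoup argument - these figures are purely made up for illustration, not real BoM or NRE costs:

```python
# Back-of-the-envelope break-even sum for an OEM switching SoC vendor.
# All figures are illustrative guesses, not real BoM or engineering numbers.
redesign_cost = 2_000_000       # one-off hardware + software platform work (hypothetical)
bom_saving_per_unit = 5.00      # saving per device vs the incumbent SoC (hypothetical)

break_even_units = redesign_cost / bom_saving_per_unit
print(f"Devices needed just to claw back the redesign cost: {break_even_units:,.0f}")
# => 400,000 devices before the switch makes any difference to profit at all,
#    and that's before factoring in the cost of switching back later.
```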
Essentially, it seems it may not be as simple as just bending the law to dump your product onto the market to get companies to come and sign their soul away - as I said, I just can't see them being that short-sighted. It may be useful for companies already using, or intending to use, Intel's SoCs, but that doesn't help Intel even remotely. Provided Intel aren't limiting it to new adopters, that is.
Intel seem to be paying for the design costs, or at least giving assistance. http://www.fudzilla.com/home/item/35...tablet-vendors
That's where the criticism was - dumping is illegal, but paying OEMs' design costs if they use your chips, while conveniently offsetting roughly the cost of the SoC per device, is a 'legal' way of achieving the same thing. I.e. free (dumped) SoC and fund the design yourself, vs pay for the SoC and get the design paid for...
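To spell out why the two routes net out the same per device (again, the numbers are purely illustrative):

```python
# Route A (dumping): free SoC, OEM funds its own design work.
# Route B (the 'legal' version): OEM pays for the SoC, Intel covers the design work.
# If the design assistance roughly matches the SoC price per device, the OEM's
# total outlay per device is identical either way. Figures are illustrative only.
soc_price = 20.0                # hypothetical SoC price per device
design_cost_per_unit = 20.0     # hypothetical design/engineering cost per device

route_a_outlay = 0.0 + design_cost_per_unit   # dumped SoC + self-funded design
route_b_outlay = soc_price + 0.0              # paid SoC + Intel-funded design

print(f"Route A (dumped SoC, own design):      ${route_a_outlay:.2f}/device")
print(f"Route B (paid SoC, subsidised design): ${route_b_outlay:.2f}/device")
```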
Makes sense, it's the same thing they did with ultrabooks. Odd that they'd be going after a market so far from their usual high-margin core business though - you'd think they'd want to push high quality products to distinguish themselves from the budget chip-makers - or did that not work out as well as they'd hoped in the ultrabook space...?
It does puzzle me a bit, their low-end chase. If they're so convinced their brand image impacts the sale of devices, wouldn't it be counter-productive to have people associate them with low-spec tablets?
The other Kaveri CPUs are finally available:
http://www.scan.co.uk/shop/computer-...socket-fm2plus
The A8 7600 is only £75!!
Edit!!
Even cheaper on Ebuyer:
http://www.ebuyer.com/store/Componen...rice+ascending
A budget FM2+ mini-ITX motherboard with a 4+2 phase VRM has also appeared:
http://www.ebuyer.com/658041-asrock-...x-fm2a78m-itx-
@ £107 the A10 7800 makes the A10 7700k kind of redundant unless you're desperate to overclock the CPU: 4 quid more for the same CPU speed and 33% more shaders? Utter no brainer....
EDIT: I'd also love to see the 7400k go head-to-head with the Pentium AE for overclocked performance with a mid-range graphics card (say an R7 265 / GTX 750Ti) - I guess the Pentium's a bit cheaper still so probably wins out with a discrete GPU: could do with being < £50 for the 7400k to offer real value. That said, a 256 shader GPU part is pretty potent - that's more cores than the 6400k offered, and they're GCN rather than VLIW4....
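For anyone who wants the arithmetic behind that 'no brainer' spelled out - prices as quoted above, shader counts from the published specs:

```python
# The sums behind the A10 7800 vs A10 7700K comparison above.
# Prices are the ones quoted here; shader counts are the published specs.
price_7700k, shaders_7700k = 103.0, 384   # £107 minus the 'four quid'
price_7800,  shaders_7800  = 107.0, 512

extra_cost = price_7800 - price_7700k
extra_shaders = (shaders_7800 - shaders_7700k) / shaders_7700k * 100

print(f"Extra cost:    £{extra_cost:.0f} ({extra_cost / price_7700k * 100:.0f}%)")
print(f"Extra shaders: {extra_shaders:.0f}%")   # ~33% more GCN shaders for ~4% more money
```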
The A8 7600 is definitely the star of the show IMHO, since at worst it's probably only around 10% slower than an A10 7850K, and the jump from 384 GCN shaders to 512 GCN shaders isn't massive IIRC.
OTOH, at around £70 to £75, it still offers IGP performance similar to a £40 to £50 graphics card, with CPU performance around A10 5800K level while consuming less power too.
Edit!!
Some more details of the new CPUs:
https://translate.google.com/transla...529&edit-text=
The X4 860K runs at 3.7GHz and AMD says it can hit 4.4GHz overclocked - the FX8310 looks to have a 3.3GHz base clock speed.
http://wccftech.com/amd-toronto-apu-...dr4-supported/
If this is true, it would likely mean Carrizo (which AFAIK uses the same die as Toronto) would have a DDR4 memory controller. I get the case for supporting DDR3 on cost grounds, and for not wanting to change platform so quickly again. But if the controller is there, I wonder if we'll see a refresh, or just a few SKUs supporting both? My 775 motherboard had both DDR2 and DDR3 slots, so maybe we'd see something like that?
I would like that, watercooled; having the option of both makes the transition period easier. I don't like being forced to change so quickly - I have loads of DDR2 memory that I got at bargain prices which can only be used in two computers, because I don't have a third set of components for it.
I look forward to DDR4, but I doubt it will be out in the market before the 2015/2016 time frame - it's been talked about for so long now that it seems to be moving at a snail's pace. Perhaps there will be support for it, like the link mentions, but I'm not sure we'll get lucky and have both controllers active. Maybe the DDR4 support isn't specifically for that chip - perhaps it's meant to be the sole memory type for another processor coming out later, and they're using this one as an internal test platform to see how the controller works in shipping silicon.
Phenom II would be a better comparison: with s775 the memory controller was in the chipset, not the CPU, so mobo manufacturers could pick between chipsets that supported either or both. The problem is DDR4 is point-to-point, so 1 channel per DIMM - I'm not sure you can do a hybrid controller like AMD did with Phenom II (and if you did, it would potentially be a quad-channel DDR3 controller anyway!). The simple fact is that AMD are clearly going to have some form of hybrid memory controller - that's been on the roadmap for over a year, so this isn't anything new. How they bring that to the consumer platforms is another story entirely.
AFAIK, DDR4 is already supported by Intel's latest generation of high-end Xeons, and modules are already shipping in servers and workstations. Admittedly that's not the same as them being in the consumer market, but (unless I'm imagining things) they are already in use in niche markets. How quickly both Intel and AMD start supporting DDR4 in their mainstream consumer platforms is anyone's guess, but I'd be amazed if we had to wait until as late as 2016 for consumer DDR4 support. Certainly for AMD, the benefits to their APU platform are just too great to delay any longer than they absolutely have to....
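To put a rough figure on why the APUs care so much: the IGP is bandwidth-starved, and dual-channel peak bandwidth scales directly with transfer rate. A quick sketch - the DDR4 speed grades are just plausible examples, nothing AMD have confirmed:

```python
# Peak theoretical bandwidth of a dual-channel, 64-bit-per-channel memory setup:
# bandwidth (GB/s) = transfer rate (MT/s) * 8 bytes * channels / 1000
# The DDR4 grades below are plausible examples, not announced AMD specs.
def dual_channel_gbs(mt_per_s, channels=2, bytes_per_transfer=8):
    return mt_per_s * bytes_per_transfer * channels / 1000

for name, rate in [("DDR3-1866", 1866), ("DDR3-2133", 2133),
                   ("DDR4-2400", 2400), ("DDR4-3200", 3200)]:
    print(f"{name}: {dual_channel_gbs(rate):.1f} GB/s peak")
```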