The chip-maker would do well to steer clear of tablets for a couple of years.
I'm just saying I don't think it's a good idea for AMD to divert too many resources to tablets right now.
With Zacate they already have an integrated CPU and GPU that can be used in higher-end tablets at minimal development cost.
The problem is Intel doesn't divert a single resource from Sandy Bridge, or the next two generations (whose code names I forget), to produce Atom. They just put different people on it, and that's where they are winning and AMD are losing.
Core 2 Duo came out of a low-power mobile chip project run by an entirely different team. That focus on low power is what led to Core 2 Duo, and now Sandy Bridge, and Atom's focus on power will leak into other projects.
The problem is, Intel will have a desktop design team, a mobile team, a server team, a chipset team, an SSD team and several others, and it will delegate more teams to focus on GPU production, hardware acceleration and Quick Sync.
Thing is, AMD will have much the same set of main teams, give or take a few. But if you're paying 30 people 200k a year and providing them with a 200mil lab and wafer runs at an expensive test facility to work on getting the best power gating, Intel can introduce the result into millions of chips across their entire spectrum of products. AMD's costs are spread over a MUCH smaller pie and far fewer products.
It's the diversity and going after different goals that brings results. Intel's goal with Core 2 Duo was a manageable low-power chip; the side effect was massive performance/W, to the point where it became a much better platform for the desktop.
You need to be focused on many different areas to make advances in every area of chip production.
AMD are in debt but making money, and have HUGE backing from a company owned by trillionaires that isn't in any trouble itself, and whose biggest customer by a country mile is AMD. They will NOT see AMD go down, and they own a good 10-15% of AMD.
Basically AMD can afford to spend a little more. They could have expanded two years ago with more engineers and created an entirely separate tablet/mobile team, or they could simply have not sold their existing one for 65 million, basically the worst bit of business AMD have done in the past three decades. Right as mobile goes big, AMD sold a lot of IP, engineers and products just on the cusp of making it large scale in the market... oops.
The question is, would those products, and the research and improvements there, have made their way into Bulldozer, Llano, Ontario and the next 10 years' worth of products? Answer: probably.
You don't have to stop focusing on desktop to go after the mobile sector.
As for stock prices, Nvidia shares bouncing up and down over a rumour, or over talk of massive Tegra sales, is pretty infamous. The tech sector is fickle and mostly based on rumours and lies rather than actual performance, but then the stock market in general is pretty much a sham, so that's not surprising.
Point taken about selling handheld to Qualcomm, but presumably it was for the same reason: AMD didn't feel it had the resources to do that, and everything else, properly.
AMD will be banking on Intel not being able to catch up on graphics, no matter how much money it chucks at it.
Here's what AMD had to say on the issue when I quizzed them last week.
"One Bobcat core is designed to be sub-1W-capable.
AMD announced two days ago (Feb 28) an embedded version of (dual-core) Bobcat for headless applications, no display, where you disable the GPU complex, and you have a 5W chip.
Looking down the horizon, there are a few things that say, yes, we'll continue driving down the TDP. One will be moving to an SOC-like implementation (bringing more on to the die, reducing power-draw). 28nm (Krishna), the successor to Brazos, will help lower consumption, too.
Our technical people will say that having a sealed device, like the iPad, with no active cooling requires a TDP of no more than 5W. However, AMD has not yet announced a specific tablet roadmap, and we've not stated that a tablet-oriented chip is ready, that we've done these five things to it, to make it a tablet part. That's coming."
Thank goodness that Hexus at least has some balanced reporting and opinions, especially after reading this so-called opinion.
Hmm, I think that will be a higher-clocked Bobcat for headless applications. They've previously announced a cut-down version (dropping USB and a few other bits) clocked at 1GHz with a 240MHz GPU, so basically they've taken the lowest-end dual-core Bobcat and got it down to 5W through binning and trimming a few bits while keeping almost full functionality otherwise. 28nm will frankly bring that down further.
Where AMD will do well over time with Bobcat-based architectures and future revisions is this: ARM is lower power now but has a significant way to go in GPU/CPU performance to catch up to Bobcat, while AMD has huge performance for a tiny chip but runs a bit too high on power.
But they'll converge in a couple of years, three maybe. ARM needs to double performance year on year, meaning 40nm, then 28nm, then 22nm; they're really going for the same TDP but using each node drop to increase performance per watt. Say each chip each generation is 1-2W but doubles in speed every node.
Bobcat, though, already has that performance, and can do 5W at 40nm. At 28nm they can probably get that down to 3W or so; then at 22nm they'll again be able to keep roughly the same overall performance as they have now, but suddenly that performance will be in a 1-2W bracket.
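To make the convergence argument concrete, here's a back-of-the-envelope sketch using only the rough figures from the posts above (ARM holding roughly the same TDP while doubling performance each node; Bobcat holding performance while shrinking TDP each node). All the numbers are the thread's own guesses, not measured data:

```python
# Illustrative only: projects the thread's rough figures across nodes.
# ARM: TDP stays ~1.5W, performance doubles each node shrink.
# Bobcat: performance stays flat, TDP roughly halves each node shrink.

nodes = ["40nm", "28nm", "22nm"]

# (TDP in watts, performance in arbitrary units, ARM at 40nm = 1.0)
arm = {"40nm": (1.5, 1.0), "28nm": (1.5, 2.0), "22nm": (1.5, 4.0)}
bobcat = {"40nm": (5.0, 4.0), "28nm": (3.0, 4.0), "22nm": (1.5, 4.0)}

for node in nodes:
    a_tdp, a_perf = arm[node]
    b_tdp, b_perf = bobcat[node]
    print(f"{node}: ARM {a_perf:.1f}x @ {a_tdp}W | Bobcat {b_perf:.1f}x @ {b_tdp}W")
```

Under those assumptions the two lines meet at 22nm: both end up at roughly the same performance in the same 1-2W bracket, which is the convergence point the post is describing.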
Obviously ARM can drop power each node, but in doing so they'll have limited performance increases. The real question is: at what stage do phones run out of the need for performance? People don't want to drag around 10" screens on their phones, and beyond social networking, having your e-mail open, browsing the footie scores, a bit of shopping and some light gaming, at what point does extra performance become worthless in a phone?
Scott B, I do agree that Meyer decided they couldn't pursue it and 65mil was better than nothing, but it was insanely short-sighted. Not least because if they'd held on, that same business would have fetched 5-6 times as much when sold off later; and if they'd made an early impact in tablets/phones with the IP, it could have been worth 20-30 times as much and be raking in hundreds of mil a year.
Frankly, for 65mil against a company with 4 billion of debt, part owned by trillionaires, holding onto it for a couple of years just in case was a pretty low-risk gamble, and I wouldn't be surprised if that's ultimately why Meyer is gone.
As for Intel, while I doubt they will catch up anytime soon in driver quality or outright performance, the thing I was hinting at before is this: Intel are trying to get acceptable performance out of 0.5W chips with a GPU on board, and it's that power research, those accelerations and shortcuts, that are the kinds of things that enabled the leap from NetBurst to the Core 2 architecture.
AMD is working on those things, but nowhere near as hard. For APUs, absolute performance might not be the key; as AMD's own Llano demonstrations show, it's usability, power draw and performance per watt.
I can see Intel trumping AMD in the not-too-distant future on the power draw of their GPUs, and that might be enough to win over the market. Frankly Intel have come on leaps and bounds: the Sandy Bridge IGP isn't blistering, but it ain't half bad and is a massive step in a short time from their old offerings. And Quick Sync is orders of magnitude faster than AMD at transcoding, again a pretty massive leap forwards.
Where did Quick Sync come from? Likely research into getting a 3W chip to acceptably decode high-def video with hardware acceleration in as small a die area as possible... a few more innovations like Quick Sync and AMD could be in real trouble.
After all, we currently just get a single hot patch on the back of the device; instead you'd have a uniform(ish) 'warm' feel across the whole back panel. I was tempted to try this in a mod with a laptop/tablet but I really cba!
The only reason the current Intel IGPs do so well against the current AMD ones is that AMD basically did not want its IGPs cutting into sales of its lower-end graphics cards. The HD4290 is basically a tarted-up HD3200, and the latter was released in early 2008.
I don't think you can say AMD is in a worse position for tablets than Nvidia, because AMD has a metric s*** tonne more in investments, which could easily be used to buy out Qualcomm, and that would easily be one of the best investments ever.
I think the viability of AMD in the tablet market depends on Intel's success there. A tablet with a touch-centric Windows version could be attractive for the usual reason of application compatibility. All that's needed is that Windows version and the low-power CPUs to support it. Intel and Microsoft are working on this, and if they succeed, AMD will have an easy time getting in, assuming it can get power draw low enough.
Regarding the article, I don't think that staying out of the tablet market is a good plan for AMD. The difference between AMD and NVIDIA is that AMD's offerings will be x86 based. This means that even if they're not used in tablets or phones, they'll be used in notebooks, HTPCs, etc. So having a lower-power APU with good performance means AMD will be in a good place whether Windows-based tablets take off or not. If AMD doesn't produce such lower-power chips, and tablets do succeed at the expense of other form factors, then AMD will lose. It's therefore a better bet to go in a direction that keeps this market viable.
Thanks for the reply, Scott. There's a difference between "right now" and "the next couple of years". Right now the tablet market is mostly Apple, but right now AMD doesn't have anything for that market anyway. Two years from now the market may be different, as it is in phones, where Android gained traction quickly. AMD should have a strategy for that. I agree that it shouldn't make tablets its main target, but AMD doing that is no more likely than Intel discarding everything but its tablet/phone strategy (a strategy Intel does, at least, have).
As for NVIDIA, I don't think there's any reason to point and laugh at it. Getting into a new market isn't smooth sailing, and NVIDIA actually did better than expected. I think that in the long run NVIDIA will be better with a mobile CPU strategy than without it.