The custom Nvidia Tegra SoC's CPU, however, maintains a steady 1020MHz, say sources.
Whilst Maxwell is a "last-gen" GPU for bleeding-edge PC gamers, it's still a newer architecture than the GPUs in the original PS4 and Xbox One, and plenty of people are still running the latest games on Maxwell cards. I think that part of the article is explained a little unfairly in regards to the overall picture it paints.
I think (and hope) the reason the GPU is clocked higher when docked is that it needs to drive a higher resolution to look good on a large TV screen. Given the difference in screen sizes, I'd like to think the overall performance and look of the games won't appear much different to the untrained eye whether docked or not.
Makes sense: run at 720p when undocked, which needs less power, so the GPU can downclock and save battery; then, when plugged into the TV, go all out.
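As a back-of-envelope check on that trade-off, here's a minimal sketch. The clock figures are the widely reported (unconfirmed by Nintendo) Switch numbers, used purely as assumptions:

```python
# Back-of-envelope check of the docked/portable trade-off described above.
# Clocks below are the reported, unconfirmed Switch figures (assumptions).

DOCKED_GPU_MHZ = 768.0      # reported docked GPU clock (assumed)
PORTABLE_GPU_MHZ = 307.2    # reported portable GPU clock (assumed)

pixels_1080p = 1920 * 1080  # TV output when docked
pixels_720p = 1280 * 720    # handheld screen resolution

pixel_ratio = pixels_1080p / pixels_720p        # how many more pixels 1080p needs
clock_ratio = DOCKED_GPU_MHZ / PORTABLE_GPU_MHZ # how much faster the docked clock is

print(f"1080p needs {pixel_ratio:.2f}x the pixels of 720p")
print(f"Docked GPU clock is {clock_ratio:.2f}x the portable clock")
```

If those reported clocks are right, the docked clock boost (about 2.5x) roughly tracks the extra pixels 1080p demands over 720p (2.25x), which fits the "downclock for the small screen" theory above.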
What is MSRP for Nintendo Switch?
Where I live I see pre-orders starting at 469 USD, which I think is too much for this kind of console.
The more you live, less you die. More you play, more you die. Isn't it great.
OH MY GOSH! OH MY GOSH! OH MY GOSH!
Notices that it has Mario... "Take my Money now!"
Have we seen 20nm maxwell anywhere else? Or is this the first outing of what maxwell was meant to be?
Tegra X1, released 18 months ago, is 20nm. 20nm wasn't that good for desktop devices, so it only got used for mobile things (like this).
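For a sense of what that 18-month-old part can do, here's a rough peak-throughput sketch. The 256 CUDA cores is the published Tegra X1 shader count; the clocks are the reported (unconfirmed) Switch figures, so treat the results as estimates only:

```python
# Rough peak FP32 throughput estimate for a Tegra X1-class part.
# 256 CUDA cores is the published X1 count; the clocks passed in below are
# the reported, unconfirmed Switch figures (assumptions).

CUDA_CORES = 256  # Maxwell shader cores in Tegra X1

def fp32_gflops(clock_mhz: float, cores: int = CUDA_CORES) -> float:
    """Peak FP32 GFLOPS: cores x 2 ops/cycle (one FMA) x clock in GHz."""
    return cores * 2 * clock_mhz / 1000.0

print(f"Docked (768 MHz, reported):    {fp32_gflops(768.0):.1f} GFLOPS")
print(f"Portable (307.2 MHz, reported): {fp32_gflops(307.2):.1f} GFLOPS")
```

That's theoretical peak, not sustained game performance, but it shows why the docked/portable clock split matters so much to the overall picture.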
I presume Nintendo are getting a really good deal on an 18-month-old part, but I'm not sure old bargain-bucket tech is a good idea in this market.
Preventing AMD being the SoC provider for all three major current gen consoles? I suspect they got better than really good...
OTOH AMD don't really have a product playing in this power envelope, and (afaik) they've not done any semi-custom work at this level yet. I suspect they could, and I know they want Zen to be a top-to-bottom architecture that can play down to very low TDPs, but if I was Nintendo I wouldn't want to be betting the company on AMD's first single-digit-Watt high-performance SoC...
I think Nintendo should just do games, no hardware anymore.
They're good at making games that many people like (not me, though), but on the hardware side they have trouble getting it right.
I'd say Nintendo _do_ get the hardware right, they just don't compete on the highest FPS and raw processing power. They produce the most usable hardware.
I've only ever bought Nintendos as the others are just an exercise in playing games that look like they were on the PC a couple of years ago with a rotten control system. Nintendos have always provided a completely different experience than the PC, and sure, the games follow that.
Nvidia rightly doesn't care who owns console chips. If you can't make enough money, or don't want to rob from core R&D, you should avoid consoles, as NV did for the first two generations. I suspect (@scaryjim) NV gave them a good price because it cost very little to simply rejig the old R&D (maybe just shut some stuff off to sell chips stuck on a shelf). If the Switch sells well, NV can always tape out a 10nm shrink and Nintendo can call it a "+" model or something.

I think Nintendo really should've waited for a Pascal model, but that's just me wanting far more capable hardware than most stuff needs for a while. NV is pretty good with drivers and getting the most out of their products, so maybe they'll be OK here. It's not like Maxwell sucks, and it's possible 720p needs nothing more while 1080p has more than it needs. It would be a fail if it really sucked on the go and only worked OK in the dock; you'd rather have OK on the go and great in the dock.
We see NV made a mint by investing in core products (desktop/server/workstation GPUs), rather than letting consoles starve that core work and getting kicked out of the CPU race (see AMD) and basically driven out of the GPU race (see NV's quarterly results vs. AMD's). AMD then had to price their cards so low that they lost $400m while selling an extra $300m in product to claw back some market share (totally stupid). If you're not making money on it, forget trying to sell the crap out of it; market share isn't worth squat if you can't make a dime on it. NV lost 10% market share last quarter but laughed all the way to the bank, setting record revenue, income, and margins by owning the only end that matters: the high end.
AMD could surely have done the SoC, but not without R&D money. They spent that wad on consoles (not once but twice, for both Xbox One and PS4) and on the upcoming Zen/Vega. If they had skipped consoles they might have captured some of the high-end income and margins NV/Intel gained by being there; losing both the server/workstation GPU and CPU margins was a killer blow to AMD. Console margins are crap (as probably noted here for NV too). Going console also had a massive impact on AMD's high-end GPUs, and they made matters worse by delaying Vega in favour of low-margin, low-end parts (WTF?).
I really hope they have a great product in both Zen and Vega and charge like it performs. Meaning: don't give the dang discounts if you have a great product. You're in business to make money, not to be our friends (even if I like a low price, charge what it's worth!). That said, if AMD is dumb enough to give me a huge discount and Zen really wins in HandBrake etc., I'm buying, since I do a large chunk of that kind of work lately and it pegs the cores all day and night. I have zero interest in 140W if I can match it for under 100W in the work I'm doing most. Games just have to get close, and they're usually limited by my GPU anyway, even after I upgrade to Vega/Pascal. I'd happily pay Intel prices if AMD wins in apps and games at under 100W vs. Intel's 140W chips. If the app tests are real (not monkeyed with), I'm probably a Zen buyer already, since the only thing that would stop me is a massive hike above Intel pricing... LOL.
Sorry, but almost everything about this is wrong. The Xbox One and PS4 SoCs used existing AMD technology: 'cat' CPU cores and GCN GPUs. Those technologies were designed to be modular and pluggable. The first GCN APU was (iirc) a lower-power 'cat' core anyway. And the two chips are very similar, so a lot of the design costs will have been shared. It's the semi-custom business that's keeping AMD afloat at the minute...
Now, if they'd gone for the Nintendo contract they would have had to spend big on R&D, as it turns out Nintendo wanted a mobile SoC. AMD don't have existing tech they can plug in to that market.
If they had skipped consoles they would've gone bankrupt. Doing consoles didn't delay any of the high-end releases, as the consoles used existing tech. If you honestly believe all this, you obviously haven't looked at AMD's financial results for a year or so. The embedded and semi-custom division has made significant profits for the company, without which I doubt they'd have been in a position to release Polaris at all. They've actually been running at general profitability for the last two quarters, but in Q3 this year they chose to write down the estimated costs of the latest change to their wafer agreement with GlobalFoundries. Taking that hit in one lump sum will probably make their financials look way better over the course of the following four quarters, so from a business point of view I can see why they'd do that.
Take that charge out of the reckoning and AMD actually had net income of $27m, with operating income of $70m, and a gross margin of 31%. (per Anandtech). And as you pointed out, they're mostly operating in traditionally low margin markets. They currently have no products at all in the higher end of either the GPU or CPU market. And yet they're turning 31% gross margin and they've increased their graphics market share significantly. Gotta say, that sounds to me like they're doing something right...
You might want to look at their debt too: AMD managed to reduce it significantly this year. And has anyone noticed that the PS4 Pro and the Polaris 10 GPUs have exactly the same number of shaders?
The PS4 Slim uses 'cat' cores and modern-enough shaders on a 16nm shrink. Cut that down a bit, then undervolt and underclock it to the chosen power envelope.
AMD have the technology, but they would need integration and masks done which is probably a few million up front cost. Nvidia probably have a warehouse full of X1 chips seeing as they struggled to find people to get them into products (which is a shame, I think it is a nice chip).
Edit to add:
NV made the Shield console & tablet, including their own game selling eco system. That is a huge amount of work for someone supposedly avoiding the market. But until now, no games companies wanted to touch them.
The question is whether AMD should have invested all that time and money into HSA. Most processors are going to sell as an APU from now on, and some workloads are going to benefit, but it still isn't clear whether the disruption to both CPU and GPU teams that they had from that integration effort was worth it. HSA was probably the core technology that got AMD this round of consoles, but for most people playing Farmville on their laptop they won't notice or care. Intel don't seem to have suffered for their "just slap it together" attitude.
That does sound a little concerning, and it makes sense why Nintendo were trying to market it as more of a console than a handheld. Hopefully you'll still want to use it as a handheld; a roughly 60% reduction in GPU clocks seems like quite a difference. On the other hand, at least it means that docking it doesn't limit the entire console's potential.