It is likely to launch at GDC 2018, which runs from 19th - 23rd March.
So Volta, Ampere and now Turing. Busy Nvidia have been, LOL.
Oh god, I can see it now: "play games on the GPU powered by AI!"
What, so I can have someone else other than my wife judge me on how terrible I am while I play, no thanks!
HAL voice: "You are a terrible shot, Dave. I can't let you play any more, Dave, you're too bad, git gud"
Turing as a name concerns me.... as the Turing test is a piece of history.
perhaps they intend to use it for AI driving cars.....
To call it Turing seems a bit out of the blue, what with Volta and Ampere both being units of measurement and having been known about for some time via the rumor mill.
Erm .... is this comment entirely serious?
This is a continuation of a decade-old scheme of naming architectures after scientists/mathematicians, which stretches right back to Tesla (G80). The only slight shift is that previously announced architectures have generally been named after physicists (or mathematician/physicists, to be more precise); Turing, amongst many things, was never really a physicist. That could be indicative of a shift in architecture focus, or it could just be that they'd run out of physicists with cool-sounding names...
It's not really a deviation from the naming scheme they've been using since Tesla i.e. names of famous scientists/engineers/mathematicians.
Edit: Beaten to it...
I got hung up on the units of measurement, didn't I.
I really hate myself for laughing at this.
Also, anyone else worried about the performance gain? With so much money being made from the previous gen, and any sort of performance boost being welcomed by those using these cards for monetary gain, I am genuinely concerned that NV will just put out a bare-bones performance upgrade. Also the prices... god damn, the prices are gonna suck.
This statement comes up all the time, and it mostly ignores how processors are designed and manufactured. Modern CPUs/GPUs are in development for several years, and architecturally they're mostly finished a year or more before they hit the market - after that it's down to physical implementation, ramping production, testing, binning, etc. Even the fabrication turnaround time for an in-production part is measured in months, three or more for current nodes. That last part is partly why AMD/Nvidia are very wary of ramping production for mining demand - they're basically betting millions on the market still being there in three months or so, and at the scale they predicted. Get it wrong and they stand to lose a lot if they have to write down stock, like AMD had to after the last Litecoin craze.
What this basically means is that R&D and architectural performance are absolutely not last-minute (or even last-year) decisions - you play your best hand or risk losing a ton of market share. Just like there is/was no 'magic CPU' for Intel to pull out in response to Ryzen (they're still fundamentally Skylake cores), as some fanboys claimed when blaming AMD for Intel's lack of progress. And it's also why, after hiring new staff and later announcing a departure from the Bulldozer line, it still took AMD years before we saw Ryzen.
What can happen is messing with product positioning and/or pricing at the last minute, and perhaps holding back production of the largest-die GPUs of the same line until yields improve (if you're reasonably confident that's sane given the competitive landscape), rather than ramping a low-yielding part and eating into margins. But it would very rarely make sense not to release it at all in some form, given the R&D and other associated costs. The lower end tends not to be released until a while later nowadays too, presumably because it targets a lower-margin market where, again, it's beneficial to wait for good yields.
If Nvidia continues to increase performance with their new chips, as they have in the last 3 or 4 years, it leaves AMD with a mountain to climb regarding the GPU market. Their efforts with the Vega architecture were less than overwhelming.
So, retailers/wholesalers are the ones benefiting the most, as I understand?
Why not get them to share the risk - i.e. get them to pre-order batches of GPUs? I realise the complexity of the supply chain, but with such ridiculous demand, surely both AMD and NV have the power to insist that their partners share the risk.
Su
The name would strongly imply it's a Crypto Mining GPU.