Built on 10nm, some of these first GPUs are confirmed as including ray tracing hardware.
Nice, looking forward to seeing if Intel can disrupt the GFX market.
AMD tried making a one-size-fits-all architecture and we got Vega. It was very good in the data centre market, apart from the fact that the majority of software favours, or only supports, Nvidia. But it left a lot to be desired in the gaming market.
Which was a Raja Koduri design...
Now we have Intel seemingly doing the same. I hope Raja has learnt from his mistakes.
If he has not, then Intel have definitely hired the wrong guy.
But at this level (except when politics are involved) you are only allowed to make the same mistake once.
So, 14nm++++++++++, I guess.
BTW, I'm still not sold on the whole "June 2020" rumour. If you look closely you'll notice that *all* three vanity license plates in that article list "JUN" as the month. I'd hazard a guess that it simply means the plate needs renewal come June and nothing else.
I'm also not really counting on Intel to "disrupt" the graphics market. They've tried several times before, and they've always failed. The only reason Intel is a massive player in this market is the integrated graphics solutions in most of its CPUs. (Slightly OT: Why doesn't AMD do something similar with the Ryzen 3000 CPUs that only contain one CCD?)
Intel will likely fail in their quest to take over the desktop GPU market. It's a tough business with two major players already.
They are probably trying to strategically trickle expertise and technology back to their integrated GPU and CPU people, as they are genuinely worried about AMD APUs. I have one I need to try in the other room.
AMD will certainly build on the CCD principles and massively improve on-package graphics for processors in the future. Their mainframe-level experience and design values help them here. What if, in the future:
we all want low-wattage, high-performance, compact APUs, most games run at 4K, and everyone ends up happy in 4K?
AMD wins.
Intel's strategy is clearly a preemptive strike (a very late preemptive strike) on AMD.
Obviously Nvidia pose a huge threat, but I'd honestly say AMD are the real issue, potentially causing catastrophic damage to Intel.
Intel have tried to buy their way into other sectors in the past. I feel the GPU sector is a better bet for them, but ultimately I don't feel they can make a realistic impact.
I believe the recent rumour that they will integrate the GPU, not only for the reasons listed, but mainly because I'm sure they realise this is their only realistic way of penetrating the market (I'm aware they have integrated GPUs currently, yes).
For a company with no real history, reliability or knowledge in GPUs, they would have to really hit the ground running and somehow undercut both AMD AND Nvidia, not to mention all the marketing costs.
I'm not convinced it's the desktop discrete GPU market Intel are really after.
The money maker is in render farms and compute solutions a la CUDA. I suspect the thinking is: we may as well do a cut-down version for the iGPU, produce a few discrete boards, and see how they fare.
But they do have history in the GPU market - just a really poor one. When you consider how many complaints (maybe unfairly) AMD/Nvidia get over their driver updates, Intel have always been on a far lower level! Rarely updated, barely supported, and if you have an OEM special, zero updates despite there being a newer driver for your product (looks at work's Lenovo box under my desk). Add in all the previous promises of delivering a competitive iGPU with the next CPU, which has never happened, and I am just not expecting Intel to deliver.
Correct. I specifically mentioned Larrabee because it very much reminds me of what they're trying to achieve now. It was supposed to be a GPGPU, then it was relegated to pure computational work (Xeon Phi) and now it's gone altogether, with the last iteration being Knights Mill from 2017.
Yes they have. They bought a graphics company, marketed the technology as Intel's, and created lackluster discrete graphics cards which didn't sell all that well, until they killed the line off. Google "i740".
Then with Larrabee they decided they could do graphics with really wide SSE-style instructions on lots of Atom-class cores, because someone high up said that everything from embedded to graphics cards should run x86 instructions. Intel have always had a problem with drivers, so making a software-centric GPU seemed a dumb move; they eventually gave up and released the product as a compute-only card with no graphics output.
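For anyone wondering what "doing graphics with wide vector instructions" looks like in practice, here's a minimal sketch (my own illustration, not Larrabee code): one SIMD instruction shades several pixels at once instead of looping over them one by one. I'm using plain 128-bit SSE (4 floats per register) since any x86 compiler has it; Larrabee's own vectors were 512 bits wide, so 16 floats per instruction.

/* Minimal sketch of SIMD software shading, the idea behind Larrabee:
 * one vector instruction operates on several pixels at once. */
#include <stdio.h>
#include <xmmintrin.h>  /* SSE intrinsics */

int main(void)
{
    /* Red channel of four adjacent pixels, packed into one register. */
    __m128 red        = _mm_set_ps(0.9f, 0.7f, 0.5f, 0.3f);
    /* A per-pixel lighting factor, as a shader stage would compute. */
    __m128 brightness = _mm_set_ps(0.25f, 0.5f, 0.75f, 1.0f);

    /* One multiply instruction shades all four pixels simultaneously. */
    __m128 shaded = _mm_mul_ps(red, brightness);

    float out[4];
    _mm_storeu_ps(out, shaded);
    for (int i = 0; i < 4; i++)
        printf("pixel %d: %.3f\n", i, out[i]);
    return 0;
}

Scale that register to 16 lanes, spread it across dozens of small x86 cores, and you have Larrabee's pitch. The catch, as above, is that the entire graphics pipeline then lives in software - which is exactly where Intel's driver record inspires the least confidence.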
Intel have no problem making boards and shipping products, but the one thing Koduri could show them is how to make graphics compute scale beyond small integrated units, as even their biggest integrated graphics hit a wall before getting big enough to need a freestanding card.
I'm expecting this to be expensive, hot and slow, with buggy drivers and reliable hardware. If they are prepared to lose money on it, it might not be expensive.