Raja Koduri is redefining this Intel dGPU to "enter the market with a bang" says analyst.
At least Intel is big enough to stick it to Nvidia's GPP and promise partners whatever they need, as they have their own fabs.
The big question is whether Raja is using Intel IP for the dGPUs or drawing on his mental notes from AMD's design methodology.
Very curious to see another player in the market, but concerned for AMD, as there will be another player known to exercise anti-competitive practices to AMD's detriment. </tin-foil>
Intel [Yawn], but to be honest I don't care. We could really do with more competition in the high-end graphics card market if Intel aims for that.
Paying £700 (now £1,000 due to mining) for a 1080 Ti, which is already two-year-old tech, is getting ridiculous and out of hand.
C'mon Intel, chop chop!!
AMD will be looking at any hardware through a microscope to see if Intel has been naughty, when and if it's released!
The thing is, it isn't about what Intel can offer, because the only thing that matters is a commanding lead in the discrete GPU market. That is something only Nvidia has, and I don't see Intel upending that soon. OK, perhaps if they bundle a graphics chip with every motherboard chipset and ride out the lawsuit long enough to starve Nvidia of funds, but that would take a while.
I expect the real value is that he can look over the designs for current integrated GPUs and past failures like Larrabee and say "it didn't scale up because..." so they can fix their problems.
AMD and Intel have an extensive patent cross licensing deal which almost certainly covers anything done here. Also, AMD has a technical lead in graphics and you don't stay at (well, near in this case) the front by looking over your shoulder at what the losers are up to.
True or false: an Intel GPU, e.g. the 600 series etc., beats any AMD or Nvidia part in performance per watt (efficiency) when all are scaled to the same performance bracket. So if Intel makes a card in the 1080 Ti performance bracket it will use ~10-20% less power.
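Just to put rough numbers on that claim - a minimal sketch in Python, where the ~250 W board power for a 1080 Ti-class card is an assumed reference figure, not a measurement:

[code]
# Back-of-envelope for the claim above: if a part uses X% less power at
# the same performance bracket, what board power does that imply?
REFERENCE_POWER_W = 250.0  # assumed 1080 Ti-class board power

def power_at_same_performance(reference_w: float, savings: float) -> float:
    """Board power implied by drawing `savings` fraction less power."""
    return reference_w * (1.0 - savings)

for savings in (0.10, 0.20):
    watts = power_at_same_performance(REFERENCE_POWER_W, savings)
    print(f"{savings:.0%} less power -> ~{watts:.0f} W")
[/code]

So the claim amounts to a ~200-225 W card at 1080 Ti performance.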
False. Perhaps if you scale AMD graphics *down* to Intel levels they are comparable, but scaling up brings whole new problems; if Intel were capable of solving those, Larrabee would have worked as a product. Just scaling up to the level of AMD's integrated graphics required Iris Pro to include an extra RAM chip, which adds a lot to the silicon area.
Or to put it another way: if scaling up was easy, then SLI/CrossFire would still work.
Raven Ridge is more efficient at both CPU-bound and GPU-bound tasks (assuming the Swift 3 and Spin 5 are both configured to the same TDP).
Without more details we just don't know. The only thing I think we can safely say is that Arctic Sound is a really confusing name for a graphics chip.
Intel doing a discrete GPU? Hmm, that sounds familiar... anyone else remember Larrabee??
So from past experience this will be binned in 12 months' time when they realise they can't beat the established competitors.
They didn't have Raja Koduri when they were developing Larrabee...
EDIT: I'm not convinced Intel were ever that committed to Larrabee as a graphics processor anyway - I think they just wanted to get into parallel compute and thought it'd be easy enough to also run graphical workloads on it (kind of backwards to how ATI and Nvidia produced graphics processors that then turned out to be great for compute). The accelerators that came out of the program - the Xeon Phi Knights Corner/Landing boards - ended up being pretty popular in HPC and research, which are small but lucrative markets. With a GPU specialist on board, I rather suspect they'll be able to do significantly better in the graphics market that previously eluded them...
Intel don't do small markets, or tight-margin markets. AIUI they became fairly popular due to Intel seeding the market by giving away entire supercomputers' worth of boards in the early days, so I doubt they made much overall.
The entire project was management-level folly: the mantra chant that anything that can be done by any processor can be done by an x86, so if you made the SSE instructions *really* wide, that included graphics.
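For what it's worth, that pitch looks something like this toy sketch (Python/NumPy standing in for the vector unit; Larrabee's LRBni really was 16 lanes of 32 bits, but the shading maths here is made up for illustration):

[code]
import numpy as np

# Toy version of "really wide SSE does graphics": shade 16 pixels per
# vector "instruction", one per 32-bit lane of a 512-bit register.
LANES = 16

def shade(base: np.ndarray, light: np.ndarray) -> np.ndarray:
    """One vector op's worth of work: modulate 16 pixels at once."""
    return np.clip(base * light, 0.0, 1.0)

base = np.full(LANES, 0.8, dtype=np.float32)            # base intensity, 16 pixels
light = np.linspace(0.2, 1.2, LANES, dtype=np.float32)  # per-pixel lighting term
print(shade(base, light))
[/code]

The catch is all the non-arithmetic work a real GPU does in fixed-function hardware, much of which Larrabee had to handle in software on those same cores.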
Another player in the market can only be a good thing so long as the game is played by the rules.
Possibly, and they do appear to have used some of the tech, but they were not shy about announcing early doors that they would (re-)enter the graphics card market, only to back-pedal months later. Or if they didn't announce it that way, IIRC it was certainly how tech sites reported it at the time.
Arctic Sound. What does the Arctic sound like? Lots of rushing wind and a grunting polar bear? So noisy fans and a growling snarl? At least it's not too well known for RGB LEDs. Unless they take inspiration from the Northern Lights...