Intel is still better than Apple, although I probably would have preferred seeing him at Tesla. :)
I don't believe he will go to Intel or even Nvidia. I have a gut feeling Intel will have said to AMD, "you want us to buy & embed your chips, then get rid of xyz". It all seems conveniently timed. They've bullied for years and they'll never change.
It's official:
https://newsroom.intel.com/news-rele...i-joins-intel/
Quote:
Raja Koduri Joins Intel as Chief Architect to Drive Unified Vision across Cores and Visual Computing
Quote:
Intel to Expand Strategy to Deliver High-End, Discrete Graphics Solutions
Quote:
SANTA CLARA, Calif., Nov. 8, 2017 – Intel today announced the appointment of Raja Koduri as Intel chief architect, senior vice president of the newly formed Core and Visual Computing Group, and general manager of a new initiative to drive edge computing solutions. In this position, Koduri will expand Intel’s leading position in integrated graphics for the PC market with high-end discrete graphics solutions for a broad range of computing segments.
Billions of users today enjoy computing experiences powered by Intel’s leading cores and visual computing IP. Going forward under Koduri’s leadership, the company will unify and expand differentiated IP across computing, graphics, media, imaging and machine intelligence capabilities for the client and data center segments, artificial intelligence, and emerging opportunities like edge computing.
“Raja is one of the most experienced, innovative and respected graphics and system architecture visionaries in the industry and the latest example of top technical talent to join Intel,” said Dr. Murthy Renduchintala, Intel’s chief engineering officer and group president of the Client and Internet of Things Businesses and System Architecture. “We have exciting plans to aggressively expand our computing and graphics capabilities and build on our very strong and broad differentiated IP foundation. With Raja at the helm of our Core and Visual Computing Group, we will add to our portfolio of unmatched capabilities, advance our strategy to lead in computing and graphics, and ultimately be the driving force of the data revolution.”
Koduri brings to Intel more than 25 years of experience in visual and accelerated computing advances across a broad range of platforms, including PCs, game consoles, professional workstations and consumer devices. His deep technical expertise spans graphics hardware, software and system architecture.
“I have admired Intel as a technology leader and have had fruitful collaborations with the company over the years,” Koduri said. “I am incredibly excited to join the Intel team and have the opportunity to drive a unified architecture vision across its world-leading IP portfolio that helps accelerate the data revolution.”
Koduri, 49, joins Intel from AMD, where he most recently served as senior vice president and chief architect of the Radeon Technologies Group. In this role, he was responsible for overseeing all aspects of graphics technologies used in AMD’s APU, discrete GPU, semi-custom and GPU compute products. Prior to AMD, Koduri served as director of graphics architecture at Apple Inc., where he helped establish a leadership graphics sub-system for the Mac product family and led the transition to Retina computer displays.
Koduri will officially start in his new role at Intel in early December.
Just seen it being made official. I find it madness that he can switch to what is effectively a competitor to AMD as a whole. But seeing as he has let RTG stagnate as badly as the CPU side did, maybe fresh blood at the steering wheel will be good. I wonder whether, given the results of Vega, he was actually "let go" :P
If Lisa can inject some of her drive and passion into RTG like how she did the CPU division then maybe Navi will be a good successor!
When Jim Keller joined Tesla, he not only hired some ex-AMD people, but it apparently led to this:
https://www.cnbc.com/2017/09/20/tesl...-with-amd.html
AMD collaborating with Tesla on making a new chip. This probably indicates it's more likely Intel will work closer with AMD, not only in licensing IP, but in developing some aspects of the physical GPU.
After all, if Intel already had a decent uarch, they wouldn't be using a custom AMD chip for their high-end laptop parts, so it looks like a different direction from what they were doing before.
If anything, as the chap over at HardOCP said, Intel has let go of a lot of its graphics-related engineers.
So that indicates to me they intend to do one of the following:
1.) License and use AMD base technology to implement their GPUs — AMD develops the base uarch and Intel implements its own chips based on it, distinct from AMD's.
2.) Intel buys RTG, so it gets all the engineers and IP.
3.) Intel buys part of RTG and shares in R&D costs.
4.) AMD develops the graphics parts of Intel CPUs.
5.) Intel develops its own base technology and design.
The last option has not worked as well as they probably expected, hence the sudden change, it seems.
Ooh, could this mean Intel will throw more money at trying to break into the gaming GPU market?
Yup, they were caught out by the rise of mobile and thoroughly beaten by ARM in that space. The rise of machine learning and AI doesn't seem like it's going to disappear anytime soon, and Intel don't want to be left behind again - currently AMD and particularly Nvidia are trashing them in that space, and I think this is Intel's response.
I guess it can't be helped, but I wonder how Lisa's taking it. Imagine: you send your top engineer (head of an entire sub-brand) to work with a rival, and the next thing you know, he's taking leave while his contract is being terminated.
Quote:
This is in no way a surprise given the close work Raja must have done facilitating Intel's move into using semi-custom Radeons for the upcoming H-series Core processors
Kudos to Raja on pulling it off, but I'd wager those at AMD aren't too happy? :O_o1:
I wonder if cancelling Knights Hill was one of his first acts: https://techreport.com/news/32829/in...xeon-phi-chips
Whilst it sounds like the problems with Intel's 10nm process killed it, you usually need a plan B to switch to before you pull the plug on something like that.