http://www.semiaccurate.com/2009/12/...ture-variants/
It is SemiAccurate/Charlie, so definitely in rumor status for now.
http://news.cnet.com/8301-13924_3-10...orsPicksArea.0
Confirmed on CNET. Larrabee crashes and burns; Intel's second foray into discrete graphics is stillborn.
According to AnandTech, it's only the initial retail generation that's canceled - the Larrabee project is still ongoing.
http://anandtech.com/weblog/showpost.aspx?i=659
From what I can gauge from the information, the project lives on; only the current form of Larrabee is canned.
It's a shame really, it looked promising and would have forced other card manufacturers to bring out new GPUs instead of simply renaming their GPU cores every 6 months ;)
It makes me wonder: now that they've successfully shown that 16 x86 cores over a PCI-E bus can work as a GPU, and more recently that 48 x86 cores can be put into a single CPU, why would they accept the slowdown of going over the bus instead of just cramming as many cores as possible onto the CPU and using that for both CPU and GPU?
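The bus-vs-on-die question above comes down to bandwidth. A rough back-of-envelope sketch, using assumed era-appropriate figures (PCIe 2.0 x16 at roughly 8 GB/s effective; the aggregate on-die bandwidth is purely illustrative, not a measured number for any real chip):

```python
# Back-of-envelope: time to move a working set over PCIe vs keeping it on-die.
# Both bandwidth figures below are assumptions for illustration only.

PCIE_2_X16_GBPS = 8.0    # GB/s, rough effective one-way rate for PCIe 2.0 x16
ON_DIE_GBPS = 200.0      # GB/s, illustrative aggregate on-die bandwidth

def transfer_ms(megabytes: float, gbps: float) -> float:
    """Milliseconds to move `megabytes` of data at `gbps` gigabytes/second."""
    return megabytes / 1024.0 / gbps * 1000.0

working_set_mb = 256.0   # hypothetical per-frame data (textures, vertices)
print(f"over PCIe: {transfer_ms(working_set_mb, PCIE_2_X16_GBPS):.2f} ms")
print(f"on-die:    {transfer_ms(working_set_mb, ON_DIE_GBPS):.2f} ms")
```

With those assumed numbers, just shipping 256 MB across the bus eats ~31 ms (about two 60 fps frames) versus ~1.25 ms on-die, which is the intuition behind wanting the cores on the CPU package.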
Because x86 is simply far too inefficient and expensive to cram a crapload of cores into a PCIe card (or even a CPU package) and call it a competitive GPU.
I think Intel found out that breaking into the high-end GPU market is a bigger project than they thought. This was difficult for them for the same reason making a CPU would be difficult for ATi or NVIDIA: their core expertise and experience is elsewhere.
It's not that it's a project too big for Intel. It's that Intel engineers are being hamstrung by the idiotic x86-everywhere mentality of upper-management, or 'not invented here' syndrome.