This teaser video is the first post on a new @IntelGraphics Twitter account.
Looks very Vega 64-esque
ftfy: Intel teases exit from discrete graphics cards (again) for 2021
Hard to see how Intel will manage to find a space in a highly competitive market which Nvidia and AMD already seem to be filling.
https://www.reddit.com/r/Amd/comment...the_vega_m_gl/
LMAO, no way to report driver bugs even though Intel is in charge of drivers for the Vega M GPUs they use. They really need to put more effort into driver support.
Well, they already do add-in coprocessors; probably not much work required to rejig these to be more output- and consumer-focused.
And with Nvidia adding an integer processor to their RTX chips, that's treading on CPU toes.
Raja now has access to billions of dollars and the world's most advanced fabs; let's guess what he'll come up with.
Iota (16-08-2018)
I'm sure they'll get there eventually, but I doubt the 1st, 2nd or 3rd generation of their cards will trouble the top-end Nvidia GPUs of 2020.
Those add-in co-processors are essentially a lot of Atom cores on a single chip. They're not going to accelerate modern game engines to any meaningful extent. Far more likely that they're starting off from the cores in their IGPs and seeing if they can scale them up to large GPUs.
Not convinced that follows. CPUs do a lot more than just a few basic integer calculations. The whole point of GPGPU is that the GPU is a simple but very wide processor that's ideal for running simple instructions over and over again. There's a definite progressive logic to adding dedicated INT processing in the same vein. I mean, theoretically you've been able to run INT calculations on a GPU since GPGPU first started - you'd just have to cast to floats and back, and you'd risk some inaccurate results due to precision and rounding. Adding native INT cores to a massively parallel processor just makes those calculations quicker and more reliable. It doesn't change the principle that you're doing simple parallel tasks repeatedly, rather than interpreting complex branch logic...
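A quick sketch of the precision problem mentioned above (this is just an illustration on the CPU with NumPy, not GPU code): a 32-bit float has a 24-bit significand, so once you cast integers above 2^24 to float and back, some values silently round to a neighbour.

```python
import numpy as np

# 2**24 + 1 = 16777217 cannot be represented exactly in float32
original = np.int32(2**24 + 1)

# Simulate the old GPGPU trick: cast to float, compute, cast back
as_float = np.float32(original)   # rounds to 16777216.0
round_trip = np.int32(as_float)

print(original, round_trip)       # 16777217 16777216
# The round trip loses the value - native INT units avoid this entirely
```

This is exactly why dedicated INT pipes (as on Turing) are both faster and more reliable than faking integer maths through the float units.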
It would make more sense for them to keep focus on AI-specific products, but I guess the offshoot of that is to start treading into both Nvidia and AMD areas in terms of multiple GPU cores for other markets. I just don't see the sense in them committing to discrete desktop graphics cards, especially as they'll also have to do a lot of legwork for drivers and building up that platform, as well as additional resources for developers etc. Intel is being squeezed from multiple angles with regard to its CPU output; Arm looks to have won the mobile war (and via Qualcomm is pushing into Cellular PCs), and AMD is competing very fiercely for consumer PCs and the lucrative x86 workstation and server market.
Certainly they could, but it's a big risk when there are already two well-established players in the market. It'd be cheaper for them to buy Nvidia.
Someone did a really good explanation of why this isn't relatable, but I'm struggling to find it, so I'll paraphrase what I remember. They used the terminology that the on-die GPU shares a lot of the core resources with the CPU, as well as it not being connected by conventional means. So simply carving it off and scaling it up won't really work, because that GPU silicon on the CPU is designed to be scaled down and trimmed within an inch of its life for max power efficiency. It would be an interesting precedent if a product that was designed to be small and sip power bolted to a CPU could be detached and scaled up to dedicated-card territory.
From what I understand, the design used in the Intel HD on-chip GPUs just isn't suitable for breaking out and scaling up; it just wasn't designed with that in mind.
Nvidia created a graphics unit that they could put into Tegra mobile and tablet chips, and also use in their desktop cards to great effect. So it can be done, but the difference here is that Nvidia are good at doing graphics, whereas I doubt Intel can pull off the same trick.