
Thread: Intel Gen11 GT2 GPU (Iris Plus Graphics 940) benchmarks leak

  1. #1
    HEXUS.admin
    Join Date
    Apr 2005
    Posts
    31,709
    Thanks
    0
    Thanked
    2,073 times in 719 posts

    Intel Gen11 GT2 GPU (Iris Plus Graphics 940) benchmarks leak

    Ice Lake's Gen11 graphics will feature up to 64 EUs and a 4x larger L3 cache.
    Read more.
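
    For a rough sense of what "up to 64 EUs" could mean in raw compute terms, here is a back-of-envelope estimate (a minimal sketch: the 16 FP32 FLOPS per EU per clock and the ~1.1 GHz boost clock are illustrative assumptions, not figures from the leak):

    Code:
    # Back-of-envelope peak FP32 throughput for a 64-EU Gen11 GT2 part (Python).
    eus = 64
    flops_per_eu_per_clock = 16   # assumed: 8 FP32 FMA lanes x 2 ops (multiply + add)
    clock_ghz = 1.1               # assumed boost clock, for illustration only

    peak_tflops = eus * flops_per_eu_per_clock * clock_ghz * 1e9 / 1e12
    print(f"Peak FP32 throughput: ~{peak_tflops:.2f} TFLOPS")  # ~1.13 TFLOPS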

  2. #2
    Senior Member Xlucine's Avatar
    Join Date
    May 2014
    Posts
    2,160
    Thanks
    297
    Thanked
    188 times in 147 posts
    • Xlucine's system
      • Motherboard:
      • Asus TUF B450M-plus
      • CPU:
      • 3700X
      • Memory:
      • 16GB @ 3.2 GT/s
      • Storage:
      • Crucial P5 1TB (boot), Crucial MX500 1TB, Crucial MX100 512GB
      • Graphics card(s):
      • EVGA 980ti
      • PSU:
      • Fractal Design ION+ 560P
      • Case:
      • Silverstone TJ08-E
      • Operating System:
      • W10 pro
      • Monitor(s):
      • Viewsonic vx3211-2k-mhd, Dell P2414H

    Re: Intel Gen11 GT2 GPU (Iris Plus Graphics 940) benchmarks leak

    Long overdue! If it's a U-suffix part that's pretty good, and I'm sure it'll get an MSRP twice the 2700U's. If it's a C-suffix chip they really should do better, but it's Intel, so they'll charge double the cost of the red chip anyway.

  3. #3
    Long member
    Join Date
    Apr 2008
    Posts
    2,427
    Thanks
    70
    Thanked
    404 times in 291 posts
    • philehidiot's system
      • Motherboard:
      • Father's bored
      • CPU:
      • Cockroach brain V0.1
      • Memory:
      • Innebriated, unwritten
      • Storage:
      • Big Yellow Self Storage
      • Graphics card(s):
      • Semi chewed Crayola Mega Pack
      • PSU:
      • 20KW single phase direct grid supply
      • Case:
      • Closed, Open, Cold
      • Operating System:
      • Cockroach
      • Monitor(s):
      • The mental health nurses
      • Internet:
      • Please.

    Re: Intel Gen11 GT2 GPU (Iris Plus Graphics 940) benchmarks leak

    Looks like they're testing the waters to see if they can worry the big boys at the low end.

    Maybe also a bit of a teaser to make us wonder what they've got in the oven.

    My concern is the market clout they've got and the power they have over OEMs. If Intel come out with decent discrete graphics, I can see AMD's GPU production becoming very vulnerable. Intel are the kind to use every advantage they have got, whether or not it constitutes fair play.

  4. #4
    Registered+
    Join Date
    Jun 2015
    Posts
    99
    Thanks
    0
    Thanked
    3 times in 3 posts
    • meuvoy's system
      • Motherboard:
      • MSI Z97-G45
      • CPU:
      • Intel Core i5 4460
      • Memory:
      • 8GB Kingston 1866MHz
      • Storage:
      • Adata 128GB SSD + 1.7TB HDD (in total)
      • Graphics card(s):
      • Nvidia Geforce GTX 970
      • PSU:
      • EVGA 750B2
      • Case:
      • Cougar 6GR1 Evolution
      • Operating System:
      • Windows 8.1 Pro
      • Monitor(s):
      • 29" 21:9 Philips Brilliance 298P4QJEB

    Re: Intel Gen11 GT2 GPU (Iris Plus Graphics 940) benchmarks leak

    At least we know Intel can compete with AMD; let's just wait and see if they will be able to compete with Nvidia with their Gen12 graphics chips.

  5. #5
    Senior Member
    Join Date
    May 2014
    Posts
    2,385
    Thanks
    181
    Thanked
    304 times in 221 posts

    Re: Intel Gen11 GT2 GPU (Iris Plus Graphics 940) benchmarks leak

    What we have to remember is that these are integrated graphics, which have very different builds and functionality from discrete, so there might not be much transferability. But it does bode well for their discrete department, because the skills and understanding would be transferable.

  6. #6
    root Member DanceswithUnix's Avatar
    Join Date
    Jan 2006
    Location
    In the middle of a core dump
    Posts
    12,978
    Thanks
    778
    Thanked
    1,586 times in 1,341 posts
    • DanceswithUnix's system
      • Motherboard:
      • Asus X470-PRO
      • CPU:
      • 5900X
      • Memory:
      • 32GB 3200MHz ECC
      • Storage:
      • 2TB Linux, 2TB Games (Win 10)
      • Graphics card(s):
      • Asus Strix RX Vega 56
      • PSU:
      • 650W Corsair TX
      • Case:
      • Antec 300
      • Operating System:
      • Fedora 39 + Win 10 Pro 64 (yuk)
      • Monitor(s):
      • Benq XL2730Z 1440p + Iiyama 27" 1440p
      • Internet:
      • Zen 900Mb/900Mb (CityFibre FttP)

    Re: Intel Gen11 GT2 GPU (Iris Plus Graphics 940) benchmarks leak

    Quote Originally Posted by Tabbykatze View Post
    What we have to remember is that these are integrated graphics, which have very different builds and functionality from discrete, so there might not be much transferability.
    How so? AMD use Vega cores in integrated and their top end GPUs. Nvidia seem to use the same cores in their desktop parts as the Nintendo Switch.

    Now Intel have traditionally sucked at scaling their graphics up beyond something tiny, but that's just them.

  7. #7
    Senior Member
    Join Date
    Jun 2013
    Location
    ATLANTIS
    Posts
    1,207
    Thanks
    1
    Thanked
    28 times in 26 posts

    Re: Intel Gen11 GT2 GPU (Iris Plus Graphics 940) benchmarks leak

    ...yea and Raja pliz give us HBM for the iGPU

  8. #8
    Senior Member
    Join Date
    Jan 2006
    Posts
    362
    Thanks
    63
    Thanked
    44 times in 30 posts
    • hb904460's system
      • Motherboard:
      • Asus A88XM-PLUS
      • CPU:
      • AMD A6-5400K
      • Memory:
      • 8GB DDR3 @ 1866MHz
      • Storage:
      • 240GB Crucial MX500 + 500GB WD Caviar Blue
      • PSU:
      • Antec NeoEco 620W
      • Case:
      • Silverstone PS07
      • Operating System:
      • Windows 10
      • Monitor(s):
      • Viewsonic VA2037m

    Re: Intel Gen11 GT2 GPU (Iris Plus Graphics 940) benchmarks leak

    The article doesn't mention the RAM speed of the systems tested, which will have a significant effect on the GPU results.
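
    For context on why the RAM speed matters so much here: the iGPU shares the CPU's memory controller, so its available bandwidth scales directly with the memory's transfer rate. A minimal sketch of the arithmetic (the channel counts and speeds below are illustrative assumptions, not the tested configurations):

    Code:
    # Theoretical peak bandwidth of the system RAM shared by the CPU and iGPU (Python).
    # bandwidth (GB/s) = 64-bit channels x transfers per second x 8 bytes per transfer
    def mem_bandwidth_gbps(channels_64bit, mega_transfers_per_s):
        return channels_64bit * mega_transfers_per_s * 1e6 * 8 / 1e9

    print(mem_bandwidth_gbps(2, 2400))  # dual-channel DDR4-2400:   ~38.4 GB/s
    print(mem_bandwidth_gbps(2, 3733))  # 128-bit LPDDR4X-3733 bus: ~59.7 GB/s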

  9. #9
    Senior Member
    Join Date
    May 2014
    Posts
    2,385
    Thanks
    181
    Thanked
    304 times in 221 posts

    Re: Intel Gen11 GT2 GPU (Iris Plus Graphics 940) benchmarks leak

    Quote Originally Posted by DanceswithUnix View Post
    How so? AMD use Vega cores in integrated and their top end GPUs. Nvidia seem to use the same cores in their desktop parts as the Nintendo Switch.

    Now Intel have traditionally sucked at scaling their graphics up beyond something tiny, but that's just them.
    From what I understand, the Intel Iris graphics are baked directly into the monolithic die and are designed to specifically interlink with the CPU they are built within. However, the Vega cores (the same Vega on both the Intel and AMD "APUs") are carved-out Vega cores designed to work with the Infinity Fabric, IIRC (struggling to find the right resources), so they are a separate entity that can be communicated with regardless of state?

  10. #10
    Senior Member
    Join Date
    Dec 2003
    Location
    Taichung City
    Posts
    898
    Thanks
    281
    Thanked
    172 times in 121 posts
    • mtyson's system
      • Motherboard:
      • Gigabyte GA-B85M-HD3
      • CPU:
      • Intel Core i7 4790T
      • Memory:
      • 12GB
      • Storage:
      • Sandisk 128GB SSD + Kingston 500GB SSD + NAS etc
      • Graphics card(s):
      • Sapphire Radeon RX 580 Nitro+
      • PSU:
      • Corsair 430W
      • Case:
      • Zalman Z9 Plus
      • Operating System:
      • Windows 10
      • Monitor(s):
      • AOC 31.5-inch VA QHD monitor
      • Internet:
      • 100MB Virgin fibre

    Re: Intel Gen11 GT2 GPU (Iris Plus Graphics 940) benchmarks leak

    New Intel Graphics Tweet:

    "With Intel Gen11 Graphics, we have specialized hardware for super-fast video decoding and encoding."


  11. #11
    root Member DanceswithUnix's Avatar
    Join Date
    Jan 2006
    Location
    In the middle of a core dump
    Posts
    12,978
    Thanks
    778
    Thanked
    1,586 times in 1,341 posts
    • DanceswithUnix's system
      • Motherboard:
      • Asus X470-PRO
      • CPU:
      • 5900X
      • Memory:
      • 32GB 3200MHz ECC
      • Storage:
      • 2TB Linux, 2TB Games (Win 10)
      • Graphics card(s):
      • Asus Strix RX Vega 56
      • PSU:
      • 650W Corsair TX
      • Case:
      • Antec 300
      • Operating System:
      • Fedora 39 + Win 10 Pro 64 (yuk)
      • Monitor(s):
      • Benq XL2730Z 1440p + Iiyama 27" 1440p
      • Internet:
      • Zen 900Mb/900Mb (CityFibre FttP)

    Re: Intel Gen11 GT2 GPU (Iris Plus Graphics 940) benchmarks leak

    Quote Originally Posted by Tabbykatze View Post
    From what I understand, the Intel Iris graphics are baked directly into the monolithic die and are designed to specifically interlink with the CPU they are built within. However, the Vega cores (the same Vega on both the Intel and AMD "APUs") are carved-out Vega cores designed to work with the Infinity Fabric, IIRC (struggling to find the right resources), so they are a separate entity that can be communicated with regardless of state?
    I would have thought that the heterogeneous computing stuff that AMD were pushing would mean their Vega cores were far more CPU-aware than Intel's.

  12. #12
    Senior Member
    Join Date
    May 2014
    Posts
    2,385
    Thanks
    181
    Thanked
    304 times in 221 posts

    Re: Intel Gen11 GT2 GPU (Iris Plus Graphics 940) benchmarks leak

    Quote Originally Posted by mtyson View Post
    New Intel Graphics Tweet:

    "With Intel Gen11 Graphics, we have specialized hardware for super-fast video decoding and encoding."

    Sometimes Intel marketing makes no sense: what is it for, gaming or productivity?

    Quote Originally Posted by DanceswithUnix View Post
    I would have thought that the heterogeneous computing stuff that AMD were pushing would mean their Vega cores were far more CPU-aware than Intel's.
    I'm not sure what you mean by CPU-aware, but I think you mean that they are more focused on one architecture over the other? If so, then the i7-8809G with Vega M graphics might be an example of being CPU-specific?

  13. #13
    root Member DanceswithUnix's Avatar
    Join Date
    Jan 2006
    Location
    In the middle of a core dump
    Posts
    12,978
    Thanks
    778
    Thanked
    1,586 times in 1,341 posts
    • DanceswithUnix's system
      • Motherboard:
      • Asus X470-PRO
      • CPU:
      • 5900X
      • Memory:
      • 32GB 3200MHz ECC
      • Storage:
      • 2TB Linux, 2TB Games (Win 10)
      • Graphics card(s):
      • Asus Strix RX Vega 56
      • PSU:
      • 650W Corsair TX
      • Case:
      • Antec 300
      • Operating System:
      • Fedora 39 + Win 10 Pro 64 (yuk)
      • Monitor(s):
      • Benq XL2730Z 1440p + Iiyama 27" 1440p
      • Internet:
      • Zen 900Mb/900Mb (CityFibre FttP)

    Re: Intel Gen11 GT2 GPU (Iris Plus Graphics 940) benchmarks leak

    Quote Originally Posted by Tabbykatze View Post
    I'm not sure what you mean by CPU-aware, but I think you mean that they are more focused on one architecture over the other? If so, then the i7-8809G with Vega M graphics might be an example of being CPU-specific?
    Erm, no

    You said you thought the Intel GPUs were baked into the CPUs that they were a part of. I was merely pointing out that the AMD APUs have their GPU cores very tightly bound to the CPU giving a unified compute system with shared memory map etc, despite the fact that those same Vega cores can be found in an external GPU. Integration into the CPU ecosystem isn't an indicator of how well they would work in an external GPU.

    Basically we have no idea how well Intel's Gen11 would work if those EUs were given their own GDDR RAM and plonked on a PCIe interface, but there is no reason that those functional blocks can't be used in a GPU chip.

  14. #14
    Senior Member
    Join Date
    May 2014
    Posts
    2,385
    Thanks
    181
    Thanked
    304 times in 221 posts

    Re: Intel Gen11 GT2 GPU (Iris Plus Graphics 940) benchmarks leak

    Quote Originally Posted by DanceswithUnix View Post
    Erm, no

    You said you thought the Intel GPUs were baked into the CPUs that they were a part of. I was merely pointing out that the AMD APUs have their GPU cores very tightly bound to the CPU giving a unified compute system with shared memory map etc, despite the fact that those same Vega cores can be found in an external GPU. Integration into the CPU ecosystem isn't an indicator of how well they would work in an external GPU.

    Basically we have no idea how well Intel's Gen11 would work if those EUs were given their own GDDR RAM and plonked on a PCIe interface, but there is no reason that those functional blocks can't be used in a GPU chip.
    Ahhh, I see what you mean. I believe the Intel Iris graphics were designed alongside the CPU and not as a separate entity, and they do heavily rely on CPU-specific resources. However, the Vega side started as an independent, PCIe-based discrete design and was then brought down to an IF-based (which is basically advanced PCIe) integrated compute part. To me, at least, the design considerations of the two graphical systems apply in very different ways. That, and we know Vega exists away from the CPU, but until Intel either bring their graphics into discrete or design a brand-new architecture, theirs does not.

    I'm on the fence about whether integration into the CPU ecosystem is an indicator of how well they would work away from the CPU; Vega and the integrated Intel graphics are two very different beasts. I, frustratingly, can't find any decent articles describing the interconnect between the Intel IGP and the CPU cores, but I'm very confident it is not PCIe-based, so you would have to rip out the IGP and either completely reconfigure it or create a translator between PCIe and the IGP, which could dramatically harm performance. It is because of that facet that I don't believe the Intel graphics can just be lifted out and made discrete.

  15. #15
    Senior Member
    Join Date
    Jun 2013
    Location
    ATLANTIS
    Posts
    1,207
    Thanks
    1
    Thanked
    28 times in 26 posts

    Re: Intel Gen11 GT2 GPU (Iris Plus Graphics 940) benchmarks leak

    A few years back I read an article like this... do you know why Intel Quick Sync is fast? Because the iGPU has access to certain CPU resources, such as the cache!!
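
    For anyone wanting to try that fixed-function block for themselves, here is a minimal sketch of a Quick Sync transcode driven from Python via ffmpeg (this assumes an ffmpeg build with QSV support and an Intel iGPU exposed to the OS; the filenames and bitrate are placeholders):

    Code:
    # Hand the H.264 encode to the Quick Sync hardware via ffmpeg's h264_qsv encoder.
    import subprocess

    subprocess.run([
        "ffmpeg", "-y",
        "-i", "input.mp4",      # placeholder source clip
        "-c:v", "h264_qsv",     # use the Quick Sync fixed-function encoder
        "-b:v", "5M",           # placeholder target bitrate
        "output.mp4",
    ], check=True)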
