Results 1 to 16 of 22

Thread: Intel teases discrete graphics card for 2020

  1. #1
    HEXUS.admin
    Join Date
    Apr 2005
    Posts
    31,709
    Thanks
    0
    Thanked
    2,073 times in 719 posts

    Intel teases discrete graphics card for 2020

    This teaser video is the first post on a new @IntelGraphics Twitter account.
    Read more.

  2. #2
    Senior Member
    Join Date
    May 2014
    Posts
    2,385
    Thanks
    181
    Thanked
    304 times in 221 posts

    Re: Intel teases discrete graphics card for 2020

    Looks very Vega 64-esque.

  3. #3
    root Member DanceswithUnix's Avatar
    Join Date
    Jan 2006
    Location
    In the middle of a core dump
    Posts
    12,986
    Thanks
    781
    Thanked
    1,588 times in 1,343 posts
    • DanceswithUnix's system
      • Motherboard:
      • Asus X470-PRO
      • CPU:
      • 5900X
      • Memory:
      • 32GB 3200MHz ECC
      • Storage:
      • 2TB Linux, 2TB Games (Win 10)
      • Graphics card(s):
      • Asus Strix RX Vega 56
      • PSU:
      • 650W Corsair TX
      • Case:
      • Antec 300
      • Operating System:
      • Fedora 39 + Win 10 Pro 64 (yuk)
      • Monitor(s):
      • Benq XL2730Z 1440p + Iiyama 27" 1440p
      • Internet:
      • Zen 900Mb/900Mb (CityFibre FttP)

    Re: Intel teases discrete graphics card for 2020

    intel teases exit from discrete graphics cards (again) for 2021
    ftfy

  4. #4
    Now 100% Apple free cheesemp's Avatar
    Join Date
    Apr 2007
    Location
    Near the New forest
    Posts
    2,948
    Thanks
    354
    Thanked
    255 times in 173 posts
    • cheesemp's system
      • Motherboard:
      • ASUS TUF x570-plus
      • CPU:
      • Ryzen 3600
      • Memory:
      • 16gb Corsair RGB ram
      • Storage:
      • 256Gb NVMe + 500Gb TcSunbow SDD (cheap for games only)
      • Graphics card(s):
      • RX 480 8Gb Nitro+ OC (with auto OC to above 580 speeds!)
      • PSU:
      • Cooler Master MWE 750 bronze
      • Case:
      • Gamemax f15m
      • Operating System:
      • Win 11
      • Monitor(s):
      • 32" QHD AOC Q3279VWF
      • Internet:
      • FTTC ~35Mb

    Re: Intel teases discrete graphics card for 2020

    Quote Originally Posted by Tabbykatze View Post
    Looks very Vega 64-esque.
    Yes, I wonder if they got some 'ideas' when they implemented the AMD GPU... They could certainly have afforded to pinch enough engineers from AMD!

    Laptop : Dell Inspiron 1545 with Ryzen 5500u, 16gb and 256 NVMe, Windows 11.

  5. #5
    Registered+
    Join Date
    Mar 2014
    Posts
    54
    Thanks
    0
    Thanked
    2 times in 2 posts

    Re: Intel teases discrete graphics card for 2020

    Hard to see how Intel will manage to find a space in a highly competitive market that Nvidia and AMD already seem to be filling.

  6. #6
    Moosing about! CAT-THE-FIFTH's Avatar
    Join Date
    Aug 2006
    Location
    Not here
    Posts
    32,039
    Thanks
    3,910
    Thanked
    5,224 times in 4,015 posts
    • CAT-THE-FIFTH's system
      • Motherboard:
      • Less E-PEEN
      • CPU:
      • Massive E-PEEN
      • Memory:
      • RGB E-PEEN
      • Storage:
      • Not in any order
      • Graphics card(s):
      • EVEN BIGGER E-PEEN
      • PSU:
      • OVERSIZED
      • Case:
      • UNDERSIZED
      • Operating System:
      • DOS 6.22
      • Monitor(s):
      • NOT USUALLY ON....WHEN I POST
      • Internet:
      • FUNCTIONAL

    Re: Intel teases discrete graphics card for 2020

    Quote Originally Posted by DanceswithUnix View Post
    ftfy
    https://www.reddit.com/r/Amd/comment...the_vega_m_gl/

    LMAO, no way to report driver bugs, even though Intel is in charge of drivers for the Vega M GPUs they use. They really need to put more effort into driver support.

  7. #7
    Be wary of Scan Dashers's Avatar
    Join Date
    Jun 2016
    Posts
    1,079
    Thanks
    40
    Thanked
    137 times in 107 posts
    • Dashers's system
      • Motherboard:
      • Gigabyte GA-X99-UD4
      • CPU:
      • Intel i7-5930K
      • Memory:
      • 48GB Corsair DDR4 3000 Quad-channel
      • Storage:
      • Intel 750 PCIe SSD; RAID-0 x2 Samsung 840 EVO; RAID-0 x2 WD Black; RAID-0 x2 Crucial MX500
      • Graphics card(s):
      • MSI GeForce GTX 1070 Ti
      • PSU:
      • CoolerMaster Silent Pro M2 720W
      • Case:
      • Corsair 500R
      • Operating System:
      • Windows 10
      • Monitor(s):
      • Philips 40" 4K AMVA + 23.8" AOC 144Hz IPS
      • Internet:
      • Zen FTTC

    Re: Intel teases discrete graphics card for 2020

    Well, they already make add-in coprocessors; probably not much work required to rejig those to be more output- and consumer-focused.

    And with Nvidia adding an integer processor to their RTX chips, that's treading on CPU toes.

  8. #8
    Senior Member
    Join Date
    Jun 2013
    Location
    ATLANTIS
    Posts
    1,207
    Thanks
    1
    Thanked
    28 times in 26 posts

    Re: Intel teases discrete graphics card for 2020

    Raja now has access to billions of dollars and the world's most advanced fabs; let's guess what he will come up with.

  9. #9
    Senior Member
    Join Date
    May 2014
    Posts
    2,385
    Thanks
    181
    Thanked
    304 times in 221 posts

    Re: Intel teases discrete graphics card for 2020

    Quote Originally Posted by lumireleon View Post
    Raja now has access to billions of dollars and the world's most advanced fabs; let's guess what he will come up with.
    An AMD product?

  10. Received thanks from:

    Iota (16-08-2018)

  11. #10
    Senior Member
    Join Date
    Aug 2011
    Posts
    464
    Thanks
    2
    Thanked
    30 times in 23 posts
    • Bagpuss's system
      • Motherboard:
      • Gigabyte Z390 Aorus Pro Wi-Fi
      • CPU:
      • Intel i9-9900K
      • Memory:
      • 32GB Corsair Vengeance RGB Pro DDR4 3400
      • Storage:
      • Gigabyte 512GB NVMe SSD, Crucial 1Tb NVMe SSD, 6Tb Seagate 7200
      • Graphics card(s):
      • EVGA 2080 Black Edition
      • PSU:
      • Corsair 850 RMx 850 Gold
      • Case:
      • Fractal Meshify C Copper Front Panel
      • Operating System:
      • Windows 10
      • Monitor(s):
      • LG UK850 27in 4K HDR Freesync/Gsync
      • Internet:
      • Three Mobile 4G Unlimited Data (35-45Mbit)

    Re: Intel teases discrete graphics card for 2020

    I'm sure they'll get there eventually, but I doubt the 1st, 2nd or 3rd generation of their cards will trouble the top-end Nvidia GPUs of 2020.

  12. #11
    Not a good person scaryjim's Avatar
    Join Date
    Jan 2009
    Location
    Gateshead
    Posts
    15,196
    Thanks
    1,231
    Thanked
    2,291 times in 1,874 posts
    • scaryjim's system
      • Motherboard:
      • Dell Inspiron
      • CPU:
      • Core i5 8250U
      • Memory:
      • 2x 4GB DDR4 2666
      • Storage:
      • 128GB M.2 SSD + 1TB HDD
      • Graphics card(s):
      • Radeon R5 230
      • PSU:
      • Battery/Dell brick
      • Case:
      • Dell Inspiron 5570
      • Operating System:
      • Windows 10
      • Monitor(s):
      • 15" 1080p laptop panel

    Re: Intel teases discrete graphics card for 2020

    Quote Originally Posted by Dashers View Post
    Well, they already make add-in coprocessors; probably not much work required to rejig those to be more output- and consumer-focused.
    Those add-in co-processors are essentially a lot of Atom cores on a single chip. They're not going to accelerate modern game engines to any meaningful extent. It's far more likely that they're starting from the cores in their IGPs and seeing if they can scale them up to large GPUs.

    Quote Originally Posted by Dashers View Post
    And with Nvidia adding an integer processor to their RTX chips, that's treading on CPU toes.
    Not convinced that follows. CPUs do a lot more than just a few basic integer calculations. The whole point of GPGPU is that the GPU is a simple but very wide processor that's ideal for running simple instructions over and over again, and there's a definite progressive logic to adding dedicated INT processing in the same vein. Theoretically you've been able to run INT calculations on a GPU since GPGPU first started - you'd just have to cast to floats and back, and you'd risk inaccurate results due to precision and rounding. Adding native INT units to a massively parallel processor just makes those calculations quicker and more reliable. It doesn't change the principle that you're doing simple parallel tasks repeatedly, rather than interpreting complex branch logic...
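    The precision hazard of emulating integer maths through floats can be shown with a quick stdlib sketch (Python purely for illustration; no GPU involved). IEEE-754 single precision carries a 24-bit significand, so every integer up to 2**24 round-trips exactly, but 2**24 + 1 is the first one that silently rounds away:

    ```python
    import struct

    def via_float32(n: int) -> int:
        """Round-trip an integer through IEEE-754 single precision (float32)."""
        return int(struct.unpack('!f', struct.pack('!f', float(n)))[0])

    # 2**24 survives the round trip; 2**24 + 1 rounds to the nearest
    # representable float32 value, which is 2**24 again.
    print(via_float32(2**24), via_float32(2**24 + 1))   # 16777216 16777216
    ```

    That off-by-one is exactly the kind of silent error native INT units avoid.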

  13. #12
    Senior Member
    Join Date
    May 2009
    Location
    Where you are not
    Posts
    1,330
    Thanks
    608
    Thanked
    103 times in 90 posts
    • Iota's system
      • Motherboard:
      • Asus Maximus Hero XI
      • CPU:
      • Intel Core i9 9900KF
      • Memory:
      • CMD32GX4M2C3200C16
      • Storage:
      • 1 x 1TB / 3 x 2TB Samsung 970 Evo Plus NVMe
      • Graphics card(s):
      • Nvidia RTX 3090 Founders Edition
      • PSU:
      • Corsair HX1200i
      • Case:
      • Corsair Obsidian 500D
      • Operating System:
      • Windows 10 Pro 64-bit
      • Monitor(s):
      • Samsung Odyssey G9
      • Internet:
      • 500Mbps BT FTTH

    Re: Intel teases discrete graphics card for 2020

    Intel is being squeezed from multiple angles with regard to its CPU output; Arm looks to have won the mobile war (and via Qualcomm is pushing into Cellular PCs), and AMD is competing very fiercely for consumer PCs and the lucrative x86 workstation and server market.
    It would make more sense for them to keep their focus on AI-specific products, but I guess the offshoot of that is starting to tread into both Nvidia's and AMD's areas with multiple GPU cores for other markets. I just don't see the sense in them committing to discrete desktop graphics cards, especially as they'll also have to do a lot of legwork on drivers and building up that platform, as well as additional resources for developers, etc.

    Certainly they could, but it's a big risk when there are already two well-established players in the market. It'd be cheaper for them to buy Nvidia.

  14. #13
    Old Geezer
    Join Date
    Jul 2016
    Location
    Under a rusty bucket
    Posts
    540
    Thanks
    53
    Thanked
    42 times in 31 posts

    Re: Intel teases discrete graphics card for 2020

    Quote Originally Posted by Tabbykatze View Post
    Looks very Vega 64 esque
    All that video and picture tell us is that it may be black; you can't even see how many fans it has.

  15. #14
    Banhammer in peace PeterB kalniel's Avatar
    Join Date
    Aug 2005
    Posts
    31,025
    Thanks
    1,871
    Thanked
    3,383 times in 2,720 posts
    • kalniel's system
      • Motherboard:
      • Gigabyte Z390 Aorus Ultra
      • CPU:
      • Intel i9 9900k
      • Memory:
      • 32GB DDR4 3200 CL16
      • Storage:
      • 1TB Samsung 970Evo+ NVMe
      • Graphics card(s):
      • nVidia GTX 1060 6GB
      • PSU:
      • Seasonic 600W
      • Case:
      • Cooler Master HAF 912
      • Operating System:
      • Win 10 Pro x64
      • Monitor(s):
      • Dell S2721DGF
      • Internet:
      • rubbish

    Re: Intel teases discrete graphics card for 2020

    Quote Originally Posted by Iota View Post
    especially as they'll also have to do a lot of legwork on drivers and building up that platform, as well as additional resources for developers, etc.

    Certainly they could, but it's a big risk when there are already two well-established players in the market. It'd be cheaper for them to buy Nvidia.
    But aren't Intel already the top graphics provider by volume? It's only discrete cards that they lag on, and I don't know why drivers for discrete cards would be that much harder or more resource-intensive than for integrated chips.

  16. #15
    Senior Member
    Join Date
    May 2014
    Posts
    2,385
    Thanks
    181
    Thanked
    304 times in 221 posts

    Re: Intel teases discrete graphics card for 2020

    Quote Originally Posted by kalniel View Post
    But aren't Intel already the top graphics provider by volume? It's only discrete cards that they lag on, and I don't know why drivers for discrete cards would be that much harder or more resource-intensive than for integrated chips.
    Someone did a really good explanation of why this doesn't translate, but I'm struggling to find it, so I'll paraphrase what I remember. The gist was that the on-die GPU shares a lot of resources with the CPU, and isn't connected by conventional means. So just carving it off and scaling it up won't really work, because that GPU silicon on the CPU is designed to be scaled down and tuned within an inch of its life for maximum performance per watt. It would set an interesting precedent if a product designed to be small, sip power and bolt onto a CPU could be detached and scaled up into dedicated-card territory.

    From what I understand, the design used in Intel's on-chip HD Graphics just isn't suitable for breaking out and scaling up; it simply wasn't designed with that in mind.

  17. #16
    root Member DanceswithUnix's Avatar
    Join Date
    Jan 2006
    Location
    In the middle of a core dump
    Posts
    12,986
    Thanks
    781
    Thanked
    1,588 times in 1,343 posts
    • DanceswithUnix's system
      • Motherboard:
      • Asus X470-PRO
      • CPU:
      • 5900X
      • Memory:
      • 32GB 3200MHz ECC
      • Storage:
      • 2TB Linux, 2TB Games (Win 10)
      • Graphics card(s):
      • Asus Strix RX Vega 56
      • PSU:
      • 650W Corsair TX
      • Case:
      • Antec 300
      • Operating System:
      • Fedora 39 + Win 10 Pro 64 (yuk)
      • Monitor(s):
      • Benq XL2730Z 1440p + Iiyama 27" 1440p
      • Internet:
      • Zen 900Mb/900Mb (CityFibre FttP)

    Re: Intel teases discrete graphics card for 2020

    Quote Originally Posted by Tabbykatze View Post
    It would set an interesting precedent if a product designed to be small, sip power and bolt onto a CPU could be detached and scaled up into dedicated-card territory.
    Nvidia created a graphics unit that they could put into Tegra mobile & tablet chips and also use in their desktop cards, to great effect. So it can be done, but the difference here is that Nvidia are good at graphics, whereas I doubt Intel can pull off the same trick.

