Page 2 of 2
Results 17 to 23 of 23

Thread: Raja Koduri pictured testing Xe HPG GPU in Intel's Folsom Lab

  1. #17
    Senior Member
    Join Date
    Aug 2006
    Posts
    2,207
    Thanks
    15
    Thanked
    114 times in 102 posts

    Re: Raja Koduri pictured testing Xe HPG GPU in Intel's Folsom Lab

    Quote Originally Posted by cheesemp View Post
    Personally, it could be a cheaper card and I'd still not be interested until I could see what Intel's long-term driver support is like. This is why I really like AMD - my aging RX 480 still gets driver updates that improve performance. It might be slight, but it's there. Nvidia aren't as good in this regard, but at least they release regular title-specific and stability driver updates for years. Having suffered Intel drivers failing to fix bugs about a year after the product was released, I'm just not going to risk it yet.

    I think my hope is the card is good for crypto and the miners buy it while leaving AMD cards for me.
    Not saying I'd buy it either, but not everyone is tech-orientated like those of us on this forum.

    I'm basically stuck in the Nvidia camp, as the software I use only supports CUDA.

  2. #18
    Member
    Join Date
    Nov 2018
    Posts
    113
    Thanks
    0
    Thanked
    1 time in 1 post

    Re: Raja Koduri pictured testing Xe HPG GPU in Intel's Folsom Lab

    "If Intel can get the level of performance we are seeing hinted at, ample supplies out, and its pricing right, we could be looking at a much more interesting three horse GPU race in H2 this year." Too late for that.

  3. #19
    Senior Member
    Join Date
    Dec 2013
    Posts
    3,526
    Thanks
    504
    Thanked
    468 times in 326 posts

    Re: Raja Koduri pictured testing Xe HPG GPU in Intel's Folsom Lab

    Quote Originally Posted by [GSV]Trig View Post
    Was there not some testing done that proves that the nm isn't really comparable between Intel and AMD etc due to how things vary..?
    Not so much testing as released figures on dimensions. I've not got the time to double-check, but I think TSMC's 10nm was roughly comparable to Intel's 14nm, whereas their 7nm is a bit smaller in most key areas.

    Comparing node sizes between manufacturers isn't very useful though, due to the number of variables.

  4. #20
    trillo_del_diavolo
    Guest

    Re: Raja Koduri pictured testing Xe HPG GPU in Intel's Folsom Lab

    This "nm" thing is mostly a marketing term nowadays: manufacturers pick the smallest feature on the die and present it as though the whole transistor were that size, to entice laypeople and the bean counters.

    As for the Xe whatever GPU: I'll believe it when I see it, but I'm already expecting a huge flop if it ever gets released.

  5. #21
    Senior Member
    Join Date
    Feb 2019
    Posts
    224
    Thanks
    0
    Thanked
    2 times in 2 posts

    Re: Raja Koduri pictured testing Xe HPG GPU in Intel's Folsom Lab

    It is possible that in 10 years there will be quantum processors. Moore's law is not holding 100%, progress is getting slower, and there will definitely be something new.

  6. #22
    Senior Member
    Join Date
    Nov 2005
    Posts
    434
    Thanks
    32
    Thanked
    15 times in 14 posts

    Re: Raja Koduri pictured testing Xe HPG GPU in Intel's Folsom Lab

    The conjecture and estimates are all fine in my eyes. But it will still be irrelevant (sadly) for me as a buyer, just like all AMD cards, compared to Nvidia. And I say that as an AMD fanboy.

    That's because Nvidia cards have DLSS (and tensor cores) which, 3D rendering power aside, gives them advantages in ray tracing and, more importantly, in what DLSS can do for framerates, resolution, or both.

    I know that some games seem to favour generic DX raytracing, where AMD competes quite well. But DLSS is the dealbreaker. The fact that Nvidia cards can use AI to make games like Cyberpunk 2077 playable at 4K is totally insane. Yes, they are rendering at 1080p or 1440p, but there are enough videos showing that the fidelity of the AI "upscaling" (although not conventional upscaling) is near indistinguishable.

    So if I were buying today (at the insane cost these cards are!), it would have to be Nvidia, no question.

    I just sure hope AMD can get their equivalent of DLSS out quickly to compete...

    What *is* impressive about Intel Xe is just how fast Intel has closed the gap. Kudos to them. I really hope we have even 3 way competition in the GPU market in the near future.

  7. #23
    RIP Peterb ik9000's Avatar
    Join Date
    Nov 2009
    Posts
    7,415
    Thanks
    1,708
    Thanked
    1,301 times in 971 posts
    • ik9000's system
      • Motherboard:
      • Asus P7H55-M/USB3
      • CPU:
      • i7-870, Prolimatech Megahalems, 2x Akasa Apache 120mm
      • Memory:
      • 4x4GB Corsair Vengeance 2133 11-11-11-27
      • Storage:
      • 2x256GB Samsung 840-Pro, 1TB Seagate 7200.12, 1TB Seagate ES.2
      • Graphics card(s):
      • Gigabyte GTX 460 1GB SuperOverClocked
      • PSU:
      • NZXT Hale 90 750w
      • Case:
      • BitFenix Survivor + Bitfenix spectre LED fans, LG BluRay R/W optical drive
      • Operating System:
      • Windows 7 Professional
      • Monitor(s):
      • Dell U2414h, U2311h 1920x1080
      • Internet:
      • 200Mb/s Fibre and 4G wifi

    Re: Raja Koduri pictured testing Xe HPG GPU in Intel's Folsom Lab

    Quote Originally Posted by Noli View Post
    What *is* impressive about Intel Xe is just how fast Intel has closed the gap. Kudos to them. I really hope we have even 3 way competition in the GPU market in the near future.
    Larrabee has been overdue since 2011, so I'm not sure this is that impressive. And it seems to me they're only pushing this due to stagnation elsewhere.

