
Thread: Nvidia touts big advances in its DLSS 2.0 technology

  1. #1
    HEXUS.admin
    Join Date
    Apr 2005
    Posts
    31,709
    Thanks
    0
    Thanked
    2,073 times in 719 posts

    Nvidia touts big advances in its DLSS 2.0 technology

    DLSS 2.0 trains using non-game-specific content. Feature already supported in 4 games.
    Read more.

  2. #2
    Senior Member
    Join Date
    Apr 2016
    Posts
    772
    Thanks
    0
    Thanked
    9 times in 9 posts

    Re: Nvidia touts big advances in its DLSS 2.0 technology

    So has Nvidia been holding back performance, or...?! Somehow I doubt they didn't know this from the beginning :/

  3. #3
    Ravens Nest
    Guest

    Re: Nvidia touts big advances in its DLSS 2.0 technology

    Quote Originally Posted by QuorTek View Post
    So has Nvidia been holding back performance, or...?! Somehow I doubt they didn't know this from the beginning :/
    I wouldn't put it past them!

  4. #4
    Senior Member
    Join Date
    Sep 2014
    Posts
    400
    Thanks
    0
    Thanked
    9 times in 9 posts

    Re: Nvidia touts big advances in its DLSS 2.0 technology

    Quote Originally Posted by QuorTek View Post
    So has Nvidia been holding back performance, or...?! Somehow I doubt they didn't know this from the beginning :/
    Rather not - they just didn't see the business value in polishing their initially 'awesome' feature xD.
    Now that Radeon has done it without AI, and better, they're trying to do the thing properly.

    I still wonder how it compares to normal resolution scaling, performance- and quality-wise.

  5. #5
    Headless Chicken Terbinator's Avatar
    Join Date
    Apr 2009
    Posts
    7,670
    Thanks
    1,210
    Thanked
    727 times in 595 posts
    • Terbinator's system
      • Motherboard:
      • ASRock H61M
      • CPU:
      • Intel Xeon 1230-V3
      • Memory:
      • Geil Evo Corsa 2133/8GB
      • Storage:
      • M4 128GB, 2TB WD Red
      • Graphics card(s):
      • Gigabyte GTX Titan
      • PSU:
      • Corsair AX760i
      • Case:
      • Coolermaster 130
      • Operating System:
      • Windows 8.1 Pro
      • Monitor(s):
      • Dell Ultrasharp U2711H
      • Internet:
      • Virgin Media 60Mb.

    Re: Nvidia touts big advances in its DLSS 2.0 technology

    Quote Originally Posted by DevDrake View Post
    Rather not - they just didn't see the business value in polishing their initially 'awesome' feature xD.
    Now that Radeon has done it without AI, and better, they're trying to do the thing properly.

    I still wonder how it compares to normal resolution scaling, performance- and quality-wise.
    I've not managed to keep on top of tech news as I've not had proper access to a computer, but the AMD stuff is intrinsically linked to DirectX/Xbox, isn't it?

    The XSX/AMD demos with AI reconstruction used dedicated 4-bit and 8-bit hardware for the calculations.

    https://www.eurogamer.net/articles/d...s-x-full-specs
    Kalniel: "Nice review Tarinder - would it be possible to get a picture of the case when the components are installed (with the side off obviously)?"
    CAT-THE-FIFTH: "The Antec 300 is a case which has an understated and clean appearance which many people like. Not everyone is into e-peen looking computers which look like a cross between the imagination of a hyperactive 10 year old and a Frog."
    TKPeters: "Off to AVForum better Deal - £20+Vat for Free Shipping @ Scan"
    for all intents it seems to be the same card minus some gays name on it and a shielded cover ? with OEM added to it - GoNz0.

  6. #6
    Senior Member
    Join Date
    Jun 2013
    Location
    ATLANTIS
    Posts
    1,207
    Thanks
    1
    Thanked
    28 times in 26 posts

    Re: Nvidia touts big advances in its DLSS 2.0 technology

    So what's AMD's answer to DLSS?

  7. #7
    007
    Member
    Join Date
    Apr 2007
    Location
    GLASGOW
    Posts
    110
    Thanks
    31
    Thanked
    8 times in 7 posts
    • 007's system
      • Motherboard:
      • MSI x370 Gaming Plus
      • CPU:
      • Ryzen 5 2600
      • Memory:
      • G-Skill 3000 CL16
      • Storage:
      • SAMSUNG 850 Pro
      • Graphics card(s):
      • MSI air boost Vega 56
      • PSU:
      • Corsair RM650X
      • Case:
      • NZXT S340
      • Operating System:
      • WIN 10 Pro
      • Monitor(s):
      • Iiyama 2492
      • Internet:
      • Change the isp regularly

    Re: Nvidia touts big advances in its DLSS 2.0 technology

    Quote Originally Posted by lumireleon View Post
    So what's AMD's answer to DLSS?
    GPU scaling + RIS (Radeon Image Sharpening).

  8. #8
    Banhammer in peace PeterB kalniel's Avatar
    Join Date
    Aug 2005
    Posts
    31,023
    Thanks
    1,870
    Thanked
    3,381 times in 2,718 posts
    • kalniel's system
      • Motherboard:
      • Gigabyte Z390 Aorus Ultra
      • CPU:
      • Intel i9 9900k
      • Memory:
      • 32GB DDR4 3200 CL16
      • Storage:
      • 1TB Samsung 970Evo+ NVMe
      • Graphics card(s):
      • nVidia GTX 1060 6GB
      • PSU:
      • Seasonic 600W
      • Case:
      • Cooler Master HAF 912
      • Operating System:
      • Win 10 Pro x64
      • Monitor(s):
      • Dell S2721DGF
      • Internet:
      • rubbish

    Re: Nvidia touts big advances in its DLSS 2.0 technology

    Quote Originally Posted by QuorTek View Post
    So has Nvidia been holding back performance, or...?! Somehow I doubt they didn't know this from the beginning :/
    It's not about holding back performance; this is a new development that improves on what they previously released. It's like when AMD release a driver that improves performance - we don't suddenly claim they were holding back before.

    It does at least confirm, from the horse's mouth, that DLSS hampers image quality.

  9. #9
    Headless Chicken Terbinator's Avatar
    Join Date
    Apr 2009
    Posts
    7,670
    Thanks
    1,210
    Thanked
    727 times in 595 posts

    Re: Nvidia touts big advances in its DLSS 2.0 technology

    Quote Originally Posted by 007 View Post
    GPU scaling + RIS (Radeon Image Sharpening).
    Actually, these are very different approaches. GPU scaling and RIS are literally that - resolution/image scaling plus a sharpening filter - and you've been able to do this even without AMD's own tools for years.

    DLSS uses ML to guess what the image should look like from a lower-res source. Here is an older look at MS's approach to 'DLSS' using AMD hardware - https://youtu.be/QjQm_wNrvVw?t=1477 - bear in mind this is from last GDC, so the hardware predates even the 5700-series GPUs.
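
    To make the distinction concrete, here's a toy sketch of the ML-upscaling idea: a small network that predicts a higher-res frame from a lower-res one. This is purely illustrative - it is not Nvidia's actual DLSS network, and the layer sizes are arbitrary assumptions.

    Code:
    # Toy illustration of ML upscaling (NOT Nvidia's DLSS network).
    import torch
    import torch.nn as nn

    class ToyUpscaler(nn.Module):
        def __init__(self, scale=2):
            super().__init__()
            self.body = nn.Sequential(
                nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
                nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(),
                # Predict scale*scale sub-pixel channels, then rearrange them
                # into a frame `scale` times larger in each dimension.
                nn.Conv2d(32, 3 * scale * scale, 3, padding=1),
                nn.PixelShuffle(scale),
            )

        def forward(self, lowres):
            return self.body(lowres)

    model = ToyUpscaler(scale=2)
    lowres_frame = torch.rand(1, 3, 540, 960)   # e.g. a 960x540 render
    highres_frame = model(lowres_frame)         # -> (1, 3, 1080, 1920)
    print(highres_frame.shape)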
    Kalniel: "Nice review Tarinder - would it be possible to get a picture of the case when the components are installed (with the side off obviously)?"
    CAT-THE-FIFTH: "The Antec 300 is a case which has an understated and clean appearance which many people like. Not everyone is into e-peen looking computers which look like a cross between the imagination of a hyperactive 10 year old and a Frog."
    TKPeters: "Off to AVForum better Deal - £20+Vat for Free Shipping @ Scan"
    for all intents it seems to be the same card minus some gays name on it and a shielded cover ? with OEM added to it - GoNz0.

  10. #10
    Banhammer in peace PeterB kalniel's Avatar
    Join Date
    Aug 2005
    Posts
    31,023
    Thanks
    1,870
    Thanked
    3,381 times in 2,718 posts

    Re: Nvidia touts big advances in its DLSS 2.0 technology

    Different approach, but the same intention - upscale from a lower-resolution image and sharpen. DLSS is one approach to upscaling; AMD uses another, non-ML approach.
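
    For comparison, a minimal sketch of that non-ML route: a bilinear upscale followed by a fixed sharpening kernel. This is not AMD's actual RIS/CAS algorithm, just the generic "upscale then sharpen" idea; the kernel and strength are arbitrary assumptions.

    Code:
    # Generic upscale-then-sharpen sketch (NOT AMD's RIS/CAS).
    import torch
    import torch.nn.functional as F

    def upscale_and_sharpen(frame, scale=2, amount=0.5):
        # frame: (1, 3, H, W) tensor in [0, 1]
        up = F.interpolate(frame, scale_factor=scale, mode="bilinear",
                           align_corners=False)
        # Classic 3x3 Laplacian-style sharpening kernel, applied per channel.
        k = torch.tensor([[0., -1., 0.],
                          [-1., 4., -1.],
                          [0., -1., 0.]]) * amount
        kernel = k.repeat(3, 1, 1, 1)           # depthwise: one kernel per channel
        edges = F.conv2d(up, kernel, padding=1, groups=3)
        return (up + edges).clamp(0.0, 1.0)

    lowres = torch.rand(1, 3, 540, 960)
    sharpened = upscale_and_sharpen(lowres)     # -> (1, 3, 1080, 1920)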

  11. #11
    Moosing about! CAT-THE-FIFTH's Avatar
    Join Date
    Aug 2006
    Location
    Not here
    Posts
    32,039
    Thanks
    3,910
    Thanked
    5,224 times in 4,015 posts
    • CAT-THE-FIFTH's system
      • Motherboard:
      • Less E-PEEN
      • CPU:
      • Massive E-PEEN
      • Memory:
      • RGB E-PEEN
      • Storage:
      • Not in any order
      • Graphics card(s):
      • EVEN BIGGER E-PEEN
      • PSU:
      • OVERSIZED
      • Case:
      • UNDERSIZED
      • Operating System:
      • DOS 6.22
      • Monitor(s):
      • NOT USUALLY ON....WHEN I POST
      • Internet:
      • FUNCTIONAL

    Re: Nvidia touts big advances in its DLSS 2.0 technology

    The original machine-learning approach has the potential to work better than the AMD approach, as it can be optimised per game, but the problem is that you need to train the network to a good degree of confidence... which takes time. So if it takes months after a game has launched to get the best iteration of the upscaling algorithm, it's not going to be ideal.

    Also, the whole "trained using non-game-specific content" thing is more of a kludge really, as it is saying they're moving towards a more general-purpose upscaling algorithm - which, to a degree, is what existing upscaling algorithms already are, i.e. not specific to any one game.
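
    A rough sketch of why the training side takes time: a DLSS-style network learns from (low-res, high-res) frame pairs, so someone has to capture high-quality frames and grind through many optimisation steps. The model and helper below are hypothetical stand-ins, not Nvidia's pipeline.

    Code:
    # Sketch of DLSS-style supervised training on frame pairs (illustrative only).
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    def make_training_pair(ground_truth, scale=2):
        # Downsample a pristine high-res frame to fake the in-game low-res render.
        lowres = F.interpolate(ground_truth, scale_factor=1.0 / scale,
                               mode="bilinear", align_corners=False)
        return lowres, ground_truth

    # Hypothetical stand-in for the upscaling network being trained.
    model = nn.Sequential(nn.Conv2d(3, 12, 3, padding=1), nn.PixelShuffle(2))
    opt = torch.optim.Adam(model.parameters(), lr=1e-4)

    gt = torch.rand(1, 3, 1080, 1920)            # stand-in for one captured frame
    lr_frame, hr_frame = make_training_pair(gt)

    loss = F.l1_loss(model(lr_frame), hr_frame)  # how far is the guess from truth?
    loss.backward()
    opt.step()
    # Real training repeats this over huge frame sets until quality converges -
    # which is exactly the months-long lag described above.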

  12. #12
    Senior Member
    Join Date
    Jun 2013
    Location
    ATLANTIS
    Posts
    1,207
    Thanks
    1
    Thanked
    28 times in 26 posts

    Re: Nvidia touts big advances in its DLSS 2.0 technology

    AMD will use INT16/INT8/INT4 operations on their shaders for ML workloads, unlike Nvidia, who use dedicated Tensor cores. In other words, AMD doesn't have Tensor cores to brag about.
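
    For anyone wondering what running ML on INT8/INT4 shader maths actually means, here's a rough sketch of quantised inference: weights and activations become small integers, and the heavy lifting becomes integer multiply-accumulate, which packed-math shader instructions (or Tensor cores) can do far faster than FP32. The numbers and helper below are illustrative assumptions, not either vendor's implementation.

    Code:
    # Sketch of integer-quantised inference (illustrative, vendor-agnostic).
    import numpy as np

    def quantize(x, bits=8):
        # Symmetric linear quantisation to signed integers; returns ints + scale.
        qmax = 2 ** (bits - 1) - 1              # 127 for INT8, 7 for INT4
        scale = np.abs(x).max() / qmax
        return np.round(x / scale).astype(np.int32), scale

    weights = np.random.randn(256).astype(np.float32)
    activations = np.random.randn(256).astype(np.float32)

    qw, sw = quantize(weights, bits=8)
    qa, sa = quantize(activations, bits=8)

    # All-integer multiply-accumulate, then one float rescale at the end.
    int_dot = np.dot(qw, qa)
    approx = int_dot * sw * sa
    exact = np.dot(weights, activations)
    print(f"exact={exact:.4f}  int8 approx={approx:.4f}")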

  13. #13
    Banhammer in peace PeterB kalniel's Avatar
    Join Date
    Aug 2005
    Posts
    31,023
    Thanks
    1,870
    Thanked
    3,381 times in 2,718 posts

    Re: Nvidia touts big advances in its DLSS 2.0 technology

    Quote Originally Posted by CAT-THE-FIFTH View Post
    Also, the whole "trained using non-game-specific content" thing is more of a kludge really, as it is saying they're moving towards a more general-purpose upscaling algorithm - which, to a degree, is what existing upscaling algorithms already are, i.e. not specific to any one game.
    True, though I think the potential is in finding sampling patterns different from human-designed ones. It could be that (over-simplifying) for a scene with a lot of sky, the sky bits use one specific upscaler while ground textures use another - or, more likely, the ML comes up with some strange correlation, like there was a duck in the corner.

  14. #14
    Registered+
    Join Date
    Oct 2017
    Posts
    89
    Thanks
    0
    Thanked
    0 times in 0 posts

    Re: Nvidia touts big advances in its DLSS 2.0 technology

    Quote Originally Posted by lumireleon View Post
    So what's AMD's answer to DLSS?
    It used to be software-simulated RT (and probably still is), which doesn't affect fps. But that's changing with the implementation of hardware-based RT. However, it would be handy if they could fix their driver issues. And that's coming from a fanboy.

  15. #15
    Moosing about! CAT-THE-FIFTH's Avatar
    Join Date
    Aug 2006
    Location
    Not here
    Posts
    32,039
    Thanks
    3,910
    Thanked
    5,224 times in 4,015 posts

    Re: Nvidia touts big advances in its DLSS 2.0 technology

    Quote Originally Posted by kalniel View Post
    True, though I think the potential is in finding sampling patterns different from human-designed ones. It could be that (over-simplifying) for a scene with a lot of sky, the sky bits use one specific upscaler while ground textures use another - or, more likely, the ML comes up with some strange correlation, like there was a duck in the corner.
    Well, we don't know whether previous upscaling methods used on consoles employed machine learning to optimise their general-purpose upscaling algorithms - Microsoft has a lot of investments in this area too. From what I also gather, DLSS 2.0 is using a lot of sharpening to make the image look "better", which sounds a bit like what AMD is doing too, but in a more general way.

  16. #16
    Member
    Join Date
    Apr 2019
    Posts
    104
    Thanks
    1
    Thanked
    1 time in 1 post

    Re: Nvidia touts big advances in its DLSS 2.0 technology

    Quote Originally Posted by CAT-THE-FIFTH View Post
    Well, we don't know whether previous upscaling methods used on consoles employed machine learning to optimise their general-purpose upscaling algorithms - Microsoft has a lot of investments in this area too. From what I also gather, DLSS 2.0 is using a lot of sharpening to make the image look "better", which sounds a bit like what AMD is doing too, but in a more general way.
    The result of the learning is a computationally expensive algorithm that would be too expensive to run without specialised hardware (the Tensor cores). That much more complex algorithm can simply do a better job than the simpler ones available to GPUs without specialised AI processing.
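
    Some back-of-envelope arithmetic on that point - all numbers here are illustrative assumptions, not measured DLSS figures:

    Code:
    # Hypothetical cost of running a small per-pixel network at 4K/60.
    pixels = 3840 * 2160          # 4K output
    fps = 60
    flops_per_pixel = 2_000       # assumed cost of a modest per-pixel network

    total = pixels * fps * flops_per_pixel
    print(f"{total / 1e12:.1f} TFLOPs/s")   # ~1.0 TFLOPs/s just for upscaling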

