
Thread: Nvidia touts big advances in its DLSS 2.0 technology

  1. #1
    HEXUS.admin
    Join Date
    Apr 2005
    Posts
    29,542
    Thanks
    0
    Thanked
    1,928 times in 670 posts

    Nvidia touts big advances in its DLSS 2.0 technology

    DLSS 2.0 trains using non-game-specific content. Feature already supported in 4 games.
    Read more.

  2. #2
    Senior Member
    Join Date
    Apr 2016
    Posts
    294
    Thanks
    0
    Thanked
    3 times in 3 posts

    Re: Nvidia touts big advances in its DLSS 2.0 technology

    so Nvidia has been holding back performance or?!?.. somehow I doubt that they did not know this from the beginning :/

  3. #3
    "You're my wife now!" Ravens Nest's Avatar
    Join Date
    Jul 2003
    Location
    The Pandemonium Carnival
    Posts
    1,568
    Thanks
    26
    Thanked
    49 times in 29 posts

    Re: Nvidia touts big advances in its DLSS 2.0 technology

    Quote Originally Posted by QuorTek View Post
    so Nvidia has been holding back performance or?!?.. somehow I doubt that they did not know this from the beginning :/
    I wouldn't put it past them!
    "Mmm... I want you for my wife!"
    "Autom...Sprow...Canna...Tik banna...Sandwol...But no sera smee?"
    "Of course you can. We would love for you to join us."

  4. #4
    Senior Member
    Join Date
    Sep 2014
    Posts
    244
    Thanks
    0
    Thanked
    7 times in 7 posts

    Re: Nvidia touts big advances in its DLSS 2.0 technology

    Quote Originally Posted by QuorTek View Post
    so Nvidia has been holding back performance or?!?.. somehow I doubt that they did not know this from the beginning :/
Rather not - they just didn't see business value in polishing their initial awesome feature xD.
Now that Radeon did it without AI, and better, they're trying to do the thing properly.

    I still wonder how it compares to the normal resolution scaling performance/quality wise.

  5. #5
    Headless Chicken Terbinator's Avatar
    Join Date
    Apr 2009
    Posts
    7,357
    Thanks
    1,108
    Thanked
    680 times in 557 posts
    • Terbinator's system
      • Motherboard:
      • ASRock H61M
      • CPU:
      • Intel Xeon 1230-V3
      • Memory:
      • Geil Evo Corsa 2133/8GB
      • Storage:
      • M4 128GB, 2TB WD Red
      • Graphics card(s):
      • Gigabyte GTX Titan
      • PSU:
      • Corsair AX760i
      • Case:
      • Coolermaster 130
      • Operating System:
      • Windows 8.1 Pro
      • Monitor(s):
      • Dell Ultrasharp U2711H
      • Internet:
      • Virgin Media 60Mb.

    Re: Nvidia touts big advances in its DLSS 2.0 technology

    Quote Originally Posted by DevDrake View Post
Rather not - they just didn't see business value in polishing their initial awesome feature xD.
Now that Radeon did it without AI, and better, they're trying to do the thing properly.

    I still wonder how it compares to the normal resolution scaling performance/quality wise.
    I've not managed to keep on top of tech news properly as I've not had proper access to a computer, but the AMD stuff is intrinsically linked to DirectX/Xbox, isn't it?

The XSX/AMD demos with AI reconstruction used dedicated 4-bit and 8-bit hardware for the calculations.

    https://www.eurogamer.net/articles/d...s-x-full-specs
    Kalniel: "Nice review Tarinder - would it be possible to get a picture of the case when the components are installed (with the side off obviously)?"
    CAT-THE-FIFTH: "The Antec 300 is a case which has an understated and clean appearance which many people like. Not everyone is into e-peen looking computers which look like a cross between the imagination of a hyperactive 10 year old and a Frog."
    TKPeters: "Off to AVForum better Deal - £20+Vat for Free Shipping @ Scan"
for all intents it seems to be the same card minus some guy's name on it and a shielded cover? with OEM added to it - GoNz0.

  6. #6
    Senior Member
    Join Date
    Jun 2013
    Location
    ATLANTIS
    Posts
    931
    Thanks
    1
    Thanked
    23 times in 21 posts

    Re: Nvidia touts big advances in its DLSS 2.0 technology

so what's AMD's answer to DLSS?

  7. #7
    007
    Member
    Join Date
    Apr 2007
    Location
    GLASGOW
    Posts
    107
    Thanks
    28
    Thanked
    8 times in 7 posts
    • 007's system
      • Motherboard:
      • MSI x370 Gaming Plus
      • CPU:
      • Ryzen 5 2600
      • Memory:
      • G-Skill 3000 CL16
      • Storage:
      • SAMSUNG 850 Pro
      • Graphics card(s):
      • Evga gtx 1060 3GB
      • PSU:
      • OCZ SXS II 600W
      • Case:
      • NZXT S340
      • Operating System:
      • WIN 10 Pro
      • Monitor(s):
      • Iiyama 2492
      • Internet:
      • BT

    Re: Nvidia touts big advances in its DLSS 2.0 technology

    Quote Originally Posted by lumireleon View Post
so what's AMD's answer to DLSS?
    GPU scaling + RIS ( or Radeon Image Sharpening)

  8. #8
    Banhammer in peace PeterB kalniel's Avatar
    Join Date
    Aug 2005
    Posts
    29,373
    Thanks
    1,559
    Thanked
    2,967 times in 2,407 posts
    • kalniel's system
      • Motherboard:
      • Gigabyte X58A UD3R rev 2
      • CPU:
      • Intel Xeon X5680
      • Memory:
      • 12gb DDR3 2000
      • Graphics card(s):
      • nVidia GTX 1060 6GB
      • PSU:
      • Seasonic 600W
      • Case:
      • Cooler Master HAF 912
      • Operating System:
      • Win 10 Pro x64
      • Monitor(s):
      • Dell U2311H
      • Internet:
      • O2 8mbps

    Re: Nvidia touts big advances in its DLSS 2.0 technology

    Quote Originally Posted by QuorTek View Post
    so Nvidia has been holding back performance or?!?.. somehow I doubt that they did not know this from the beginning :/
    It's not about holding back performance, this is a new development that improves on what they previously released. It's like when AMD release a driver that improves performance, we don't suddenly claim they were holding back before.

It does at least confirm from the horse's mouth that DLSS hampers image quality.

  9. #9
    Headless Chicken Terbinator's Avatar
    Join Date
    Apr 2009
    Posts
    7,357
    Thanks
    1,108
    Thanked
    680 times in 557 posts
    • Terbinator's system
      • Motherboard:
      • ASRock H61M
      • CPU:
      • Intel Xeon 1230-V3
      • Memory:
      • Geil Evo Corsa 2133/8GB
      • Storage:
      • M4 128GB, 2TB WD Red
      • Graphics card(s):
      • Gigabyte GTX Titan
      • PSU:
      • Corsair AX760i
      • Case:
      • Coolermaster 130
      • Operating System:
      • Windows 8.1 Pro
      • Monitor(s):
      • Dell Ultrasharp U2711H
      • Internet:
      • Virgin Media 60Mb.

    Re: Nvidia touts big advances in its DLSS 2.0 technology

    Quote Originally Posted by 007 View Post
    GPU scaling + RIS ( or Radeon Image Sharpening)
Actually very different approaches. GPU scaling and RIS are literally that - resolution/image scaling plus a sharpening filter - you've been able to do this even without AMD's own tools for years.

DLSS is using ML to guess what the image should look like from a lower-res source. Here is an older look at MS' approach to 'DLSS' using AMD H/W - https://youtu.be/QjQm_wNrvVw?t=1477 - bear in mind this is from last GDC, so the hardware predates even the 5700-series of GPUs.
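    The contrast the post draws can be sketched in a few lines. This is a hedged, toy illustration in pure Python on a 1-D "image" - nearest-neighbour upscaling plus an unsharp mask is roughly the RIS-style recipe, while learned reconstruction replaces these fixed rules with a trained network. Function names are illustrative, not any vendor API.

    ```python
    def upscale_nearest(pixels, factor):
        """Nearest-neighbour upscale: repeat each sample `factor` times."""
        return [p for p in pixels for _ in range(factor)]

    def unsharp_mask(pixels, amount=0.5):
        """Classic sharpening: add back the difference from a local blur."""
        out = []
        for i, p in enumerate(pixels):
            left = pixels[max(i - 1, 0)]
            right = pixels[min(i + 1, len(pixels) - 1)]
            blur = (left + p + right) / 3.0
            out.append(p + amount * (p - blur))
        return out

    low_res = [0.0, 0.0, 1.0, 1.0]           # a hard edge at low resolution
    sharp = unsharp_mask(upscale_nearest(low_res, 2))
    # sharpening exaggerates the edge: values dip below 0 and overshoot 1
    ```

    The point of the sketch: this whole pipeline is a fixed, hand-designed rule applied to every pixel, whereas the ML route tries to infer the missing detail instead of merely exaggerating edges.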

  10. #10
    Banhammer in peace PeterB kalniel's Avatar
    Join Date
    Aug 2005
    Posts
    29,373
    Thanks
    1,559
    Thanked
    2,967 times in 2,407 posts
    • kalniel's system
      • Motherboard:
      • Gigabyte X58A UD3R rev 2
      • CPU:
      • Intel Xeon X5680
      • Memory:
      • 12gb DDR3 2000
      • Graphics card(s):
      • nVidia GTX 1060 6GB
      • PSU:
      • Seasonic 600W
      • Case:
      • Cooler Master HAF 912
      • Operating System:
      • Win 10 Pro x64
      • Monitor(s):
      • Dell U2311H
      • Internet:
      • O2 8mbps

    Re: Nvidia touts big advances in its DLSS 2.0 technology

    Different approach but the same intention - upscale from a lower resolution image and sharpen. DLSS is one approach to upscale, AMD use another non-ML approach to upscale.

  11. #11
    Moosen CAT-THE-FIFTH's Avatar
    Join Date
    Aug 2006
    Location
    Not here
    Posts
    28,932
    Thanks
    3,217
    Thanked
    4,497 times in 3,470 posts
    • CAT-THE-FIFTH's system
      • Motherboard:
      • Less E-PEEN
      • CPU:
      • Massive E-PEEN
      • Memory:
      • RGB E-PEEN
      • Storage:
      • Not in any order
      • Graphics card(s):
      • EVEN BIGGER E-PEEN
      • PSU:
      • OVERSIZED
      • Case:
      • UNDERSIZED
      • Operating System:
      • DOS 6.22
      • Monitor(s):
      • NOT USUALLY ON....WHEN I POST
      • Internet:
      • FUNCTIONAL

    Re: Nvidia touts big advances in its DLSS 2.0 technology

The original machine-learning approach has the potential to work better than the AMD approach, as it can be optimised better per game, but the problem is you need to train the network sufficiently to a good degree of confidence... which takes time. So if it takes months after a game has launched to get the best iteration of the upscaling algorithm, it's not going to be ideal.

Also, the whole "trained using non-game-specific content" is more of a kludge really, as it is saying it's moving towards a more general-purpose upscaling algorithm, which to a degree is what existing upscaling algorithms are, i.e. not specific to any one game.


    Those despicable Elk,stealing the pond weed!

  12. #12
    Senior Member
    Join Date
    Jun 2013
    Location
    ATLANTIS
    Posts
    931
    Thanks
    1
    Thanked
    23 times in 21 posts

    Re: Nvidia touts big advances in its DLSS 2.0 technology

AMD will use INT16, INT8 and INT4 operations on their shaders for ML workloads, unlike Nvidia, who use dedicated Tensor cores. In other words, AMD doesn't brag about Tensor cores.
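    The reason those INT8/INT4 paths matter for ML inference can be shown with a toy sketch: weights and activations are quantized to small integers, the multiply-accumulate loop runs in integer arithmetic (the cheap part in hardware), and one float rescale at the end maps the result back. This is illustrative only; the real shader INT paths and tensor-core pipelines differ.

    ```python
    def quantize(values, scale):
        """Map floats to signed 8-bit integers sharing one scale factor."""
        return [max(-128, min(127, round(v / scale))) for v in values]

    def int8_dot(weights, activations, w_scale, a_scale):
        """Integer multiply-accumulate, then a single float rescale."""
        acc = sum(w * a for w, a in zip(weights, activations))  # pure int math
        return acc * w_scale * a_scale

    w = [0.5, -0.25, 1.0]
    a = [1.0, 2.0, 0.5]
    w_scale, a_scale = 0.01, 0.02
    wq, aq = quantize(w, w_scale), quantize(a, a_scale)
    approx = int8_dot(wq, aq, w_scale, a_scale)
    exact = sum(x * y for x, y in zip(w, a))
    ```

    With well-chosen scales the integer result lands very close to the float one, which is why inference (unlike training) tolerates these narrow formats.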

  13. #13
    Banhammer in peace PeterB kalniel's Avatar
    Join Date
    Aug 2005
    Posts
    29,373
    Thanks
    1,559
    Thanked
    2,967 times in 2,407 posts
    • kalniel's system
      • Motherboard:
      • Gigabyte X58A UD3R rev 2
      • CPU:
      • Intel Xeon X5680
      • Memory:
      • 12gb DDR3 2000
      • Graphics card(s):
      • nVidia GTX 1060 6GB
      • PSU:
      • Seasonic 600W
      • Case:
      • Cooler Master HAF 912
      • Operating System:
      • Win 10 Pro x64
      • Monitor(s):
      • Dell U2311H
      • Internet:
      • O2 8mbps

    Re: Nvidia touts big advances in its DLSS 2.0 technology

    Quote Originally Posted by CAT-THE-FIFTH View Post
    Also the whole "trained using non-game specific content" is more of a kludge really,as it is saying its moving towards a more general purpose upscaling algorithm,which to a degree is what existing upscaling algorithms are,ie,not specific to any one game.
    True, though I think the potential is in finding sampling patterns that are different to human-designed ones. It could be that (over-simplifying) for a scene that has a lot of sky then the sky bits use some specific upscaler, while ground textures use another - or more likely the ML comes up with some strange correlation like there was a duck in the corner.

  14. #14
    Registered+
    Join Date
    Oct 2017
    Posts
    64
    Thanks
    0
    Thanked
    0 times in 0 posts

    Re: Nvidia touts big advances in its DLSS 2.0 technology

    Quote Originally Posted by lumireleon View Post
so what's AMD's answer to DLSS?
It used to be software-simulated RT (and probably still is), which does not affect fps. But that's changing with the implementation of hardware-based RT. However, it would be handy if they could fix their driver issues. And that's coming from a fanboy.

  15. #15
    Moosen CAT-THE-FIFTH's Avatar
    Join Date
    Aug 2006
    Location
    Not here
    Posts
    28,932
    Thanks
    3,217
    Thanked
    4,497 times in 3,470 posts
    • CAT-THE-FIFTH's system
      • Motherboard:
      • Less E-PEEN
      • CPU:
      • Massive E-PEEN
      • Memory:
      • RGB E-PEEN
      • Storage:
      • Not in any order
      • Graphics card(s):
      • EVEN BIGGER E-PEEN
      • PSU:
      • OVERSIZED
      • Case:
      • UNDERSIZED
      • Operating System:
      • DOS 6.22
      • Monitor(s):
      • NOT USUALLY ON....WHEN I POST
      • Internet:
      • FUNCTIONAL

    Re: Nvidia touts big advances in its DLSS 2.0 technology

    Quote Originally Posted by kalniel View Post
    True, though I think the potential is in finding sampling patterns that are different to human-designed ones. It could be that (over-simplifying) for a scene that has a lot of sky then the sky bits use some specific upscaler, while ground textures use another - or more likely the ML comes up with some strange correlation like there was a duck in the corner.
Well, we don't know whether previous upscaling methods used on consoles used machine learning to optimise the general-purpose upscaling algorithms - Microsoft has a lot of investment in this area too. From what I'm also gathering, DLSS 2.0 is using a lot of sharpening to make the image look "better", which sounds a bit like what AMD is doing too, but in a more general way.



  16. #16
    Registered+
    Join Date
    Apr 2019
    Posts
    17
    Thanks
    0
    Thanked
    0 times in 0 posts

    Re: Nvidia touts big advances in its DLSS 2.0 technology

    Quote Originally Posted by CAT-THE-FIFTH View Post
Well, we don't know whether previous upscaling methods used on consoles used machine learning to optimise the general-purpose upscaling algorithms - Microsoft has a lot of investment in this area too. From what I'm also gathering, DLSS 2.0 is using a lot of sharpening to make the image look "better", which sounds a bit like what AMD is doing too, but in a more general way.
The result of the learning is a computationally expensive algorithm that would be too expensive to run without specialised hardware (the tensor cores). That much more complex algorithm can just do a better job than the simpler ones available to GPUs without specialised AI processing.
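    The cost gap described above can be put in rough numbers. Below is a back-of-envelope sketch counting multiply-accumulates (MACs) per output pixel for a small convolutional upscaler versus a 4-tap bilinear filter. The layer sizes are made up for illustration - they are not Nvidia's actual DLSS network.

    ```python
    def conv_macs_per_pixel(layers):
        """Each layer: (kernel_h, kernel_w, in_channels, out_channels)."""
        return sum(kh * kw * cin * cout for kh, kw, cin, cout in layers)

    toy_network = [
        (3, 3, 3, 32),    # feature extraction
        (3, 3, 32, 32),   # refinement
        (3, 3, 32, 3),    # reconstruction
    ]
    ml_cost = conv_macs_per_pixel(toy_network)  # MACs per output pixel
    bilinear_cost = 4                           # 4 taps per output pixel
    ratio = ml_cost // bilinear_cost
    ```

    Even this toy network needs thousands of times more arithmetic per pixel than bilinear filtering, which is why dedicated low-precision hardware is what makes running it per-frame feasible.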

