
Thread: AMD Radeon RX 6900 XT

  1. #33
    Moosing about! CAT-THE-FIFTH's Avatar
    Join Date
    Aug 2006
    Location
    Not here
    Posts
    32,039
    Thanks
    3,910
    Thanked
    5,224 times in 4,015 posts
    • CAT-THE-FIFTH's system
      • Motherboard:
      • Less E-PEEN
      • CPU:
      • Massive E-PEEN
      • Memory:
      • RGB E-PEEN
      • Storage:
      • Not in any order
      • Graphics card(s):
      • EVEN BIGGER E-PEEN
      • PSU:
      • OVERSIZED
      • Case:
      • UNDERSIZED
      • Operating System:
      • DOS 6.22
      • Monitor(s):
      • NOT USUALLY ON....WHEN I POST
      • Internet:
      • FUNCTIONAL

    Re: AMD Radeon RX 6900 XT

    TAA is a low-performance-impact method developed for consoles, but it adds blurriness to the image. DLSS looks to be performing some sort of edge detection and local contrast enhancement/sharpening. Hence it "looks" better, but is it really better than an image with a quality-orientated AA method such as MSAA? I doubt it. Another issue is that machine-learning upscaling methods have problems in motion too. It's all fine and dandy looking at still images, but what about pseudo-random particle effects? Even DF had to admit, despite their over-enthusiasm for DLSS (and some are now doubting their objectivity a bit after their sponsored article), that motion artefacts were visible. It's the same phenomenon with digital cameras. Smartphones use a ton of noise reduction, but also bump up sharpening, colours and contrast. ILC images tend to use less noise reduction, to preserve actual detail, with less sharpening and more muted contrast and colour, closer to the actual scene.

    In fact you can do a test yourself: upscale an image in PS, etc., add lots of USM, contrast enhancements, etc., and then compare it to a native ILC image. I suspect many people would think the former appears better, but it isn't really.
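
    A minimal sketch of that experiment in Python with Pillow (assuming Pillow is available); the filenames and the exact filter settings are placeholders for illustration, not anything a vendor actually ships:

    ```python
    from PIL import Image, ImageEnhance, ImageFilter

    # Load a modest-resolution source image (placeholder filename).
    img = Image.open("source.jpg")

    # Upscale 2x with a conventional resampling filter.
    up = img.resize((img.width * 2, img.height * 2), Image.LANCZOS)

    # Pile on unsharp mask, contrast and saturation boosts -- the kind of
    # "enhancement" that makes an image look punchier at first glance.
    up = up.filter(ImageFilter.UnsharpMask(radius=2, percent=150, threshold=3))
    up = ImageEnhance.Contrast(up).enhance(1.2)
    up = ImageEnhance.Color(up).enhance(1.2)

    up.save("upscaled_enhanced.jpg")
    # Compare this side by side with a native full-resolution ILC shot:
    # the processed one often *looks* crisper despite holding less real detail.
    ```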

    If you go back far enough, BOTH ATI and Nvidia were quietly reducing image quality in ways which were not easily noticeable, and were called out for doing it. This is just Nvidia doing an Apple again and making dodgy claims; "better than native" is the sort of claim Steve Jobs would make.

    What I find even more hilarious is that PCMR called out the previous generation of consoles for not being able to do native 4K and having to use upscaling methods. Now PCMR has flip-flopped and thinks upscaling is the best thing since sliced bread, despite the fact it's hiding how poor these GPUs are at RT, and how a feature is being oversold on severely underpowered GPUs. It's clever marketing: market a weakness as a strength. One has to wonder whether Nvidia could have avoided wasting die area on the tensor cores and just incorporated more RT cores/shaders.

    However, that would defeat the objective of selling these GPUs into non-gaming markets.
    Last edited by CAT-THE-FIFTH; 09-12-2020 at 12:52 PM.

  2. Received thanks from:

    Tabbykatze (09-12-2020), Zhaoman (09-12-2020)

  3. #34
    Senior Member
    Join Date
    May 2009
    Location
    Where you are not
    Posts
    1,330
    Thanks
    608
    Thanked
    103 times in 90 posts
    • Iota's system
      • Motherboard:
      • Asus Maximus Hero XI
      • CPU:
      • Intel Core i9 9900KF
      • Memory:
      • CMD32GX4M2C3200C16
      • Storage:
      • 1 x 1TB / 3 x 2TB Samsung 970 Evo Plus NVMe
      • Graphics card(s):
      • Nvidia RTX 3090 Founders Edition
      • PSU:
      • Corsair HX1200i
      • Case:
      • Corsair Obsidian 500D
      • Operating System:
      • Windows 10 Pro 64-bit
      • Monitor(s):
      • Samsung Odyssey G9
      • Internet:
      • 500Mbps BT FTTH

    Re: AMD Radeon RX 6900 XT

    Quote Originally Posted by ilh View Post
    Once it's in the case I never look at it again. Its job is to produce pretty, not be pretty.
    Why not both....?

    Quote Originally Posted by kalniel View Post
    I can't help but feel we've already hit most of the rasterised performance we need for a while
    I'd accept that as a premise assuming the cards produced frame rates high enough to match the upper end of monitor refresh rates at 4K, ideally with a much lower power envelope than the ones we're seeing from both AMD and Nvidia. We've had generations where they've raced to the lower end of TBP, but it's now crept up quite alarmingly. I really hope the RDNA3/4 architecture aims for a more refined approach akin to the Ryzen chiplet design: something scalable with good performance and an acceptable power envelope.

    Adding DXR into the mix has thrown things a little, and AMD are playing a bit of catch-up with Nvidia here, but I'm hopeful they'll keep momentum and we'll see improvements both in hardware and software (that second part is going to be key to compete). Regardless of how people feel about DLSS, I'd probably compare it to the previous progressive improvements we saw with the multitude of different AA techniques deployed by both Nvidia and AMD. AMD just need to get FidelityFX Super Resolution released so they can compete properly in this area; not having a software response available yet isn't really helping them.

  4. #35
    root Member DanceswithUnix's Avatar
    Join Date
    Jan 2006
    Location
    In the middle of a core dump
    Posts
    12,986
    Thanks
    781
    Thanked
    1,588 times in 1,343 posts
    • DanceswithUnix's system
      • Motherboard:
      • Asus X470-PRO
      • CPU:
      • 5900X
      • Memory:
      • 32GB 3200MHz ECC
      • Storage:
      • 2TB Linux, 2TB Games (Win 10)
      • Graphics card(s):
      • Asus Strix RX Vega 56
      • PSU:
      • 650W Corsair TX
      • Case:
      • Antec 300
      • Operating System:
      • Fedora 39 + Win 10 Pro 64 (yuk)
      • Monitor(s):
      • Benq XL2730Z 1440p + Iiyama 27" 1440p
      • Internet:
      • Zen 900Mb/900Mb (CityFibre FttP)

    Re: AMD Radeon RX 6900 XT

    Quote Originally Posted by CAT-THE-FIFTH View Post
    TAA is a low-performance-impact method developed for consoles, but it adds blurriness to the image. DLSS looks to be performing some sort of edge detection and local contrast enhancement/sharpening.
    I don't see why it would have to apply such old-skool techniques. Basically you are playing a deep-fake of the game, but as long as it is a good deep-fake then that's quite impressive.

    And yes, phones are using similar techniques: take a picture, apply a neural net to remove noise, and effectively apply some foundation to remove those annoying pimples and make your skin look smooth.

  5. #36
    Headless Chicken Terbinator's Avatar
    Join Date
    Apr 2009
    Posts
    7,670
    Thanks
    1,210
    Thanked
    727 times in 595 posts
    • Terbinator's system
      • Motherboard:
      • ASRock H61M
      • CPU:
      • Intel Xeon 1230-V3
      • Memory:
      • Geil Evo Corsa 2133/8GB
      • Storage:
      • M4 128GB, 2TB WD Red
      • Graphics card(s):
      • Gigabyte GTX Titan
      • PSU:
      • Corsair AX760i
      • Case:
      • Coolermaster 130
      • Operating System:
      • Windows 8.1 Pro
      • Monitor(s):
      • Dell Ultrasharp U2711H
      • Internet:
      • Virgin Media 60Mb.

    Re: AMD Radeon RX 6900 XT

    Quote Originally Posted by kompukare View Post
    But what about those who don't buy all the hype of DLSS 2.0?
    That Eurogamer article Death Stranding PC: how next-gen AI upscaling beats native 4K sounds impressive but they are comparing 1440P DLSS upscaled against native 4K with TAA. Not against native 4K without AA, or 4K with SMAA.

    So it is possible that DLSS just looks more pleasing to the eye than TAA.

    Point being: when using RT with DLSS 2.0 I wouldn't consider the resolution to be actually 4K or 8K any more, but would almost call them fake4K and fake8K.
    Whilst I agree with you, what we consider fake or not, and/or paramount to IQ, isn't important and quite frankly isn't considered by the developers. DLSS only works with TAA, so it is generally only compared against native + AA, because it isn't an on/off setting. Games and game engines are being designed with TAA as a core part of their pipeline, and that is only likely to increase going forward.

    Whilst I can't speak for everyone, the GPU and rendering developers I do follow on Twitter sing the praises of TAA and its effect on the rendering pipeline.

    Aside from this conversation, effects in games aren't rendered at full res a lot of the time anyway - it is always a cost/quality trade-off and it is clear the temporal/deep-learning solution is where everyone is going.
    Kalniel: "Nice review Tarinder - would it be possible to get a picture of the case when the components are installed (with the side off obviously)?"
    CAT-THE-FIFTH: "The Antec 300 is a case which has an understated and clean appearance which many people like. Not everyone is into e-peen looking computers which look like a cross between the imagination of a hyperactive 10 year old and a Frog."
    TKPeters: "Off to AVForum better Deal - £20+Vat for Free Shipping @ Scan"
    for all intents it seems to be the same card minus some guy's name on it and a shielded cover? with OEM added to it - GoNz0.

  6. #37
    Registered+
    Join Date
    Mar 2009
    Posts
    70
    Thanks
    3
    Thanked
    1 time in 1 post

    Re: AMD Radeon RX 6900 XT

    If you can hear coil whine, then there's no "potentially" involved in "potentially causing a high pitched sound"!

    Also, cutting back on performance to reduce coil whine defeats the object of a high performance card! I suppose one COULD argue that you only need to use it at the max when you're playing a game with headphones on, in which case the noise won't matter. But that seems like a cop-out for so many reasons.

    I wonder how big an issue this actually is? It would make me pause on pulling the trigger on that card.

  7. #38
    Banhammer in peace PeterB kalniel's Avatar
    Join Date
    Aug 2005
    Posts
    31,025
    Thanks
    1,871
    Thanked
    3,383 times in 2,720 posts
    • kalniel's system
      • Motherboard:
      • Gigabyte Z390 Aorus Ultra
      • CPU:
      • Intel i9 9900k
      • Memory:
      • 32GB DDR4 3200 CL16
      • Storage:
      • 1TB Samsung 970Evo+ NVMe
      • Graphics card(s):
      • nVidia GTX 1060 6GB
      • PSU:
      • Seasonic 600W
      • Case:
      • Cooler Master HAF 912
      • Operating System:
      • Win 10 Pro x64
      • Monitor(s):
      • Dell S2721DGF
      • Internet:
      • rubbish

    Re: AMD Radeon RX 6900 XT

    Quote Originally Posted by michaelg View Post
    If you can hear coil whine, then there's no "potentially" involved in "potentially causing a high pitched sound"!

    Also, cutting back on performance to reduce coil whine defeats the object of a high performance card! I suppose one COULD argue that you only need to use it at the max when you're playing a game with headphones on, in which case the noise won't matter. But that seems like a cop-out for so many reasons.

    I wonder how big an issue this actually is? It would make me pause on pulling the trigger on that card.
    They're not suggesting cutting back on performance, just adding a frame cap so the card isn't sitting there rendering 800fps on a menu, for example. Capping frames at, say, 200fps isn't going to cut back performance.
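
    A minimal sketch of that kind of frame cap in Python (the render_frame() callback is hypothetical); real games and drivers do this inside the engine or driver, but the idea is the same:

    ```python
    import time

    TARGET_FPS = 200                  # cap comfortably above the monitor's refresh rate
    FRAME_BUDGET = 1.0 / TARGET_FPS   # seconds allowed per frame

    def run_capped(render_frame):
        """Render frames, but never faster than TARGET_FPS."""
        while True:
            start = time.perf_counter()
            render_frame()            # hypothetical per-frame work
            # Sleep off whatever is left of the budget, so a trivial scene
            # (e.g. a menu) doesn't have the card spinning at 800+ fps.
            leftover = FRAME_BUDGET - (time.perf_counter() - start)
            if leftover > 0:
                time.sleep(leftover)
    ```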

  8. #39
    Moosing about! CAT-THE-FIFTH's Avatar
    Join Date
    Aug 2006
    Location
    Not here
    Posts
    32,039
    Thanks
    3,910
    Thanked
    5,224 times in 4,015 posts
    • CAT-THE-FIFTH's system
      • Motherboard:
      • Less E-PEEN
      • CPU:
      • Massive E-PEEN
      • Memory:
      • RGB E-PEEN
      • Storage:
      • Not in any order
      • Graphics card(s):
      • EVEN BIGGER E-PEEN
      • PSU:
      • OVERSIZED
      • Case:
      • UNDERSIZED
      • Operating System:
      • DOS 6.22
      • Monitor(s):
      • NOT USUALLY ON....WHEN I POST
      • Internet:
      • FUNCTIONAL

    Re: AMD Radeon RX 6900 XT

    Quote Originally Posted by DanceswithUnix View Post
    I don't see why it would have to apply such old-skool techniques. Basically you are playing a deep-fake of the game, but as long as it is a good deep-fake then that's quite impressive.

    And yes, phones are using similar techniques: take a picture, apply a neural net to remove noise, and effectively apply some foundation to remove those annoying pimples and make your skin look smooth.
    DLSS 2.0 is not the per-game upscaling of DLSS 1.0, it's a generalised upscaling, and it's far easier and cheaper to use old-school techniques to boost the image. What do you think "sharpening" is? The concept never existed in film photography, because it is local contrast enhancement, and human vision is very perceptive of such gradients. Sharpness in the film days was about detail resolved and was a lines-per-mm test, and lens sharpness was about how well a lens could replicate a reference image without optical issues. Digital sharpness is none of that.
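
    To illustrate the point that digital "sharpening" is just local contrast enhancement, here is a minimal sketch using a classic 3x3 sharpening kernel with NumPy and SciPy (both assumed to be available); it shows the general technique only, not anything DLSS is documented to do:

    ```python
    import numpy as np
    from scipy.ndimage import convolve

    # Classic 3x3 sharpening kernel: boost the centre pixel, subtract the
    # neighbours. This exaggerates local contrast around edges, which the
    # eye reads as "sharper" even though no new detail has been created.
    SHARPEN_KERNEL = np.array([[ 0, -1,  0],
                               [-1,  5, -1],
                               [ 0, -1,  0]], dtype=float)

    def sharpen(gray: np.ndarray) -> np.ndarray:
        """Sharpen a greyscale image given as a 2D array of 0-255 values."""
        out = convolve(gray.astype(float), SHARPEN_KERNEL, mode="nearest")
        return np.clip(out, 0, 255).astype(np.uint8)
    ```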

    Except they will never tell you exactly what is happening, so they can sneak these things in. I suspect a lot of these phones, etc., are doing such things somewhere in the image-processing pipeline. After all, phones can now literally run something like Lightroom, so it's probably quite easy to automate a set of presets which can be auto-applied on a per-scene basis.

    In fact, even 20 years ago, cameras were able to meter light differently depending on the scene. The Nikon F5 had thousands of stored scene presets, which it could auto-apply if a scene type was recognised.

    Even with phones it's more of the same: lots of sharpening, contrast enhancement, etc., and a sea of areas in between the edges with no real detail. If you have ever done noise reduction manually, you can see what they are doing. Having had to work with image processing in a previous job, and counting photography as a hobby, it makes me wonder how so many of the tech media appear to have very poor eyesight. I remember one so-called photography site which basically took the worst 35mm full-frame DSLR they could find with an ancient sensor, shut off all internal image processing, used a rubbish kit lens shot at a very narrow aperture, switched off IS, bumped up the ISO, etc., and shot handheld. Then they took a picture with a smartphone with NR on, a wide-open aperture, IS on, multi-shot averaging on, attached to a tripod, etc., and proclaimed how brilliant it was. Except, IIRC, the DSLR image still actually had more detail despite the oodles of noise.

    Essentially all this "AI" stuff in SoCs is just marketing speak for coprocessors which do certain specialised operations well. If anything, I remember that even years ago I knew some people who worked on machine learning (in imaging), and they got really annoyed every time people kept calling it AI, or some silly name.

    What I have an issue with is Nvidia and its rubbish "better than native" claims, which are an attempt to market their way out of their GPUs being mediocre at RT and having to drop image quality to get reasonable framerates.

    Which is exactly the same stuff ATI and Nvidia did years ago to boost their performance. It's Apple-level marketing that Steve Jobs would be proud of.

    Instead of saying "we have solid upscaling to improve FPS, with some reduction in image quality", it's now "our magical technique makes the image better than what the devs of the game actually made".

    It's like saying a 256kbps MP3 stream of a live performance is better than being at the actual live performance. But our reference happens to be the live TV broadcast, and it's still better than the live performance we are streaming!
    Last edited by CAT-THE-FIFTH; 09-12-2020 at 07:09 PM.

  9. #40
    Banhammer in peace PeterB kalniel's Avatar
    Join Date
    Aug 2005
    Posts
    31,025
    Thanks
    1,871
    Thanked
    3,383 times in 2,720 posts
    • kalniel's system
      • Motherboard:
      • Gigabyte Z390 Aorus Ultra
      • CPU:
      • Intel i9 9900k
      • Memory:
      • 32GB DDR4 3200 CL16
      • Storage:
      • 1TB Samsung 970Evo+ NVMe
      • Graphics card(s):
      • nVidia GTX 1060 6GB
      • PSU:
      • Seasonic 600W
      • Case:
      • Cooler Master HAF 912
      • Operating System:
      • Win 10 Pro x64
      • Monitor(s):
      • Dell S2721DGF
      • Internet:
      • rubbish

    Re: AMD Radeon RX 6900 XT

    Quote Originally Posted by CAT-THE-FIFTH View Post
    DLSS 2.0 is not the per-game upscaling of DLSS 1.0, it's a generalised upscaling
    I'd forgotten DLSS 2.0 used more generalised training; sadly, implementation is still per-game.

    Essentially all this "AI" stuff in SoCs is just marketing speak for coprocessors which do certain specialised operations well. If anything, I remember that even years ago I knew some people who worked on machine learning (in imaging), and they got really annoyed every time people kept calling it AI, or some silly name.
    *shrugs* I've been in the field several decades and it doesn't bother me. They are neural networks, which is one of the methods of performing machine learning. When applied to making decisions about an outcome you can call them classifiers or in more general terms anything which makes decisions which seem intelligent is AI. So AI applies just as well to decision trees, fuzzy logic, or any manner of things that simulate intelligence - games have been doing AI almost since they started being made.

    In terms of DLSS or image enhancements in phones, it's just some kind of rules-based classifier (is this pixel noise to be removed, or is it part of texture to be kept?) where the rules are encoded in a NN and thus able to capture more complex nuances than hard-coded rules. If you think it's clever then it's fine to call it AI.

  10. #41
    Moosing about! CAT-THE-FIFTH's Avatar
    Join Date
    Aug 2006
    Location
    Not here
    Posts
    32,039
    Thanks
    3,910
    Thanked
    5,224 times in 4,015 posts
    • CAT-THE-FIFTH's system
      • Motherboard:
      • Less E-PEEN
      • CPU:
      • Massive E-PEEN
      • Memory:
      • RGB E-PEEN
      • Storage:
      • Not in any order
      • Graphics card(s):
      • EVEN BIGGER E-PEEN
      • PSU:
      • OVERSIZED
      • Case:
      • UNDERSIZED
      • Operating System:
      • DOS 6.22
      • Monitor(s):
      • NOT USUALLY ON....WHEN I POST
      • Internet:
      • FUNCTIONAL

    Re: AMD Radeon RX 6900 XT

    Quote Originally Posted by kalniel View Post
    I'd forgotten DLSS 2.0 used more generalised training, sadly implementation is still per-game.

    *shrugs* I've been in the field several decades and it doesn't bother me. They are neural networks, which is one of the methods of performing machine learning. When applied to making decisions about an outcome you can call them classifiers or in more general terms anything which makes decisions which seem intelligent is AI. So AI applies just as well to decision trees, fuzzy logic, or any manner of things that simulate intelligence - games have been doing AI almost since they started being made.

    In terms of DLSS or image enhancements in phones it's just some kind of rules based classifier (is this pixel some noise to be removed or is it part of texture to be kept) where the rules are encoded in a NN and thus able to capture more complex nuances than hard rule coding. If you think it's clever then it's fine to call it AI.
    It's like saying a wheel is part of a car, hence a wheel should be called a car. Machine learning is itself a subset of the AI field, but cannot be defined as AI on its own. It's an easy term to throw around, but like a lot of buzzwords it's being overused for everything.

    The problem is some of these mechanisms such as fuzzy logic existed in consumer electronics decades ago. I remember it being used in marketing buzzspeak even in the 1990s. But they didn't call it AI.

    Also, most of the computing guys actually got annoyed and kept telling people it wasn't true AI. What you are describing in many consumer electronics is a system which looks "intelligent" but really isn't, and could more closely be described as a "virtual intelligence". It can only do specific operations within certain parameters, which is realistically what all these consumer products are implementing. They are not showing true adaptation.

    Even now when I catch up with some of them, we still have a laugh at all the stupid buzzwords. It's not only AI, but words like Quantum, Nano, etc. Apparently the public likes such buzzwords, so now grant applications and company press briefings are as much about covering the right number of buzzwords as they are about the work.

    None of these systems described in phones or games are "intelligent"; they still require a lot of end-user feedback and are very limited in functionality. So calling them AI is really just marketing speak for very limited systems which are running software approximations on hardware used to accelerate certain decisions.

    A GPU or iPhone isn't suddenly "learning" or "adapting" its behaviour; all it's doing is running algorithms which have been ascertained elsewhere. It can't moderate or change its behaviour beyond fixed parameters. A truly "intelligent" or "adaptive" system would modulate its behaviour past those limited parameters. It's just an automaton.

    Yet many of these mechanisms are not exactly new, and were not called AI 20-30 years ago.
    Last edited by CAT-THE-FIFTH; 09-12-2020 at 07:50 PM.

  11. #42
    Banhammer in peace PeterB kalniel's Avatar
    Join Date
    Aug 2005
    Posts
    31,025
    Thanks
    1,871
    Thanked
    3,383 times in 2,720 posts
    • kalniel's system
      • Motherboard:
      • Gigabyte Z390 Aorus Ultra
      • CPU:
      • Intel i9 9900k
      • Memory:
      • 32GB DDR4 3200 CL16
      • Storage:
      • 1TB Samsung 970Evo+ NVMe
      • Graphics card(s):
      • nVidia GTX 1060 6GB
      • PSU:
      • Seasonic 600W
      • Case:
      • Cooler Master HAF 912
      • Operating System:
      • Win 10 Pro x64
      • Monitor(s):
      • Dell S2721DGF
      • Internet:
      • rubbish

    Re: AMD Radeon RX 6900 XT

    Quote Originally Posted by CAT-THE-FIFTH View Post
    The problem is some of these mechanisms such as fuzzy logic existed in consumer electronics decades ago. I remember it being used in marketing buzzspeak even in the 1990s.
    Neural networks have been around for decades too. These aren't new.

    And we're not talking Turing level intelligence here, there's no IQ threshold that it has to pass to be considered AI.

    Machine learning does imply some kind of adaptability (resulting from training/feedback), but AI might or might not. I don't see any distinction between virtual intelligence and artificial intelligence, they're synonymous in my book.

    A GPU or iPhone isn't suddenly "learning" or "adapting" its behaviour; all it's doing is running algorithms which have been ascertained elsewhere
    That's how a lot of machine learning applications work - you build/train the model in one place then apply it somewhere else/to new data. In this case the model is built in the cloud by Nvidia, then you download it to your GPU and run it as a classifier locally.
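
    A toy sketch of that train-elsewhere/run-locally split, in Python with NumPy (assumed available); the tiny logistic classifier and the "model.npy" filename are purely illustrative, not how Nvidia's pipeline actually works:

    ```python
    import numpy as np

    # --- "In the data centre": fit a tiny logistic classifier offline ---
    def train(features: np.ndarray, labels: np.ndarray,
              lr: float = 0.1, epochs: int = 200) -> np.ndarray:
        weights = np.zeros(features.shape[1])
        for _ in range(epochs):
            preds = 1.0 / (1.0 + np.exp(-features @ weights))
            weights += lr * features.T @ (labels - preds) / len(labels)
        return weights

    # --- "On the device": shipped weights are applied, never updated ---
    def classify(feature_vector: np.ndarray, weights: np.ndarray) -> bool:
        return (1.0 / (1.0 + np.exp(-feature_vector @ weights))) > 0.5

    # Illustrative flow:
    #   weights = train(training_data, training_labels)  # done once, remotely
    #   np.save("model.npy", weights)                    # shipped in a driver/app update
    #   w = np.load("model.npy")
    #   classify(new_sample, w)                          # no learning happens locally
    ```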

  12. #43
    Moosing about! CAT-THE-FIFTH's Avatar
    Join Date
    Aug 2006
    Location
    Not here
    Posts
    32,039
    Thanks
    3,910
    Thanked
    5,224 times in 4,015 posts
    • CAT-THE-FIFTH's system
      • Motherboard:
      • Less E-PEEN
      • CPU:
      • Massive E-PEEN
      • Memory:
      • RGB E-PEEN
      • Storage:
      • Not in any order
      • Graphics card(s):
      • EVEN BIGGER E-PEEN
      • PSU:
      • OVERSIZED
      • Case:
      • UNDERSIZED
      • Operating System:
      • DOS 6.22
      • Monitor(s):
      • NOT USUALLY ON....WHEN I POST
      • Internet:
      • FUNCTIONAL

    Re: AMD Radeon RX 6900 XT

    Quote Originally Posted by kalniel View Post
    Neural networks have been around for decades too. These aren't new.

    And we're not talking Turing level intelligence here, there's no IQ threshold that it has to pass to be considered AI.

    Machine learning does imply some kind of adaptability (resulting from training/feedback), but AI might or might not. I don't see any distinction between virtual intelligence and artificial intelligence, they're synonymous in my book.

    They're classifiers, which is how a lot of machine learning applications work - you build/train the model in one place then apply it somewhere else/to new data. In this case the model is built in the cloud by Nvidia, then you download it to your GPU and run it as a classifier locally.
    I know, and your description of them as AI is not entirely correct (IMHO, OFC). Machine learning is a subset of the AI field; it isn't AI on its own.

    The same goes for AI vs VI: they are absolutely not the same. One describes a truly adaptive system, the other a system which can never truly adapt. I know they are classifiers, but being a classifier does not make something an AI.

    In most consumer products it's still a form of automaton: it can't move beyond the limits of the algorithm it runs, and it needs human intervention. So there is "no AI" running on these phones or GPUs. It's not even "learning"; it's running a pre-defined set of commands which have been generated elsewhere. The machine learning isn't happening on the actual device; the device is dumb. So the marketing implying these devices are using "AI" is basically a con.

    Even the whole concept of training, WRT what Nvidia is doing, is very limited too. I knew people doing similar stuff years ago on a more limited scale, yet I never saw one person call it AI, outside top-level people who wanted to "jazz" things up so they could get more funding/investment. The people actually doing the nitty-gritty stuff didn't like the terminology.

    Edit!!

    https://www.forbes.com/sites/cogniti...ing-really-ai/

    On the flip side, simply automating things doesn’t make them intelligent. It may take time and effort to train a computer to understand the difference between an image of a cat and an image of a horse or even between different species of dogs, but that doesn’t mean that the system can understand what it is looking at, learn from its own experiences, and make decisions based on that understanding. Similarly, a voice assistant can process your speech when you ask it “What weighs more: a ton of carrots or a ton of peas?”, but that doesn’t mean that the assistant understands what you are actually talking about or the meaning of your words. So, can we really argue that these systems are intelligent?

    In a recent interview with MIT Professor Luis Perez-Breva, he argues that while these various complicated training and data-intensive learning systems are most definitely Machine Learning (ML) capabilities, that does not make them AI capabilities. In fact, he argues, most of what is currently being branded as AI in the market and media is not AI at all, but rather just different versions of ML where the systems are being trained to do a specific, narrow task, using different approaches to ML, of which Deep Learning is currently the most popular. He argues that if you’re trying to get a computer to recognize an image just feed it enough data and with the magic of math, statistics and neural nets that weigh different connections more or less over time, you’ll get the results you would expect. But what you’re really doing is using the human’s understanding of what the image is to create a large data set that can then be mathematically matched against inputs to verify what the human understands.
    This is the chap they talked to:
    https://entrepreneurship.mit.edu/pro...s-perez-breva/

    Anyway, we do seem to have diverging viewpoints on this (and it's probably beyond the scope of this thread), so I will leave it at that!
    Last edited by CAT-THE-FIFTH; 09-12-2020 at 08:20 PM.

  13. #44
    root Member DanceswithUnix's Avatar
    Join Date
    Jan 2006
    Location
    In the middle of a core dump
    Posts
    12,986
    Thanks
    781
    Thanked
    1,588 times in 1,343 posts
    • DanceswithUnix's system
      • Motherboard:
      • Asus X470-PRO
      • CPU:
      • 5900X
      • Memory:
      • 32GB 3200MHz ECC
      • Storage:
      • 2TB Linux, 2TB Games (Win 10)
      • Graphics card(s):
      • Asus Strix RX Vega 56
      • PSU:
      • 650W Corsair TX
      • Case:
      • Antec 300
      • Operating System:
      • Fedora 39 + Win 10 Pro 64 (yuk)
      • Monitor(s):
      • Benq XL2730Z 1440p + Iiyama 27" 1440p
      • Internet:
      • Zen 900Mb/900Mb (CityFibre FttP)

    Re: AMD Radeon RX 6900 XT

    Quote Originally Posted by CAT-THE-FIFTH View Post
    What do you think "sharpening" is??
    Now you're making me feel really old. My first job in the 1980s was implementing machine vision using really cheap cameras, so I have a fairly good idea of how you convolve an image to sharpen it, as I have had to implement it.

    Anyway my point was that you can train a network to generate an image as "sharp" as you want. Why would you train a network to re-draw the screen as anything other than what you want it to finally look like?

    Edit: I think comparing DLSS to deep-fake technology is pretty valid. It is scary what people can knock out with deep-fake programs; the techniques clearly work.

  14. #45
    Moosing about! CAT-THE-FIFTH's Avatar
    Join Date
    Aug 2006
    Location
    Not here
    Posts
    32,039
    Thanks
    3,910
    Thanked
    5,224 times in 4,015 posts
    • CAT-THE-FIFTH's system
      • Motherboard:
      • Less E-PEEN
      • CPU:
      • Massive E-PEEN
      • Memory:
      • RGB E-PEEN
      • Storage:
      • Not in any order
      • Graphics card(s):
      • EVEN BIGGER E-PEEN
      • PSU:
      • OVERSIZED
      • Case:
      • UNDERSIZED
      • Operating System:
      • DOS 6.22
      • Monitor(s):
      • NOT USUALLY ON....WHEN I POST
      • Internet:
      • FUNCTIONAL

    Re: AMD Radeon RX 6900 XT

    Quote Originally Posted by DanceswithUnix View Post
    Now you're making me feel really old. My first job in the 1980's was implementing machine vision using really cheap cameras, so I have a fairly good idea of how you convolve an image to sharpen it as I have had to implement it

    Anyway my point was that you can train a network to generate an image as "sharp" as you want. Why would you train a network to re-draw the screen as anything other than what you want it to finally look like?
    You would also know that it's kind of a cheat in modern digital output, as it's just utilising aspects of our vision, just like our green sensitivity (Bayer array). Also, I might not have worked on it during the 80s, but I did a reasonable amount of work a while back on image-analysis automation, on certain rather microscopic things!

    And you made me feel even older, as I can remember when "digital" was something sci-fi and autofocus was a hit-and-miss technology!

    Anyway, TBH, why even bother to train it that much, if Nvidia themselves admitted that the whole per-game training approach was a failure and it's now much more generalised? Why? Because of time constraints.

    There's nothing stopping them pushing more generic upscaling algorithms (which cuts down on training time), then amping up sharpening, contrast, etc., and leaving it at that. Remember, PCMR mocked the consoles for their more generic upscaling techniques, yet Nvidia is apparently moving closer towards that.

    Sure, they can do what you are saying, but again Nvidia themselves have kind of admitted time is a problem here, and it's not like most of the target audience will bother to examine any claims in detail. Nvidia is, after all, another Apple. It works incredibly well with phones too: some tests were done by a YouTube channel where smartphones with more natural image processing appeared to lose out to those with more exaggerated processing. So I would argue it's quite possible for Nvidia, and indeed AMD, to take some shortcuts here and pass stuff off as something it isn't.

    As long as they use enough buzzwords (maybe they will add Nano and Quantum too?) and say it's "better than reality", it's all good, right?
    Last edited by CAT-THE-FIFTH; 09-12-2020 at 09:06 PM.

  15. #46
    root Member DanceswithUnix's Avatar
    Join Date
    Jan 2006
    Location
    In the middle of a core dump
    Posts
    12,986
    Thanks
    781
    Thanked
    1,588 times in 1,343 posts
    • DanceswithUnix's system
      • Motherboard:
      • Asus X470-PRO
      • CPU:
      • 5900X
      • Memory:
      • 32GB 3200MHz ECC
      • Storage:
      • 2TB Linux, 2TB Games (Win 10)
      • Graphics card(s):
      • Asus Strix RX Vega 56
      • PSU:
      • 650W Corsair TX
      • Case:
      • Antec 300
      • Operating System:
      • Fedora 39 + Win 10 Pro 64 (yuk)
      • Monitor(s):
      • Benq XL2730Z 1440p + Iiyama 27" 1440p
      • Internet:
      • Zen 900Mb/900Mb (CityFibre FttP)

    Re: AMD Radeon RX 6900 XT

    Quote Originally Posted by CAT-THE-FIFTH View Post
    Anyway, TBH, why even bother to train it that much,
    Because they invested all that time and money into tensor units and need to justify them?

    I'm not saying this is the right way to do it; heck, the SD-to-1080p upscaler in my really old Samsung TV is amazing using just temporal information from previous frames, but that stops working if you pause the source material and give it nothing to work with. It also isn't enough to make me part with money, but I do think DLSS is a neat idea with a possibility of further improvement over time.
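
    A very rough sketch of that temporal idea in Python with NumPy (assumed available); real TV and GPU upscalers also reproject history using motion vectors, which is left out here:

    ```python
    import numpy as np

    ALPHA = 0.1  # weight of the newest frame; the rest comes from history

    def temporal_accumulate(history: np.ndarray, new_frame: np.ndarray) -> np.ndarray:
        """Blend the new frame into an accumulated history buffer.

        With slightly different samples arriving every frame, the running
        average builds up detail over time. If the source is paused, every
        frame is identical, so nothing new is integrated -- which is why
        such upscalers fall apart on a static image.
        """
        return (1.0 - ALPHA) * history + ALPHA * new_frame

    # Illustrative usage: seed the history with the first frame, then fold
    # in each new frame as it arrives.
    # history = frames[0].astype(float)
    # for frame in frames[1:]:
    #     history = temporal_accumulate(history, frame.astype(float))
    ```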

  16. #47
    Senior Member
    Join Date
    Jun 2013
    Location
    ATLANTIS
    Posts
    1,207
    Thanks
    1
    Thanked
    28 times in 26 posts

    Re: AMD Radeon RX 6900 XT

    ....in the distant future there will be no need of up-scaling because your mind will be part of the compute stack (MATRIX)

  17. #48
    Senior Member
    Join Date
    Dec 2013
    Posts
    3,526
    Thanks
    504
    Thanked
    468 times in 326 posts

    Re: AMD Radeon RX 6900 XT

    Quote Originally Posted by lumireleon View Post
    ....in the distant future there will be no need of up-scaling because your mind will be part of the compute stack (MATRIX)
    There's no need to up-scale your mind, but I need all the help I can get.
