Originally Posted by
DanceswithUnix
I'm not disagreeing, but I do think it is a bit more complex than that. What if your Blu-ray player were programmed with the 4K version of that DVD, so that when it saw a DVD frame it could match it to the corresponding 4K frame and display that instead? Are you now watching a 4K movie? It will look like you are, because the player has extra information to draw upon: its local 4K copy of the movie. Now what if you compress that 4K version to save space? That's a problem we already have; lossy compression means we aren't actually watching 1080p or 4K content, but something that aspires to that resolution. DLSS uses deep learning to handle the frame matching and the compression, and to cope with the fact that games don't show fixed frames but a slightly different set of frames for each player. It is clearly lossy, but it is also clever enough that I don't think it can simply be dismissed.
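To make the analogy concrete, here's a toy sketch of the "lookup" idea in Python (all names are hypothetical, and frames are just byte strings): the player stores a table mapping low-res frames to pre-rendered high-res ones. Real DLSS is a learned neural model, not a literal table; this only illustrates the point about drawing on extra stored information.

```python
import hashlib

def frame_key(lowres_frame: bytes) -> str:
    """Identify a low-res frame by a hash of its content."""
    return hashlib.sha256(lowres_frame).hexdigest()

class LookupUpscaler:
    """Toy player that 'upscales' by recalling a stored 4K frame."""

    def __init__(self):
        # hash of low-res frame -> stored high-res frame
        self.table = {}

    def learn(self, lowres_frame: bytes, highres_frame: bytes) -> None:
        self.table[frame_key(lowres_frame)] = highres_frame

    def upscale(self, lowres_frame: bytes) -> bytes:
        # Requires an exact match; games render slightly different
        # frames per player, which is exactly why DLSS needs a model
        # that generalizes rather than a lookup table like this.
        return self.table.get(frame_key(lowres_frame), lowres_frame)

player = LookupUpscaler()
player.learn(b"dvd-frame-001", b"4k-frame-001")
print(player.upscale(b"dvd-frame-001"))  # known frame -> stored 4K version
print(player.upscale(b"dvd-frame-999"))  # unseen frame -> falls back to input
```

The fallback branch is the interesting part: a table can only ever replay frames it has seen, while a deep-learned upscaler can produce plausible detail for frames it hasn't.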
Having said that, it isn't going to make me go out and buy an Nvidia card.