Either you've completely ignored what's been said here or you genuinely believe that every GPU must be priced based on its theoretical performance, at maximum quality, in games released years after the card itself. That's crazy. You're talking to someone who regularly plays GTA V at 1440p on a 4GB card - GTA isn't a new game, but it's a worst-case scenario for Fury for a variety of reasons, and I'm averaging 80-100 FPS on high settings with a VRAM capacity you're suggesting is "nowhere near enough".
If you only play games with ALL of 8x MSAA/SSAA, maxed-out textures, bloom, god rays, motion blur, and so on, then sure, 6GB isn't enough even at 1080p. But you're claiming games are unplayable otherwise, which is a fallacy. Again, turning everything on is a premium privilege reserved for premium price tiers - not a 1660 Ti, not an RTX 2060, but an RTX 2080 or better. That's why those cards get the memory.
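To put a rough number on why those settings matter, here's a back-of-envelope sketch of what just the render targets cost at 1440p with 8x MSAA. This assumes RGBA8 color and a 32-bit depth/stencil buffer (common but not universal formats); real engines add G-buffers, shadow maps, and driver overhead on top.

```python
# Back-of-envelope estimate of render-target VRAM at 1440p with 8x MSAA.
# Assumes RGBA8 color (4 bytes/sample) and D24S8 depth (4 bytes/sample);
# actual usage varies by engine and API.

WIDTH, HEIGHT = 2560, 1440
MSAA_SAMPLES = 8
BYTES_PER_COLOR_SAMPLE = 4   # RGBA8
BYTES_PER_DEPTH_SAMPLE = 4   # D24S8

pixels = WIDTH * HEIGHT
color_mib = pixels * MSAA_SAMPLES * BYTES_PER_COLOR_SAMPLE / 2**20
depth_mib = pixels * MSAA_SAMPLES * BYTES_PER_DEPTH_SAMPLE / 2**20

print(f"8x MSAA color buffer:  {color_mib:.0f} MiB")   # ~113 MiB
print(f"8x MSAA depth buffer:  {depth_mib:.0f} MiB")   # ~113 MiB
print(f"Total (before assets): {color_mib + depth_mib:.0f} MiB")
```

Even maxed like this, the framebuffers come to a few hundred MiB; the bulk of VRAM goes to textures and other assets, which is exactly what "maxed-out textures" drags in. Dial those settings back and a 4-6GB card copes fine.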
Why? Aside from the fact that this card would work perfectly well at less-than-extreme quality settings, who exactly is going to stop Nvidia, AMD, or anyone else from setting limitations on their cards as they see fit, at pricing tiers they choose? These companies exist to make money, not to always deliver the perfect product. You must know that low-end cards have traditionally had variants with ridiculous memory capacities they could never realistically use (with bloated price tags, no less), and yet you haven't complained about those.
Prove to us that $200 three years ago bought you the same performance, brand new, that $200 buys you now.