The first custom GTX 1080 Ti put through the wringer.
So the reference card throttled back to a system-wide power draw of 260W in the stress test as the temps got toasty, but as this card doesn't throttle back it's pulling almost another 100W!
Really shows the extra power needed to get the last little bits of frequency out of the card!
Would love to see what one of these could do under an AIO water cooler or as part of a fully water-cooled rig.
I know I really wanted to see what Vega had to offer before splashing the cash, but I'm getting very impatient and almost tempted to pull the trigger on a 1080 Ti instead.
I still only regard these as the ultimate cards for 1440p gaming.
Some games may be playable at high 4K frame rates, but the fact they dip below 60fps in many suggests they are at the limit and have no real headroom for future, more demanding titles.
Buying any card for 4K gaming is still a mug's game; I've always said that true 4K cards are still one or two generations away.
Or they're already available but just being held back to squeeze more coin out of people. It doesn't matter to me whether it's an Nvidia or AMD card, whichever is best, but I'd have to jump at least one more generation to even consider replacing my current 1080 with something better than a 1080 Ti.
Agree with bagpuss. These are 1440p cards; we won't see REAL 4K (meaning nothing having to be turned down, the game fully seen as the dev intended) probably until 10nm (well, maybe 12nm Volta I guess, if that process is REAL).
No matter though; contrary to what review sites pretend, barely anyone is REALLY playing on a 4K monitor. 95% of us are still using 1920x1200 and below, and worse, half of us are on Win7 (which can't use DX12), yet they benchmark in Win10 for less than 25% of their audience. ROFL.
"real 4K"... sorry but this is chasing a rainbow. The proportion of the market that want 60Hz minimum at 4K is itself a fraction of the overall market for 4K, and a lot of enthusiasts are now crying for high-frequency monitors aswell, so next the bleeding edge demand will be for 120Hz or whatever.
What you're missing is that as the cards get faster, the game devs make the games more complicated; they use up the extra GPU power with more detail, newer effects, etc. There's also the issue of lazy coding: devs letting raw GPU power deliver higher frame rates instead of simply writing better code.
This is why an unspoken section of the market likes to buy high-end cards: not for massive frame rates on the latest titles, but so they can run slightly earlier games they're still playing maxed out, with a genuine 60fps minimum, typically at more moderate resolutions. In some cases, moving up to 4K also regains performance, since AA can often be turned off.