Originally Posted by hexus
Primed for superior QHD gaming, the company says.
In the absence of a competitor that has enough VRAM and decent ray-tracing performance at 1440p, then probably, yes. It seems decent ray-tracing is still at least a generation away for these mid-range cards from either team, so you might as well focus on rasterisation, and in that respect this card appears to hit the mark. Reviews will hopefully tell all, however.
Not that you can get one for MSRP but, if MSRP is anything to go by, $20 less than a 3070 doesn't seem great value. Then again, IDK what sort of price people would put on having a card that's capable of ray-tracing.
Still, it's good to know reviews will be going up the day before launch, so people won't be buying blind.
Way too expensive IMHO, especially as UK street prices won't be anywhere near what AMD purports them to be. Also consider this - the 5700XT RRP was $400, so that is a straight 20% increase in price at one tier.
So that will make the RX6700 6GB (apparently it has less VRAM) around £360, which is close to an RTX3060TI's RRP. That means even if it is slightly higher in rasterised performance, it will have less VRAM too.
Another issue is the TDP. The RX6800 16GB is 230W, and the RX6700XT is 225W. This really indicates to me the RX6700XT is being pushed very hard to try and compete with the RTX3070, so I wonder if it will actually be worse overall?
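A quick back-of-the-envelope check of that 20% figure; the $479 RRP is AMD's announced US price, while the exchange rate and VAT assumption below are purely illustrative guesses, not a street-price prediction:

```python
# Rough sanity check of the tier-on-tier price jump (illustrative, not a price prediction).
rx5700xt_rrp = 400      # RX 5700 XT launch RRP (USD)
rx6700xt_rrp = 479      # RX 6700 XT announced RRP (USD)

increase = (rx6700xt_rrp - rx5700xt_rrp) / rx5700xt_rrp
print(f"Tier-on-tier increase: {increase:.0%}")                  # ~20%

# Very rough UK figure: USD RRP converted at an assumed rate, plus 20% VAT.
usd_to_gbp = 0.72                                                # assumed exchange rate
print(f"Rough UK RRP: £{rx6700xt_rrp * usd_to_gbp * 1.2:.0f}")   # ~£414
```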
Hmmm, 230W for this card. I find the 170W of my RX-580 too high due to the noise from my uATX cube, so much so that I've limited it to 1215 MHz and undervolted it to keep it below 100 watts. These 7/8nm cards have all been disappointing so far TBH.
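For anyone wondering why a modest clock cap plus an undervolt drops an RX 580 from 170W to around 100W, here's a first-order sketch using the usual dynamic-power rule of thumb (power roughly proportional to frequency times voltage squared). Only the 1215 MHz cap comes from the post above; the stock figures and undervolt value are assumed for illustration:

```python
# First-order dynamic power scaling: P ~ f * V^2 (illustrative figures, not measurements).
stock_mhz, stock_v, stock_w = 1340, 1.15, 170   # rough RX 580 stock clock/voltage/board power (assumed)
capped_mhz, capped_v = 1215, 0.95               # 1215 MHz cap from the post; undervolt value assumed

scale = (capped_mhz / stock_mhz) * (capped_v / stock_v) ** 2
print(f"Estimated board power: ~{stock_w * scale:.0f} W")   # ~105 W, roughly the claimed ballpark
```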
"In a perfect world... spammers would get caught, go to jail, and share a cell with many men who have enlarged their penises, taken Viagra and are looking for a new relationship."
From what I gather the 6700 XT is based on new silicon, so I'm not sure how useful trying to draw a comparison would be. Then again, IDK what new silicon means exactly; I assume it's a different version, maybe a performance version given the higher clock speeds.
Edit: Just checked and I think you may have the 6800 TDP wrong; the article lists it as 250W and the 6700XT as 230W. Granted, relatively it seems to be using more power, what with fewer bits & bobs.
Edit, edit: Oh bother, you corrected my first edit in your answer, sry.
Last edited by Corky34; 03-03-2021 at 06:35 PM.
I'd actually be interested to see how this card fares against the previous generation of Nvidia cards, specifically the RTX 2080, and even more specifically in ray-tracing performance (which I had no issues with on the 2080). Personally I think Nvidia have a solid lead on RT features, but perhaps without that, all of these cards are a lot closer to each other this generation. It'll be interesting to see how long it takes for AMD to catch up with the lead Nvidia has in RT performance.
On the power consumption aspect, I completely agree. Leaving these cards unleashed consumes a lot of additional power that a lot of people may not require (depending on monitor refresh rates). I deliberately limit my frame rates on my 3090, which makes a massive difference to power consumption but little to no difference to the playability of my games.
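As a rough illustration of why a frame cap saves so much power: when the GPU is the limiter, board power above the idle floor scales roughly with the frames it actually renders. All numbers below are assumed for the sake of the sketch, not measurements from a real 3090:

```python
# Illustrative frame-cap saving (assumed numbers, not measurements).
uncapped_fps, uncapped_w = 240, 350   # assumed uncapped frame rate and board power
idle_floor_w = 40                     # assumed idle/display power floor
cap_fps = 144                         # e.g. matching a 144 Hz monitor

dynamic_w = uncapped_w - idle_floor_w
capped_w = idle_floor_w + dynamic_w * (cap_fps / uncapped_fps)
print(f"~{capped_w:.0f} W capped vs {uncapped_w} W uncapped")   # ~226 W vs 350 W
```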
I think it's using leaky GPUs clocked to the edge, so AMD can somehow compete with the RTX3070 and justify the price hike. It's really weird that the RX6800 has a 250W TDP with 60CUs, 16GB of VRAM and a larger 256-bit bus, while the RX6700XT, with 40CUs, a narrower bus, fewer RAM chips, etc., has a 225W TDP.
Don't the 6800 and 6800XT both supposedly overclock like a bat out of hell? So maybe this 6700XT is already clocked to the limit to try to make up for the missing 20CUs. It's bad value either way, because for not much more you could (on paper) grab a 6800, while the 3070 and 3060Ti could be better value overall.
AMD's history suggests they're pumping voltages up to 11 to increase yields (and overcompensate for not-so-great PSUs out there).
The RX6800 at least looks like it is artificially limited in the BIOS. My main concern is that the RX6700 will be a 6GB GPU, if the RX6700XT rumours were correct. That means it might be slightly faster than an RTX3060TI for RTX3060TI-level RRPs, but an RTX3060TI has more VRAM and better RT performance.
It appears AMD is now the premium brand, so if mere plebs like me want a GPU, then Nvidia is probably going to be a better choice. A bit like their Zen3 CPUs, where Intel Comet Lake is now the budget choice (faster than Zen2).
My even bigger concern is how much faster these GPUs are going to be than a PS5 or Xbox Series X GPU.
Last edited by CAT-THE-FIFTH; 03-03-2021 at 07:00 PM.
Faster?
Navi 22 is 2,560 shaders, Xbox Series X is 3,328 shaders, PS5 is 2,304 shaders.
On all the other metrics the consoles are pretty close too, with Navi 22 mostly losing to the Xbox Series X.
TPU have most of the specs in their database:
https://www.techpowerup.com/gpu-spec...es-x-gpu.c3482
https://www.techpowerup.com/gpu-spec...on-5-gpu.c3480
https://www.techpowerup.com/gpu-spec...-6700-xt.c3695
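For a rough paper comparison, theoretical FP32 throughput (shaders × clock × 2 FLOPs per clock, using the publicly listed boost/fixed clocks) puts all three in the same ballpark; paper TFLOPS obviously ignore Infinity Cache, memory bandwidth and console-level optimisation:

```python
# Theoretical peak FP32 = shaders * clock * 2 FLOPs/clock (paper numbers only).
gpus = {
    "RX 6700 XT (Navi 22)": (2560, 2.581e9),  # up to ~2.58 GHz boost
    "Xbox Series X":        (3328, 1.825e9),  # fixed 1.825 GHz
    "PS5":                  (2304, 2.23e9),   # up to 2.23 GHz (variable)
}
for name, (shaders, clock) in gpus.items():
    print(f"{name}: ~{shaders * clock * 2 / 1e12:.1f} TFLOPS")
```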
Price is almost the same, but with the consoles you get Zen2 CPUs plus storage etc.
Yes, they are loss leaders sold at below cost (it's not just about the BoM, as Sony and Microsoft also paid for a lot of R&D and development upfront), but for AMD the margins must be tiny, plus the consoles are eating almost all the TSMC wafers.
This is my issue: I'm not going to pay £350 for a GPU which isn't better than the PS5/Xbox, and I don't want to spend more than £350 on a GPU! That said, I did say I would never pay more than £300 for a CPU, but damn, that 5800X at £375 - I couldn't say no in today's market...
I think Cat has it right about the high-frequency but slightly leakier chips - the power/cooling budget for these card GPUs means they can go for frequency, and they have if they're talking near 2.5GHz clocks, so I think it'll beat out both consoles (not accounting for close-to-the-metal console programming, however). For that reason I also think they've been smart and are using chips that wouldn't otherwise go in a console (and likewise the consoles are using chips that don't make the card's frequency target).
While probably priced accordingly based on PC GPU performance, I can't help feeling this is too expensive for what it is, even if we assume for a few seconds that it will retail anywhere near the RRP.
Last edited by LSG501; 04-03-2021 at 12:21 AM.