So this isn't Navi, it's Vega on 7nm, correct?
I guess it depends on what your priorities are. Nvidia have real-time ray tracing, but to use it you currently may need to lower the resolution; AMD seem to have better performance in Vulkan games, though maybe a bit thirsty; and at 4K it looks like a draw. So it comes down to what you value more: decent 4K gaming and higher FPS in Vulkan titles, or decent 4K gaming and lower FPS (because of the extra eye candy) in RTX titles.
If I was looking to game at 4K I'd probably go for AMD, as having to lower the resolution, or cut my FPS in half, on a 4K monitor would be a PITA.
If I was looking to game at a lower resolution I'd probably go for Nvidia, as I'm not sure going from 100+ FPS to 60+ FPS would be that noticeable.
Sounds like they are pushing this card primarily as a low-budget compute card, with gaming as a bonus feature. This is probably just a stop-gap to raise some funds prior to Navi, or whatever comes next.
For anyone using their PC for production (commercial OR amateur) the Radeon VII is a godsend. Double the memory of a 2080 at the same price AND higher performance is genuinely amazing. Can't wait to see GN do benchmarks for this thing
Forget mindshare, it's pointless to chase that. CHASE INCOME (net!). I wish they had gone with GDDR5X (cheapest) or GDDR6, instead of killing their card YET AGAIN with HBM/2/3. How many generations will we go where they can't make money because of the memory chosen? Nvidia went GDDR5X last gen and GDDR6 this gen, and the 2080 Ti is fine! Why HBM for a card that isn't even taking on the KING of the hill? WASTED NET INCOME. Stupid.

Only 1.5% of players play at 4K. Again, it's stupid to even consider these users today. Make more margin by recognising your audience and building a card that fits those users (1080p/1440p is what we use, and resolutions massively below these two). Even 1440p only has a 3.5% market share; it's almost not worth chasing them either. 4K benchmarks are a waste of time; quit using them as a claim to fame for GPU launches. Make cards that play great at the resolutions people are USING. Then you won't have HBM production problems limiting sales, no HBM COST problems killing your margins, etc. HBM is only worth it in pro/server cards, where margins are ridiculous and the performance is actually needed all day (and thus worth the cost).
QUESTION: Is Nvidia in trouble for pushing too far on the deep learning 'MARKETING' while custom processors from the competition (Google, Intel etc.) attack their margins? "In terms of raw performance on ResNet-50, four TPUv2 chips (one Cloud TPU) and four V100 GPUs are equally fast (within 2% of each other) in our benchmarks. We will likely see further optimizations in software (e.g., TensorFlow or CUDA) that improve performance and change this" ...Article SOURCED FROM: https://blog.riseml.com/comparing-google-tpuv2-against-nvidia-v100-on-resnet-50-c2bbb6a51e5e
I expect the fact they are now selling this as a consumer card when before it was only for server use means there aren't HBM production problems any more.
That, and the fact that the 2060 with 6GB of GDDR6 will be selling at about the same price and performance as the old Vega 56 cards with 8GB of HBM2.
Blimey, makes the 2080 appear like good value! Fewer features, same speed, same price? Even if they'd undercut it by £50, it would be a little better value.
The bit that made me sad was the "same power envelope", as the Vega 64 presumably, so this thing is going to use loads of power and sound like a vacuum cleaner.
RTX 2080 power around 225W:
https://www.tomshardware.com/reviews...n,5809-10.html
Vega 64 power around 275W:
https://www.tomshardware.co.uk/asus-...w-34379-4.html
Well, let's all calm down and look at what we know:
- The new card, the Radeon VII, has a higher peak clock (1800MHz vs 1536MHz), double the number of ROPs (128 vs 64, a known bottleneck on Vega 64), double the memory bandwidth, and 16GB vs 8GB. So there is definite improvement over the older card.
- 16GB of RAM is more useful than RT cores at this point, especially if you play at 4K.
- I hope everyone understands that there is no fully real-time ray-traced game engine, or game for that matter. It is just some small parts of the image (mostly shadows and reflections), and for that you pay with a much lower frame rate. This tech is not quite there yet and will take years. I think AMD is focusing on the CPU side and wants to get the next-gen GPU chip right, so they use what they have now.
- Price-wise, you are getting a workstation-grade graphics card for consumer money; 16GB of HBM2 is not cheap. If you want 4K gaming, I think the Radeon has the edge. If you want RT bits in your games and are happy with 1080p, then the 2080 is the way to go.
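For what it's worth, the "double the memory bandwidth" claim checks out on a back-of-envelope calculation. A quick sketch, assuming the publicly reported launch specs (bus widths and per-pin data rates are my assumption, not figures from this thread):

```python
# Peak memory bandwidth: bus width (bits) x data rate per pin (Gbps) / 8
# gives GB/s. Specs below are the commonly reported launch figures.

def mem_bandwidth_gbs(bus_width_bits, data_rate_gbps):
    """Peak theoretical memory bandwidth in GB/s."""
    return bus_width_bits * data_rate_gbps / 8

# Vega 64: 2048-bit HBM2 at ~1.89 Gbps per pin
vega64 = mem_bandwidth_gbs(2048, 1.89)

# Radeon VII: 4096-bit HBM2 at 2.0 Gbps per pin
radeon7 = mem_bandwidth_gbs(4096, 2.0)

print(f"Vega 64:    {vega64:.0f} GB/s")   # ~484 GB/s
print(f"Radeon VII: {radeon7:.0f} GB/s")  # 1024 GB/s
print(f"Ratio:      {radeon7 / vega64:.2f}x")
```

So roughly 484 GB/s going to 1 TB/s, a bit over 2x, from doubling both the stack count and the bus width.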
The issue with that logic, though, is which developer in their right mind is going to make 16GB of HBM a necessity in the near future, when games don't even quite need 8GB yet and the most popular cards are still kitted out with 6GB? By the time a 16GB HBM card has any real benefit in the real world, this card's core clocks will be too slow.

It is great for professional work, and it looks good on paper for marketing, but the reality is this is looking like a 2080 without the benefit of Tensor and RT cores for the same price. DLSS is going to make even the 2060 a viable ray-tracing card at 1080p, so a 2080 and above should happily run 4K screens with DXR and DLSS in the near future. There is also the option of skipping DXR and still using DLSS, which would bring performance beyond what this card can do anyway. Throw a good FreeSync monitor into the mix and AMD are DOA with this card for consumers, IMHO.
RX Vega 56/64 are bottlenecked by a shortage of memory bandwidth as it is. AMD has no choice but to use HBM2: going back to GDDR would cripple the performance of the Vega GPU. Look at the benchmarks and you'll see overclocking the memory gives a significant boost to RX Vega.
Using GDDR with Vega 7 would be impractical. You would need an extremely wide memory bus to feed enough bandwidth to the GPU, and that bus would be really power hungry. HBM2 solves this issue for AMD... at a cost. With Vega 7 doubling both the amount of memory and the memory bus width, we will finally see a Vega GPU reach its full potential (even if that proves a little underwhelming).
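To put a number on the "extremely wide memory bus" point, here is a rough sketch of what a GDDR6 bus would need to match Radeon VII's HBM2 bandwidth. The 14 Gbps per-pin rate is my assumption (typical of launch-era GDDR6), not a figure from this thread:

```python
# How wide would a GDDR6 bus need to be to match ~1 TB/s of HBM2 bandwidth?
# bandwidth (GB/s) = bus_bits * gbps_per_pin / 8, so bus_bits = GB/s * 8 / gbps

TARGET_GBS = 1024          # Radeon VII peak memory bandwidth, GB/s
GDDR6_GBPS_PER_PIN = 14    # assumed per-pin data rate for launch-era GDDR6

bus_bits_needed = TARGET_GBS * 8 / GDDR6_GBPS_PER_PIN
print(f"GDDR6 bus needed: ~{bus_bits_needed:.0f} bits")  # ~585 bits
```

That is wider than any consumer GDDR bus ever shipped (the 2080 Ti's is 352-bit, the widest common designs are 384-bit), which is the crux of the argument: matching this bandwidth with GDDR isn't a realistic board design.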
AMD has been unlucky to date. They backed HBM2, only to find the expected price skyrocketed because of a general capacity shortage across memory of all kinds, and because only one HBM2 manufacturer (Samsung) was operating at mass production. With Hynix up to speed now, hopefully that will drive the price down and make Vega 7 more profitable.
His point makes perfect sense to me. You can benefit from having more than 8GB of VRAM today in many games when playing at 4K. The GTX 1080 Ti at 4K has more consistent frame times than the RTX 2080, simply because the GTX 1080 Ti has 3GB more VRAM.
Ray tracing is the future, no doubt, but it does little for today's games. It's just not worth the artifacts or the performance hit. DLSS is still to be released and tested; it's a complete unknown.
There are 100+ games with high-quality 4K texture packs or mods available, compared with one DXR title and zero DLSS titles.
To me, that 16GB of memory would appeal to many 4K gamers, as darcotech suggested.
I guess we will have to agree to disagree on this one. I've no idea what you mean by artefacts, as that only appears to happen on failing cards (I have not seen any whatsoever on my 2080, though it does have a custom PCB), and those 100+ games work perfectly fine in 4K on a 2080 too, and will run even better with DLSS.
There are two games with DXR at the moment (albeit one in a different market), and FF already uses DLSS. BFV is looking to get DLSS very soon too, PLUS Resident Evil and Metro are coming within the next month, so your numbers are a bit out, and that is without including benchmark apps such as Port Royal and the Asteroids demo. I am not poking at you here, though, so please try not to get offended; I just don't want others to read this and be misinformed.