Should be perfect for rendering.
3090 results. This site did have 3080 numbers which were accurate too.
https://videocardz.com/newz/nvidia-g...view-leaks-out
Kalniel: "Nice review Tarinder - would it be possible to get a picture of the case when the components are installed (with the side off obviously)?"
CAT-THE-FIFTH: "The Antec 300 is a case which has an understated and clean appearance which many people like. Not everyone is into e-peen looking computers which look like a cross between the imagination of a hyperactive 10 year old and a Frog."
TKPeters: "Off to AVForum better Deal - £20+Vat for Free Shipping @ Scan"
For all intents and purposes it seems to be the same card, minus some guy's name on it and with a shielded cover, with OEM added to it - GoNz0.
Not the killer we were expecting, if true. Is this entire Ampere launch just to get new cards out first, for those that have to have the newest, fastest thing, rather than it really being ready to go? If AMD kick this one off right, with decent stock levels, they could really upset Nvidia. Starting to be really glad I couldn't get a 3080 now...
So Big Navi will have 16GB? If it were an 8GB card there'd be no need for Nvidia to up capacity.
I should imagine they'll launch with 8GB and 16GB variants.
If those 3090 results are anywhere close to true (which I don't doubt to be honest), there's no way that's worth ~£750 over an MSRP 3080.
If the stock 3080 is on Samsung, do we think the Ti's will be on TSMC? That might increase performance a smidge and not need as much power. Wondering if the call to go to Samsung was simply because there's no capacity at TSMC and Samsung offered Nvidia a decent deal on GPU/RAM...
The more I see of the 3080 and rumours of the 3090, the more I wonder if the 3080 isn't really ready, and needs the extra power to get performance it might not need on TSMC...
That really depends on what you want the card for. The premium probably comes not only from having the full amount of CUDA cores, but also the extra memory to go with them. We have no idea what the yields on the 8nm Samsung node are like either; that may well be a factor for the full-fat Ampere chip.
Depends on what you use the card for....even Nvidia said the 3090 was aimed at 'content creators', and they do have a different set of requirements to gamers (I know I do as a 3D designer). In all honesty, £750 for someone in my field is 'peanuts' in the grand scheme of things.
Having said that, I'm pretty sure the primary focus with Ampere was CUDA performance, along with interaction with SSDs for loading etc. I'd even go so far as to say that current games might not be able to fully utilise Ampere yet. Nvidia have openly said that AI and GPU compute are important to them...not to mention far more profitable than gamers.
Mind you, I don't personally see 8-10% at 4K as anything to sniff at, seeing how 4K can make even high-end PCs struggle.
Definitely interested in one of those 16GB 3070 cards, just hoping that the pricing remains sensible. No idea what strategy Nvidia is playing at though; hopefully they're out before the new year, as I'll be very keen to upgrade.
The extra VRAM really isn't needed for gaming - there might be the odd title at 4K that takes you over 10GB, but anything more is just wasted allocation imo.
Maybe next architectural update, we'll truly need it by then!
As some others have said, I'm not too sure why people are so fussed over the VRAM; usage is barely hitting 8GB for most games atm and I don't see this changing massively.