Pictures of the Nvidia Founders Edition and two Zotac SKUs have been shared on the web.
Maybe it's the season, maybe it's the sleep deprivation, maybe it's the opiates... I do like this card and I think it'll do well, deservedly so. It just needs a bit of a price drop, and that'll probably happen when 1070 stock depletes sufficiently. This is the kind of generational leap (from 1060 to 2060) in performance that I want to see across the range.

I suspect these are optimistic benchmarks, and unrealistic given most people won't be daft enough to spend 1K on the CPU and then pair it with a mid-range GPU, but at least the testing appears consistent across the board, making the comparison useful even if the measurements in isolation aren't. I really think Nvidia should have been more sensible with their releases: either launch the whole range at once, or at least release the 2080 Ti alongside the 2060 to show that, while yes, there is a stupid price premium on the top-end stuff, they are still releasing perfectly serviceable mid-range cards with a good improvement in performance over the last generation, and you don't have to be a millionaire to game on PC.
What's missing is the quality settings they used. I am DEEPLY sceptical that those "RT on" figures are with high quality settings. At that point it becomes a question of which you sacrifice: texture quality or lighting quality? Could make for an interesting analysis.
The big problem, from what we read elsewhere, is that there will be a stupid number of 2060 models, so consumers will think they're getting the performance shown in the table above but may get substantially less (based on previous Nvidia outings). It will be fairly hard for consumers to know unless they're really clued in. I'm pretty certain all the reviews will be of the "top of the pile" cards!
(I think another Hexus article said one manufacturer had 26 different 2060 models!)
Well now, this will be interesting, especially when it comes to the laptop range. It even performs better than the 1070 and comes close to a 1070 Ti, which is great. Now all we need are proper benchmarks, and to find out whether it's still worth the hype that some are still hoping for on a budget.
My concern is pricing, which remains a bugbear with this range. This may be a leap forward from the GTX 1060, with performance matching the 1070 Ti; however, if prices also match the 1070 Ti then this is another fail. Ignore the model number and consider performance per cost, in which case it does not appear to me to be a step forward.
You're right, I think. It's going to be like the STUPID and seriously annoying 1060 branding, which could relate to any number of different GPUs, and then different RAM configurations on top of that. It's almost as bad as Samsung giving Exynos processors to a large part of the globe and Snapdragon ones to the rest (the Exynos ones are really dire compared to the SD ones these days, although I think optimisations mean the phones don't suffer too badly), but they give all the reviewers the SD ones and call them all the same model, even though the very core of these things is different, with markedly differing performance.
You go and look at a review of the 1060, think it's for you, and buy one off the shelf, but you get a different card to the one you've read the review of, and the benchmarks don't relate to what you're buying. Changing the quantity of memory is fine, because you can specify that on the box, but changing the core of the thing is just misleading and wrong.
Goes toe to toe with my 1070. Too bad it only has 6GB. Might be worth buying with 8GB for the right price.
That would only make sense if the pricing were in any way comparable. If, as reported, this is a 2060-labelled product at a 1070 Ti price point (you can buy a 6GB 1060 for £220, and that already seems steep compared to previous cards), then Nvidia are going to be generating a lot of anger. Again.
Best get ready for lots of anger then.
No SLI support makes it a no-no. Whilst it'll be OK for a while, it'll soon be too slow and need beefing up. Adding a second card down the line is a good way of doing this, but now that only the 2080 has support for SLI, this route has been closed off on the Nvidia side.
If they can pull off 65fps in a game that is NOT made for RTX features but rather patched after the fact, then I think this is a great 1080p card for the future (1080p is still 65% of the market), and probably fine for 1440p if you have no interest in this new stuff. Games of the future that are MADE for this tech should do far better than 65fps, unless the coders made an app that just runs like a pig, and we've definitely seen a few of those over the years.

Of course, if you're into online gaming (rooms full of lots of people), I'd aim for far higher fps at a given res. I.e. if I were gaming online at 1080p, I'd want 100fps or better in EVERYTHING, as a room full of people can bring your GPU to the ground easily (maybe not in games like Minecraft... LOL, I'm talking shooters with crap flying everywhere, marked walls etc.). Then again, you can always turn things down, I guess, but that's against my GPU religion. My bible says something like "ye who turns down settings in a way that is NOT as the dev intended will be struck down forthwith"... LOL... Something like that, I'm sure.
If you're price conscious, just wait for 7nm AMD and you'll probably get ray tracing at a discount, unless they really suck (not really possible at 7nm, IMHO, even for AMD). If you're not, it's a good deal vs. the 1070 Ti with the new features, not to mention beating my card's power draw by 30W is a lot (1070 Ti here), and it's a nice perf jump over the 1060 for sure. Problem with the reviews? Get the card you READ the review on. Surely you can read part/model numbers?
Core changes are fine if they come with SE (er, sh1tty edition? LOL), LE (luxury ed?) etc. But I'd have a problem with them ALL being called 2060 with no way to tell them apart. I doubt they'd do that; it's bad for business now that people have tons of outlets to whine on. All 4GB of memory was usable on Nvidia's part before; look how many people freaked out over the 3.5GB + 0.5GB (slower) split, even though it all still functioned, but fake news everywhere, and people still repeat it today. Better off just not going down the road of feeding the media/idiots who can't seem to read specs, or at least putting out the WHOLE info up front so it is what it is. Myself, 7nm or DIE. That said, I hope they make a ton on these; it just means more R&D I'll love later. Without someone making money in the industry (cough, AMD, cough), we wouldn't get new features for a lot longer (ray tracing etc.).
I hope Intel does better this time around, but based on the past I have little faith, having seen all their GPU failures before (even recently). I think they have the hardware people, but I have zero faith in their drivers being released for all the big titles etc. like Nvidia's (and to some extent AMD's; hey, my AMD card didn't get a driver for well over a year, so... it still stings). They will have to prove it for a generation before I'd even ponder a discrete Intel card. I sold the i740 back in the day and had to make excuses for the drivers a LOT. It's not like you can tell your customers, "I told you not to buy that crap". In the end I did start putting "not recommended" on the invoice... LOL. I'd hoped that telling them I was going to do that, because they'd be calling me about it, would stop them (it only changed 2 minds... Intel's reputation won almost always... ROFL). If nothing else, it killed their attempts to call and complain, since it's right there on the invoice, not my doing.
Don't forget, people, the 1070 Ti went for over $500 (I paid $509), and I wouldn't try to put it in an HTPC unless it was a pretty big one. I would still wait for 7nm versions. What would that be for the same card, 110-120W? It could probably be cranked a bit and still stay at 120W. I really hope they just do a PURE shrink of all these for quick 7nm parts with not much design time, crank them up a bit, and stop when the watts start to rise (meaning max them out at the same watts as the 12nm parts). That's all it will take to blow past AMD's 7nm if AMD are really only aiming at GTX 1080/2070 fps (which sucks; too small a die, IMHO, to be fast, but AMD are probably keeping yield hopes high), and then price the 12nm parts down against AMD's stuff. No point in releasing big-chip tech until prices on 7nm fab work come down. I'd like to see us start coming down from 250-300W cards, but no matter, as I buy at the wattage I want anyway (or slightly above) and just turn it down 5-10% to meet the heat I want. Just knocking 5-10% off the GPU clock usually takes your watts/heat down massively; same for the CPU if needed (you can always turn it back up for games that require it).
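To put a rough number on that last point, here is a minimal back-of-the-envelope sketch, assuming dynamic power scales roughly with frequency x voltage^2 and that a small downclock allows a matching undervolt; the 180W baseline is a placeholder, not a measured spec for any of these cards.

# Rough estimate of power savings from a small GPU downclock (Python).
# Assumes dynamic power ~ frequency * voltage^2 and that voltage can be
# lowered roughly in step with frequency (a simplification, not a spec).

def estimated_power(base_watts, clock_scale, voltage_scale):
    """Scale a baseline board power by frequency and voltage factors."""
    return base_watts * clock_scale * voltage_scale ** 2

base = 180.0  # placeholder board power in watts
for drop in (0.05, 0.10):
    scale = 1.0 - drop
    est = estimated_power(base, scale, scale)
    saving = 100 * (1 - est / base)
    print(f"{int(drop * 100)}% downclock: ~{est:.0f} W ({saving:.0f}% less)")

On those assumptions, a 5% downclock lands around 14% less power and a 10% downclock around 27% less, which is why a small clock reduction has such an outsized effect on heat.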
Happy New Year everyone...
UK prices now equate almost directly to dollar prices, due to the Brexit farrago driving down the GBP.
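As a quick illustration of why the numbers end up looking one-to-one: US list prices exclude sales tax while UK prices include VAT, so a weaker pound closes the gap. The $349 list price and the exchange rate below are illustrative assumptions, not confirmed launch pricing.

# Illustrative only (Python): rough GBP price from a US ex-tax price.
usd_list_price = 349.0   # hypothetical US list price, ex-tax
uk_vat = 0.20            # standard UK VAT rate
usd_per_gbp = 1.27       # assumed exchange rate, roughly early 2019

gbp_price = usd_list_price * (1 + uk_vat) / usd_per_gbp
print(f"${usd_list_price:.0f} ex-tax -> about £{gbp_price:.0f} inc. VAT")
# -> about £330, close enough to the dollar figure that prices look 1:1.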
Other Guy: Samsung Exynos, Qualcomm Snapdragon.
Hope my recently bought GTX 1060 6GB will still be strong for the next 5 years.