Re: Nvidia GeForce RTX 2080 Ti and RTX 2080
Out of curiosity, and because I don't even know where to start with the maths, I was wondering roughly how much more performance in games would have been gained if they hadn't segmented Turing?
That is, if all the die space they dedicated to INT32, RT 'cores', and Tensor 'cores' had instead been filled with more of the old-style mixed-precision CUDA 'cores'.
Re: Nvidia GeForce RTX 2080 Ti and RTX 2080
Well my RTX2080 is now out for delivery, will be here today :)
If you are hating on it, enjoy your anger and frustration while I enjoy a nice upgrade over my existing GTX1080. I sold my 1080 for £350, so a net upgrade cost of £400 for a 40-50% boost (before DLSS is taken into account) is awesome.
edit: just arrived in fact! Now I face a long 6 hours left at work! Glad I came in early today :)
Re: Nvidia GeForce RTX 2080 Ti and RTX 2080
The amount of criticism on Hexus is utterly tame compared to OcUK, and even the Nvidia subreddit seems to have criticism too(!).
I started up the RTX series review thread on OcUK and it hit 10k views yesterday alone, and most of the posts were criticisms, with one owner trying to fight loads of posters too. TBH, this has been a weird launch, and I've not seen such general negativity for a while. There was some moaning about Pascal pricing, but most of that was about the lack of proper stock for months after it sold out.
Edit!!
I can see why - first people expected Vega would come in and help drop prices but it was a flop so that didn't work out.
Then mining came and prices jumped high.
Then mining went down temporarily and prices dropped to launch levels.
Then people expected the new gen would probably drop old gen prices a bit like what happened with Pascal.
Except the new gen is so high priced relative to its launch performance that Pascal does not need to drop much.
If you look at the deals section at the end of last year, there were deals on, say, a GTX1080 as good as or even better than now.
Then RAM pricing has only dropped slightly, so perhaps it's just a general sense of frustration, methinks.
Re: Nvidia GeForce RTX 2080 Ti and RTX 2080
OcUKs forum being saltier than Hexus? I'm shocked :) :) :)
I get that people don't like the price increases, but we already know that Nvidia are not making much per sale on these based on the build cost (and that's before you think about the R&D costs!). I always find it odd that people expect major companies to sell their products at a loss or break-even point.
You are probably right that it's just general frustration about the fact that these new cards are more expensive than the last generation. That's made worse by people trying to make direct comparisons, which is confused by Nvidia's branding.
These are high-end cards with a high-end price tag - not really "mainstream" so to speak, but that's OK imo - the mainstream cards will come later at much cheaper prices.
Re: Nvidia GeForce RTX 2080 Ti and RTX 2080
Quote:
Originally Posted by
Spud1
I get that people don't like the price increases, but we already know that Nvidia are not making much per sale on these based on the build cost (and that's before you think about the R&D costs!).
I thought margins had been continually increasing?
Re: Nvidia GeForce RTX 2080 Ti and RTX 2080
Quote:
Originally Posted by
kalniel
I thought margins had been continually increasing?
I would expect that, as a company, it would - particularly as margins on the GTX ranges will have improved, and Nvidia still sell a tonne of older cards at huge margins - but I've not seen anything to say that the RTX range has an improved margin. They are at something like 30% net atm iirc (for the company as a whole, not on one particular range!).
We'll find out more when we get more teardowns and chip analysis of these individual cards.
Re: Nvidia GeForce RTX 2080 Ti and RTX 2080
Quote:
Originally Posted by
Spud1
I get that people don't like the price increases, but we already know that Nvidia are not making much per sale on these based on the build cost (and that's before you think about the R&D costs!). I always find it odd that people expect major companies to sell their products at a loss or break-even point.
Nvidia's gross margins are now higher than Intel's:
https://ycharts.com/companies/NVDA/gross_profit_margin
https://ycharts.com/companies/INTC/gross_profit_margin
Nvidia's net margins are higher than Intel's:
https://ycharts.com/companies/NVDA/profit_margin
https://ycharts.com/companies/INTC/profit_margin
Enthusiasts on tech forums have for years defended Nvidia's higher prices at each generation. Nvidia's net margins used to be between 10% and 20%, but are close to 40% now.
Intel's net margins used to be 15% to 20%, but are now 20% to 30%, so apparently Intel has more "reasonable" prices relative to production and R&D costs! :p
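For anyone mixing up the two sets of ycharts links: gross margin and net margin measure different things. A quick sketch of how each is calculated, using entirely made-up figures (not Nvidia's or Intel's actual financials):

```python
# Illustrative only: how gross vs net margin are calculated.
# These figures are invented, not real Nvidia or Intel numbers.
revenue = 3000.0        # $m revenue for a period
cost_of_goods = 1100.0  # $m cost of producing the chips/cards
net_income = 1100.0     # $m profit left after R&D, operating costs, tax

gross_margin = (revenue - cost_of_goods) / revenue  # the first pair of links
net_margin = net_income / revenue                   # the second pair of links

print(f"gross margin: {gross_margin:.1%}")  # 63.3%
print(f"net margin:   {net_margin:.1%}")    # 36.7%
```

The point being that net margin already has R&D and operating costs subtracted, which is why it is the more relevant figure when arguing about whether price rises are covering costs.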
R&D costs might be a consideration, except for one thing - it appears the professional/consumer line split which happened at Maxwell, where Nvidia developed two different lines, i.e. one focused on FP32 workloads (gaming cards) and one on non-FP32 workloads (commercial), has ended. Now they are starting to go back to the old way of having one single line of GPUs. This alone will help reduce R&D costs, and also the cost of chip tape-outs.
This is what one or two people said here before (Corky, I believe, was one of them): Nvidia has found a way to shoehorn commercial features into games. Hence they will progressively drop the FP32-based focus which all recent gaming cards to this present day have had.
Edit!!
Also, that rumour about production costs?
Are you regurgitating Wccftech?? It came from there and was something they made up!! :p
Re: Nvidia GeForce RTX 2080 Ti and RTX 2080
Quote:
Originally Posted by
Spud1
....but we already know that Nvidia are not making much per sale on these based on the build cost (thats before you think about the R&D costs!).
We do? I didn't know Nvidia have said what the build costs are; I thought it was speculation that they're costing a lot to build, speculation I personally disagree with, as the RTX cards are just down-binned Voltas that would have ended up in landfill if they hadn't worked out a way to make the unique features of Volta relevant to 'gamers'.
Re: Nvidia GeForce RTX 2080 Ti and RTX 2080
Quote:
Originally Posted by
Corky34
We do? I didn't know Nvidia have said what the build costs are; I thought it was speculation that they're costing a lot to build, speculation I personally disagree with, as the RTX cards are just down-binned Voltas that would have ended up in landfill if they hadn't worked out a way to make the unique features of Volta relevant to 'gamers'.
From Wccftech:
https://wccftech.com/nvidias-next-ge...ature-details/
No link to sources, so it was probably speculation by them. Remember, it was said Pascal cost a lot of money too, due to the node shrink and expensive GDDR5X, etc., but Nvidia's margins grew all the same.
Edit!!
Quote:
Originally Posted by
Spud1
These are high-end cards with a high-end price tag - not really "mainstream" so to speak, but that's OK imo - the mainstream cards will come later at much cheaper prices.
That is the problem there. Normally a higher-end product does not bother me, but with graphics cards it sets pricing at the lower end.
The RTX2070 will come in at nearly £600 - so unless Nvidia suddenly has a £350 gap down to the GTX2060, it's going to be one of two things:
1.) The GTX2060 moves up to closer to £400
2.) They split the 60 series line, so a GTX2060TI, GTX2060, GTX2060SE, etc.
If the current pricing tier holds, the 60 series will eventually be shifted to the £400 mark. The $250 to $400 mark has been where the 70 series has existed for generations.
So all the websites will nicely compare 60 series to 60 series, saying it's a great performance bump, but for most mainstream purchasers, who are more budget-locked, that £200-card-to-£200-card upgrade might not look as hot anymore, as the range has been spread out.
So either stump up the extra cash for a decent upgrade, or wait longer.
Re: Nvidia GeForce RTX 2080 Ti and RTX 2080
I put the price increase down to the massive stocks of 10 series cards, not to mention them still being very viable. In generations past, the previous cards tended to be struggling by the time the new ones came out. The 10 series is still extremely potent, and for all this talk of "finally a true 4K60 card", the 1080Ti is absolutely fine for 4K.
They have no reason to lower the prices because the old cards are still very much worth their money, so they're treating them as current; instead of a decent performance boost for a modest price increase, as is tradition, it's more like an even higher powered 10 series card for even more money.
That and I definitely feel like we're paying extra for tensor and RTX just so they can get it out the door and work on it in future releases, to get us used to it now.
Re: Nvidia GeForce RTX 2080 Ti and RTX 2080
So it looks like 1440p is now finally going to be playable (ie 120fps+) for most titles. A while to wait for 4K it seems.
:)
But at this price? No chance.
Re: Nvidia GeForce RTX 2080 Ti and RTX 2080
Quote:
Originally Posted by
CAT-THE-FIFTH
This is what one or two people said here before (Corky, I believe, was one of them): Nvidia has found a way to shoehorn commercial features into games. Hence they will progressively drop the FP32-based focus which all recent gaming cards to this present day have had.
But the 2080 cards still have an fp32 focus. Volta could do fp64 at half the rate it could do fp32, like you expect from a commercial focused card. The 2080 can do fp64 at 1/32 of the fp32 rate, like you expect from a consumer card.
Whilst I'm sure the tensor cores will be what they developed for commercial users, their inclusion does not make it a commercial card. It just means Nvidia think the feature is worth the sacrifice in silicon area over putting in more shaders. Then there is the ray tracing support; is there any support for that in commercial render engines? Something that gives an iffy-quality lighting system good enough for action games isn't likely to impress the likes of Pixar in rendering their latest movie, where every pixel should be spot on.
So AFAICS this is a consumer part, probably a Volta with the FP64 stripped out and some raytrace tech added, giving a slightly smaller die than GV100. Given Volta and Turing are both 12nm products, I wonder if Nvidia have done exactly the same commercial/consumer split as before and just staggered the release.
As an aside, there was a die shot of a Turing compute unit that implied a quarter of the area was for tensor cores and a quarter for RT, so they could have had twice the shaders if they cut those out and scaled up the number of SMs to fill the space. I have to wonder what that would do to things like the anti-aliasing performance if it could generate sample spots at twice the throughput.
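To put rough numbers on both points - the TFLOPS figures below are approximate public peak specs, and the area split is just the die-shot estimate above, so treat this as strictly back-of-envelope:

```python
# Back-of-envelope on the FP64 ratios and the die-area trade-off.
# TFLOPS figures are approximate public peak numbers, not measured.
gv100_fp32 = 15.7     # TFLOPS, Volta GV100 peak FP32 (approx)
rtx2080_fp32 = 10.1   # TFLOPS, RTX 2080 peak FP32 (approx)

gv100_fp64 = gv100_fp32 / 2       # 1:2 rate, commercial-style
rtx2080_fp64 = rtx2080_fp32 / 32  # 1:32 rate, consumer-style
print(f"GV100 FP64:    ~{gv100_fp64:.2f} TFLOPS")    # ~7.85
print(f"RTX 2080 FP64: ~{rtx2080_fp64:.2f} TFLOPS")  # ~0.32

# If the ~25% tensor + ~25% RT area per SM went to shaders instead,
# the same silicon could hold roughly double the shader count.
shader_fraction = 1.0 - 0.25 - 0.25  # area currently spent on shaders
scaling = 1.0 / shader_fraction      # hypothetical shader multiplier
print(f"hypothetical shader scaling: {scaling:.1f}x")  # 2.0x
```

That ~25x gap in FP64 rate is why the 2080 looks like a consumer part despite the commercial-flavoured extras, and the 2x figure is only an area argument - real scaling would also depend on power and memory bandwidth.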
Re: Nvidia GeForce RTX 2080 Ti and RTX 2080
Quote:
Originally Posted by
DanceswithUnix
But the 2080 cards still have an fp32 focus. Volta could do fp64 at half the rate it could do fp32, like you expect from a commercial focused card. The 2080 can do fp64 at 1/32 of the fp32 rate, like you expect from a consumer card.
Whilst I'm sure the tensor cores will be what they developed for commercial users, their inclusion does not make it a commercial card. It just means Nvidia think the feature is worth the sacrifice in silicon area over putting in more shaders. Then there is the ray tracing support; is there any support for that in commercial render engines? Something that gives an iffy-quality lighting system good enough for action games isn't likely to impress the likes of Pixar in rendering their latest movie, where every pixel should be spot on.
So AFAICS this is a consumer part, probably a Volta with the FP64 stripped out and some raytrace tech added, giving a slightly smaller die than GV100. Given Volta and Turing are both 12nm products, I wonder if Nvidia have done exactly the same commercial/consumer split as before and just staggered the release.
As an aside, there was a die shot of a Turing compute unit that implied a quarter of the area was for tensor cores and a quarter for RT, so they could have had twice the shaders if they cut those out and scaled up the number of SMs to fill the space. I have to wonder what that would do to things like the anti-aliasing performance if it could generate sample spots at twice the throughput.
Because you are thinking of old-skool commercial usage tho - the commercial AI and RT stuff Nvidia does is also very dependent on things outside FP64. The first cards Nvidia talked about were using Turing for commercial usage, not gaming, and the top bins are commercial cards.
The current large chips also make much more sense for commercial use scenarios than gaming, and it means only one line needs to be developed, and that gamers get the rejected bins, which can run at higher TDPs.
If Nvidia had developed Turing with gaming in focus, not having all that die area spent on tensor cores and AI stuff would have meant loads more normal shaders, and a much bigger performance bump for normal games.
The fact they have managed to shoehorn usage of more commercially-oriented features into games is the genius move, methinks, as they can now re-use lower-bin GPUs in their gaming lines.
Expect a move away from an FP32 focus for their gaming cards, as their commercial usage areas are not so reliant on it anymore.
Re: Nvidia GeForce RTX 2080 Ti and RTX 2080
Quote:
Originally Posted by
Roobubba
So it looks like 1440p is now finally going to be playable (ie 120fps+) for most titles. A while to wait for 4K it seems.
:)
But at this price? No chance.
What the hell are you playing that requires 120+ FPS to be playable?
Re: Nvidia GeForce RTX 2080 Ti and RTX 2080
Quote:
Originally Posted by
CAT-THE-FIFTH
Because you are thinking of old-skool commercial usage tho - the AI and RT stuff Nvidia does is also very dependent on other stuff. The first cards Nvidia talked about were using Turing for commercial usage, not gaming, and the top bins are commercial cards.
AI tensor stuff is everywhere; it is already in mass-market phones. Frankly, games seem to be lagging here. But for professional use there are dedicated tensor processors, which spells the end of using a GPU for those tasks. So that isn't a professional use.
I don't get the ray-tracing. I'm happy for someone to convince me that there are professionals who will lap that up, but I just can't see an example. Feel free to point me at software support that is relevant to professional users.
Now I did Google for OpenCL performance for the 2080 and found one example, the Luxmark Luxball HDR, which the 2080Ti monsters most impressively. That's nice for the people with that workflow, but they would have been well served by a 1080Ti as well, so once again I don't see that as indicating this is a professional chip.
https://www.engadget.com/2018/09/19/...080-ti-review/