Page 3 of 6
Results 33 to 48 of 85

Thread: Nvidia GeForce RTX 2080 Ti and RTX 2080

  1. #33
    Senior Member
    Join Date
    Dec 2013
    Posts
    3,526
    Thanks
    504
    Thanked
    468 times in 326 posts

    Re: Nvidia GeForce RTX 2080 Ti and RTX 2080

    Out of curiosity, and because I don't even know where to start with the maths, I was wondering roughly how much more performance in games would have been gained if they hadn't segmented Turing?
    If all the die space they dedicated to INT32, RT 'cores', and Tensor 'cores' had been replaced with a load more of the old-style mixed-precision CUDA 'cores'.

  2. #34
    Theoretical Element Spud1's Avatar
    Join Date
    Jul 2003
    Location
    North West
    Posts
    7,508
    Thanks
    336
    Thanked
    320 times in 255 posts
    • Spud1's system
      • Motherboard:
      • Gigabyte Aorus Master
      • CPU:
      • 9900k
      • Memory:
      • 16GB GSkill Trident Z
      • Storage:
      • Lots.
      • Graphics card(s):
      • RTX3090
      • PSU:
      • 750w
      • Case:
      • BeQuiet Dark Base Pro rev.2
      • Operating System:
      • Windows 10
      • Monitor(s):
      • Asus PG35VQ
      • Internet:
      • 910/100mb Fibre

    Re: Nvidia GeForce RTX 2080 Ti and RTX 2080

    Well, my RTX2080 is now out for delivery and will be here today.

    If you are hating on it, enjoy your anger and frustration while I enjoy a nice upgrade over my existing GTX1080. Sold my 1080 for £350, so a net upgrade cost of £400 for a 40-50% boost (before DLSS is taken into account) is awesome.

    edit: just arrived in fact! Now I face a long 6 hours left at work! Glad I came in early today.

  3. Received thanks from:

    kalniel (20-09-2018)

  4. #35
    Moosing about! CAT-THE-FIFTH's Avatar
    Join Date
    Aug 2006
    Location
    Not here
    Posts
    32,039
    Thanks
    3,910
    Thanked
    5,224 times in 4,015 posts
    • CAT-THE-FIFTH's system
      • Motherboard:
      • Less E-PEEN
      • CPU:
      • Massive E-PEEN
      • Memory:
      • RGB E-PEEN
      • Storage:
      • Not in any order
      • Graphics card(s):
      • EVEN BIGGER E-PEEN
      • PSU:
      • OVERSIZED
      • Case:
      • UNDERSIZED
      • Operating System:
      • DOS 6.22
      • Monitor(s):
      • NOT USUALLY ON....WHEN I POST
      • Internet:
      • FUNCTIONAL

    Re: Nvidia GeForce RTX 2080 Ti and RTX 2080

    The amount of criticism on Hexus is utterly tame compared to OcUK, and even the Nvidia Reddit seems to have criticism too(!).

    I started the RTX series review thread on OcUK and it hit 10k views yesterday alone, and most of the posts were criticisms, with one owner trying to fight loads of posters too. TBH, this has not only been a weird launch, but I've not seen such general negativity for a while. There was some moaning about Pascal pricing, but most of it was about the lack of proper stock for months after it sold out.

    Edit!!

    I can see why: first, people expected Vega would come in and help drop prices, but it was a flop, so that didn't work out.

    Then mining came and prices jumped high.

    Then mining went down temporarily and prices dropped to launch levels.

    Then people expected the new gen would drop old-gen prices a bit, like what happened with Pascal.

    Except the new gen is so highly priced relative to its launch performance that Pascal does not need to drop much.

    If you look at the deals section, at the end of last year there were as good or even better deals on, say, a GTX1080 than there are now.

    Then RAM pricing has only dropped slightly, so perhaps it's just a general sense of frustration, methinks.
    Last edited by CAT-THE-FIFTH; 20-09-2018 at 09:36 AM.

  5. #36
    Theoretical Element Spud1's Avatar
    Join Date
    Jul 2003
    Location
    North West
    Posts
    7,508
    Thanks
    336
    Thanked
    320 times in 255 posts

    Re: Nvidia GeForce RTX 2080 Ti and RTX 2080

    OcUK's forum being saltier than Hexus? I'm shocked.

    I get that people don't like the price increases, but we already know that Nvidia are not making much per sale on these based on the build cost (that's before you think about the R&D costs!). I always find it odd that people expect major companies to sell their products at a loss or break-even point.

    You are probably right that it's just general frustration about the fact that these new cards are more expensive than the last generation. That's made worse by people trying to make direct comparisons, which is confused by Nvidia's branding.

    These are high-end cards with a high-end price tag - not really "mainstream" so to speak, but that's OK imo - the mainstream cards will come later at much cheaper prices.

  6. #37
    Banhammer in peace PeterB kalniel's Avatar
    Join Date
    Aug 2005
    Posts
    31,025
    Thanks
    1,871
    Thanked
    3,383 times in 2,720 posts
    • kalniel's system
      • Motherboard:
      • Gigabyte Z390 Aorus Ultra
      • CPU:
      • Intel i9 9900k
      • Memory:
      • 32GB DDR4 3200 CL16
      • Storage:
      • 1TB Samsung 970Evo+ NVMe
      • Graphics card(s):
      • nVidia GTX 1060 6GB
      • PSU:
      • Seasonic 600W
      • Case:
      • Cooler Master HAF 912
      • Operating System:
      • Win 10 Pro x64
      • Monitor(s):
      • Dell S2721DGF
      • Internet:
      • rubbish

    Re: Nvidia GeForce RTX 2080 Ti and RTX 2080

    Quote Originally Posted by Spud1 View Post
    I get that people don't like the price increases, but we already know that Nvidia are not making much per sale on these based on the build cost (that's before you think about the R&D costs!).
    I thought margins had been continually increasing?

  7. #38
    Theoretical Element Spud1's Avatar
    Join Date
    Jul 2003
    Location
    North West
    Posts
    7,508
    Thanks
    336
    Thanked
    320 times in 255 posts

    Re: Nvidia GeForce RTX 2080 Ti and RTX 2080

    Quote Originally Posted by kalniel View Post
    I thought margins had been continually increasing?
    I would expect that, as a company, they would - particularly as margins on the GTX ranges will have improved, and Nvidia still sell a tonne of older cards at huge margins - but I've not seen anything to say that the RTX range has an improved margin. They are at something like 30% (for the company as a whole, not on one particular range!) net atm, iirc.

    We'll find out more when we get more teardowns and chip analysis of these individual cards.

  8. #39
    Moosing about! CAT-THE-FIFTH's Avatar
    Join Date
    Aug 2006
    Location
    Not here
    Posts
    32,039
    Thanks
    3,910
    Thanked
    5,224 times in 4,015 posts

    Re: Nvidia GeForce RTX 2080 Ti and RTX 2080

    Quote Originally Posted by Spud1 View Post
    I get that people don't like the price increases, but we already know that Nvidia are not making much per sale on these based on the build cost (that's before you think about the R&D costs!). I always find it odd that people expect major companies to sell their products at a loss or break-even point.
    Nvidia's gross margins are now higher than Intel's:

    https://ycharts.com/companies/NVDA/gross_profit_margin
    https://ycharts.com/companies/INTC/gross_profit_margin

    Nvidia's net margins are higher than Intel's:

    https://ycharts.com/companies/NVDA/profit_margin
    https://ycharts.com/companies/INTC/profit_margin

    Enthusiasts on tech forums spent years defending Nvidia's higher prices at each generation. Nvidia's net margins used to be between 10% and 20%, but are close to 40% now.

    Intel's net margins used to be 15% to 20% but are now 20% to 30%, so apparently Intel has more "reasonable" prices relative to production and R&D costs!
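    For anyone unfamiliar with the two figures being compared: gross margin nets out only the cost of goods sold, while net margin nets out all costs. A quick sketch with made-up numbers (not Nvidia's or Intel's actual accounts):

    ```python
    def gross_margin(revenue: float, cogs: float) -> float:
        """Gross margin: fraction of revenue left after cost of goods sold."""
        return (revenue - cogs) / revenue

    def net_margin(revenue: float, total_costs: float) -> float:
        """Net margin: fraction of revenue left after ALL costs (COGS + R&D + opex + tax)."""
        return (revenue - total_costs) / revenue

    # Illustrative only: $100 revenue, $40 cost of goods, $60 total costs.
    print(gross_margin(100, 40))  # 0.6
    print(net_margin(100, 60))    # 0.4 (comparable to the ~40% net figure above)
    ```

    The point being that a high build cost would squeeze the gross figure first, so a rising gross margin is hard to square with "barely breaking even per card".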

    R&D costs might be a consideration, except for one thing: it appears the professional/consumer line split which happened at Maxwell, where Nvidia developed two different lines, one focused on FP32 workloads (gaming cards) and one on non-FP32 workloads (commercial), has ended. Now they are going back to the old way of having one single line of GPUs. This alone will help reduce R&D costs, and also the costs of chip tape-out.

    This is what one or two people said here before (Corky, I believe, was one of them): Nvidia has found a way to shoehorn commercial features into games. Hence they will progressively drop cards with the pure FP32 focus that all recent gaming cards have had.

    Edit!!

    Also, that rumour about production costs?

    Are you regurgitating Wccftech?? It came from there and was something they made up!!
    Last edited by CAT-THE-FIFTH; 20-09-2018 at 10:56 AM.

  9. #40
    Senior Member
    Join Date
    Dec 2013
    Posts
    3,526
    Thanks
    504
    Thanked
    468 times in 326 posts

    Re: Nvidia GeForce RTX 2080 Ti and RTX 2080

    Quote Originally Posted by Spud1 View Post
    ....but we already know that Nvidia are not making much per sale on these based on the build cost (that's before you think about the R&D costs!).
    We do? I didn't know Nvidia had said what the build costs are; I thought it was speculation that they're costing a lot to build, speculation I personally disagree with, as the RTXs are just down-binned Voltas that would have ended up in landfill if they hadn't worked out a way to make the unique features of Volta relevant to 'gamers'.

  10. Received thanks from:

    CAT-THE-FIFTH (20-09-2018)

  11. #41
    Moosing about! CAT-THE-FIFTH's Avatar
    Join Date
    Aug 2006
    Location
    Not here
    Posts
    32,039
    Thanks
    3,910
    Thanked
    5,224 times in 4,015 posts

    Re: Nvidia GeForce RTX 2080 Ti and RTX 2080

    Quote Originally Posted by Corky34 View Post
    We do? I didn't know Nvidia had said what the build costs are; I thought it was speculation that they're costing a lot to build, speculation I personally disagree with, as the RTXs are just down-binned Voltas that would have ended up in landfill if they hadn't worked out a way to make the unique features of Volta relevant to 'gamers'.
    From Wccftech:

    https://wccftech.com/nvidias-next-ge...ature-details/

    No link to sources, so it was probably speculation by them. Remember, it was said Pascal cost a lot of money too, due to the node shrink and expensive GDDR5X, etc., but Nvidia's margins grew anyway.

    Edit!!

    Quote Originally Posted by Spud1 View Post
    There are high end cards with a high end price tag - not really "mainstream" so to speak but that's OK imo - the mainstream cards will come later at much cheaper prices.
    That is the problem there. Normally a higher-end product does not bother me, but with graphics cards it sets pricing at the lower end.

    The RTX2070 will come in at nearly £600 - so unless Nvidia suddenly leaves a £350 gap to the GTX2060, it's going to be one of two things:
    1.) The GTX2060 moves up to closer to £400
    2.) They split the 60 series line, so a GTX2060TI, GTX2060, GTX2060SE, etc.

    If the current pricing tier holds, the 60 series will eventually be shifted to the £400 mark. The $250 to $400 mark has been where the 70 series has existed for generations.

    So all the websites will nicely compare 60 series to 60 series, saying it's a great performance bump, but for most mainstream purchasers, who are more budget-locked, that £200-card-to-£200-card upgrade might not look as hot anymore, as the range has been spread out.

    So either stump up the extra cash for a decent upgrade, or wait longer.
    Last edited by CAT-THE-FIFTH; 20-09-2018 at 10:47 AM.

  12. Received thanks from:

    Corky34 (20-09-2018),Iota (20-09-2018)

  13. #42
    IQ: 1.42
    Join Date
    May 2007
    Location
    old trafford
    Posts
    1,340
    Thanks
    132
    Thanked
    94 times in 80 posts
    • Tunnah's system
      • Motherboard:
      • Asus somethingorother
      • CPU:
      • 3700X
      • Memory:
      • 16GB 3600
      • Storage:
      • Various SSDs, 90TB RAID6 HDDs
      • Graphics card(s):
      • 1080Ti
      • PSU:
      • Silverstone 650w
      • Case:
      • Lian-Li PC70B
      • Operating System:
      • Win10
      • Internet:
      • 40mbit Sky Fibre

    Re: Nvidia GeForce RTX 2080 Ti and RTX 2080

    I put the price increase down to the massive stocks of 10 series cards, not to mention them still being very viable. In generations past, the previous cards tended to be struggling by the time the new ones came out. The 10 series is still extremely potent, and for all this talk of "finally a true 4K60 card", the 1080Ti is absolutely fine for 4K.

    They have no reason to lower the prices because the old cards are still very much worth their money, so they're treating them as current; instead of a decent performance boost for a modest price increase, as is tradition, it's more like an even higher-powered 10 series card for even more money.

    That, and I definitely feel like we're paying extra for tensor and RTX just so they can get it out the door and work on it in future releases, to get us used to it now.

  14. #43
    Senior Member
    Join Date
    May 2014
    Posts
    2,385
    Thanks
    181
    Thanked
    304 times in 221 posts

    Re: Nvidia GeForce RTX 2080 Ti and RTX 2080


  15. #44
    Senior Member
    Join Date
    Jul 2009
    Location
    Oxford
    Posts
    510
    Thanks
    8
    Thanked
    45 times in 34 posts
    • Roobubba's system
      • Motherboard:
      • MSI P55 GD60
      • CPU:
      • Intel i7 860 @ 3.58GHz (Megahalems + Apache)
      • Memory:
      • 8GB Patriot Viper DDR3
      • Storage:
      • 80GB Intel X25-M + bunch of HDDs
      • Graphics card(s):
      • ATI 5870 + Earplugs.
      • PSU:
      • TAGAN 800W
      • Case:
      • Lian Li V1110 (bit disappointing)
      • Operating System:
      • Windows 7 Pro
      • Monitor(s):
      • Samsung 24" LCD (TN)
      • Internet:
      • Virgin Media 20MBit

    Re: Nvidia GeForce RTX 2080 Ti and RTX 2080

    So it looks like 1440p is now finally going to be playable (i.e. 120fps+) for most titles. A while to wait for 4K, it seems.

    But at this price? No chance.

  16. #45
    root Member DanceswithUnix's Avatar
    Join Date
    Jan 2006
    Location
    In the middle of a core dump
    Posts
    12,986
    Thanks
    781
    Thanked
    1,588 times in 1,343 posts
    • DanceswithUnix's system
      • Motherboard:
      • Asus X470-PRO
      • CPU:
      • 5900X
      • Memory:
      • 32GB 3200MHz ECC
      • Storage:
      • 2TB Linux, 2TB Games (Win 10)
      • Graphics card(s):
      • Asus Strix RX Vega 56
      • PSU:
      • 650W Corsair TX
      • Case:
      • Antec 300
      • Operating System:
      • Fedora 39 + Win 10 Pro 64 (yuk)
      • Monitor(s):
      • Benq XL2730Z 1440p + Iiyama 27" 1440p
      • Internet:
      • Zen 900Mb/900Mb (CityFibre FttP)

    Re: Nvidia GeForce RTX 2080 Ti and RTX 2080

    Quote Originally Posted by CAT-THE-FIFTH View Post
    This is what one or two people said here before (Corky, I believe, was one of them): Nvidia has found a way to shoehorn commercial features into games. Hence they will progressively drop cards with the pure FP32 focus that all recent gaming cards have had.
    But the 2080 cards still have an FP32 focus. Volta could do FP64 at half the rate it could do FP32, as you'd expect from a commercially focused card. The 2080 can do FP64 at 1/32 of the FP32 rate, as you'd expect from a consumer card.

    Whilst I'm sure the tensor cores were developed for commercial users, their inclusion does not make it a commercial card. It just means Nvidia think the feature is worth the sacrifice in silicon area over putting more shaders in. Then there is the ray tracing support; is there any support for that in commercial render engines? Something that gives an iffy-quality lighting system good enough for action games isn't likely to impress the likes of Pixar when rendering their latest movie, where every pixel should be spot on.

    So AFAICS this is a consumer part, probably a Volta with the FP64 stripped out and some raytrace tech added, giving a slightly smaller die than GV100. Given Volta and Turing are both 12nm products, I wonder if Nvidia have done exactly the same commercial/consumer split as before and just staggered the release.

    As an aside, there was a die shot of a Turing compute unit that implied a quarter of the area was for tensor cores and a quarter for RT, so they could have had twice the shaders if they cut those out and scaled up the number of CMs to fill the space. I have to wonder what that would do to things like anti-aliasing performance if it could generate sample spots at twice the throughput.
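    The two bits of arithmetic above can be sketched out. The 14 TFLOPS FP32 figure is purely illustrative, not an official spec, and the area fractions are just the rough quarters implied by that die shot:

    ```python
    # FP64 throughput implied by an FP64:FP32 ratio (illustrative FP32 rate).
    def fp64_tflops(fp32_tflops: float, ratio: float) -> float:
        return fp32_tflops * ratio

    print(fp64_tflops(14.0, 1 / 2))   # 7.0    - Volta-style compute ratio
    print(fp64_tflops(14.0, 1 / 32))  # 0.4375 - consumer-style ratio

    # Die-area estimate: if ~1/4 of a compute unit is tensor cores and ~1/4 is RT,
    # shaders occupy the remaining ~1/2. Reclaiming both quarters for shaders
    # would roughly double the shader area, hence the ~2x shader count claim.
    tensor, rt = 0.25, 0.25
    shader_fraction = 1.0 - tensor - rt
    print((shader_fraction + tensor + rt) / shader_fraction)  # 2.0
    ```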

  17. #46
    Moosing about! CAT-THE-FIFTH's Avatar
    Join Date
    Aug 2006
    Location
    Not here
    Posts
    32,039
    Thanks
    3,910
    Thanked
    5,224 times in 4,015 posts

    Re: Nvidia GeForce RTX 2080 Ti and RTX 2080

    Quote Originally Posted by DanceswithUnix View Post
    But the 2080 cards still have an FP32 focus. Volta could do FP64 at half the rate it could do FP32, as you'd expect from a commercially focused card. The 2080 can do FP64 at 1/32 of the FP32 rate, as you'd expect from a consumer card.

    Whilst I'm sure the tensor cores were developed for commercial users, their inclusion does not make it a commercial card. It just means Nvidia think the feature is worth the sacrifice in silicon area over putting more shaders in. Then there is the ray tracing support; is there any support for that in commercial render engines? Something that gives an iffy-quality lighting system good enough for action games isn't likely to impress the likes of Pixar when rendering their latest movie, where every pixel should be spot on.

    So AFAICS this is a consumer part, probably a Volta with the FP64 stripped out and some raytrace tech added, giving a slightly smaller die than GV100. Given Volta and Turing are both 12nm products, I wonder if Nvidia have done exactly the same commercial/consumer split as before and just staggered the release.

    As an aside, there was a die shot of a Turing compute unit that implied a quarter of the area was for tensor cores and a quarter for RT, so they could have had twice the shaders if they cut those out and scaled up the number of CMs to fill the space. I have to wonder what that would do to things like anti-aliasing performance if it could generate sample spots at twice the throughput.
    Because you are thinking of old-skool commercial usage, though - the commercial AI and RT stuff Nvidia does is also very dependent on stuff outside FP64. The first cards Nvidia talked about were using Turing for commercial usage, not gaming, and the top bins are commercial cards.

    The current large chips also make much more sense for commercial use scenarios than gaming, and it means only one line needs to be developed and that gamers get the rejected bins, which can run at higher TDPs.

    If Nvidia had developed Turing with gaming in focus, not having all that die area for tensor cores and AI stuff would have meant loads more normal shaders, and a much bigger performance bump for normal games.

    The fact they have managed to shoehorn usage of more commercially oriented features into games is the genius move, methinks, as they can re-use lower-bin GPUs in their gaming lines.

    Expect a move away from an FP32 focus for their gaming cards, as their commercial usage areas are not so reliant on it anymore.
    Last edited by CAT-THE-FIFTH; 20-09-2018 at 12:49 PM.

  18. #47
    Missed by us all - RIP old boy spacein_vader's Avatar
    Join Date
    Sep 2014
    Location
    Darkest Northamptonshire
    Posts
    2,015
    Thanks
    184
    Thanked
    1,086 times in 410 posts
    • spacein_vader's system
      • Motherboard:
      • MSI B450 Tomahawk Max
      • CPU:
      • Ryzen 5 3600
      • Memory:
      • 2x8GB Patriot Steel DDR4 3600mhz
      • Storage:
      • 1tb Sabrent Rocket NVMe (boot), 500GB Crucial MX100, 1TB Crucial MX200
      • Graphics card(s):
      • Gigabyte Radeon RX5700 Gaming OC
      • PSU:
      • Corsair HX 520W modular
      • Case:
      • Fractal Design Meshify C
      • Operating System:
      • Windows 10 Pro
      • Monitor(s):
      • BenQ GW2765, Dell Ultrasharp U2412
      • Internet:
      • Zen Internet

    Re: Nvidia GeForce RTX 2080 Ti and RTX 2080

    Quote Originally Posted by Roobubba View Post
    So it looks like 1440p is now finally going to be playable (i.e. 120fps+) for most titles. A while to wait for 4K, it seems.

    But at this price? No chance.
    What the hell are you playing that requires 120+ FPS to be playable?

  19. #48
    root Member DanceswithUnix's Avatar
    Join Date
    Jan 2006
    Location
    In the middle of a core dump
    Posts
    12,986
    Thanks
    781
    Thanked
    1,588 times in 1,343 posts

    Re: Nvidia GeForce RTX 2080 Ti and RTX 2080

    Quote Originally Posted by CAT-THE-FIFTH View Post
    Because you are thinking of old-skool commercial usage, though - the AI and RT stuff Nvidia does is also very dependent on other stuff. The first cards Nvidia talked about were using Turing for commercial usage, not gaming, and the top bins are commercial cards.
    AI tensor stuff is everywhere; it is already in mass-market phones. Frankly, games seem to be lagging here. But for professional use there are dedicated tensor processors, which spells the end of using a GPU for those tasks. So that isn't a professional use.

    I don't get the ray-tracing. I'm happy for someone to convince me that there are professionals who will lap that up, but I just can't see an example. Feel free to point me at software support that is relevant to professional users.

    Now, I did Google for OpenCL performance for the 2080 and found one example, the Luxmark Luxball HDR, which the 2080ti most impressively monsters. That's nice for the people with that workflow, but they would have been well served by a 1080ti as well, so once again I don't see that as indicating this is a professional chip.

    https://www.engadget.com/2018/09/19/...080-ti-review/
