
Thread: Nvidia GeForce RTX 2080 Ti and RTX 2080

  1. #49
    Senior Member
    Join Date
    Jul 2009
    Location
    Oxford
    Posts
    510
    Thanks
    8
    Thanked
    45 times in 34 posts
    • Roobubba's system
      • Motherboard:
      • MSI P55 GD60
      • CPU:
      • Intel i7 860 @ 3.58GHz (Megahalems + Apache)
      • Memory:
      • 8GB Patriot Viper DDR3
      • Storage:
      • 80GB Intel X25-M + bunch of HDDs
      • Graphics card(s):
      • ATI 5870 + Earplugs.
      • PSU:
      • TAGAN 800W
      • Case:
      • Lian Li V1110 (bit disappointing)
      • Operating System:
      • Windows 7 Pro
      • Monitor(s):
      • Samsung 24" LCD (TN)
      • Internet:
      • Virgin Media 20MBit

    Re: Nvidia GeForce RTX 2080 Ti and RTX 2080

    Quote Originally Posted by spacein_vader View Post
    Quote Originally Posted by Roobubba View Post
    So it looks like 1440p is now finally going to be playable (i.e. 120fps+) for most titles. A while to wait for 4K, it seems.

    But at this price? No chance.
    What the hell are you playing that requires 120+ FPS to be playable?
    When I switched from a 60Hz screen to a 144Hz screen, the difference in playability for fast-paced shooters (I was playing Natural Selection 2 competitively at the time) was incredible. 120Hz with a steady fps and strobing is just objectively **so much better** than 60Hz. Can't go back to that slideshow now!
    When you turn 180 degrees in a snap, you get twice as many frames during that turn - it makes it MUCH easier to spot either a fast-moving enemy or a hiding enemy during the turn (rough numbers in the sketch below).
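
    A rough back-of-envelope (my numbers, purely illustrative) of what that turn looks like, assuming a 0.25-second 180-degree flick:

    Code:
    # Hypothetical figures: a 180-degree flick taking 0.25 s, on 60 Hz vs 144 Hz displays.
    FLICK_DEGREES = 180.0
    FLICK_SECONDS = 0.25

    for refresh_hz in (60, 144):
        frames_shown = refresh_hz * FLICK_SECONDS         # frames displayed during the turn
        degrees_per_frame = FLICK_DEGREES / frames_shown  # angular jump between frames
        print(f"{refresh_hz:>3} Hz: {frames_shown:4.0f} frames, "
              f"{degrees_per_frame:.1f} degrees of rotation per frame")

    # 60 Hz -> 15 frames (12 degrees/frame); 144 Hz -> 36 frames (5 degrees/frame),
    # so the scene moves far less between consecutive frames while you spin.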

    That said, I understand Freesync/G-Sync is a massive boon - I've not had a chance to try it out yet - so I'll concede that I don't know how gaming feels at lower framerates with adaptive refresh compared with my current setup.

  2. #50
    Moosing about! CAT-THE-FIFTH's Avatar
    Join Date
    Aug 2006
    Location
    Not here
    Posts
    32,039
    Thanks
    3,910
    Thanked
    5,224 times in 4,015 posts
    • CAT-THE-FIFTH's system
      • Motherboard:
      • Less E-PEEN
      • CPU:
      • Massive E-PEEN
      • Memory:
      • RGB E-PEEN
      • Storage:
      • Not in any order
      • Graphics card(s):
      • EVEN BIGGER E-PEEN
      • PSU:
      • OVERSIZED
      • Case:
      • UNDERSIZED
      • Operating System:
      • DOS 6.22
      • Monitor(s):
      • NOT USUALLY ON....WHEN I POST
      • Internet:
      • FUNCTIONAL

    Re: Nvidia GeForce RTX 2080 Ti and RTX 2080

    Quote Originally Posted by DanceswithUnix View Post
    AI tensor stuff is everywhere - it is already in mass-market phones. Frankly, games seem to be lagging here. But for professional use there are dedicated tensor processors, which spells the end of using a GPU for those tasks. So that isn't really a professional use case.

    I don't get the ray-tracing. I'm happy for someone to convince me that there are professionals who will lap that up, but I just can't see an example. Feel free to point me at software support that is relevant to professional users.

    Now I did Google for OpenCL performance for the 2080 and found one example, the Luxmark Luxball HDR, which the 2080ti most impressively monsters. That's nice for the people with that workflow, but they would have been well served by a 1080ti too, so once again I don't see that as indicating this is a professional chip.

    https://www.engadget.com/2018/09/19/...080-ti-review/
    But that is the point though - Nvidia is pushing deep learning and ray tracing massively for commercial usage, which is why the top-bin Turing cards are not gaming ones.

    You even contradict your own point - if there are dedicated cards using tensor-like cores, then Nvidia putting a bunch of them in a graphics card shows they are serious about that market.

    If you watched the Nvidia release talks, they were showing commercial usage far more than gaming, so I see them trying to get into more non-gaming markets.

    They are obviously trying to break into SFX markets.
    Turing has been developed for Nvidia's commercial (non-gaming) markets first, and they will shoehorn the tensor and RT dedicated hardware into gaming.

    It means less R&D on dedicated fp32 cards too. Fewer product lines mean less R&D and lower tape-out costs.
    Nvidia has spent the last decade pouring billions into breaking out of PC gaming - even Tegra was part of that.

    The fact is consoles are primarily built around fp32 performance and rasterisation. Even if Nvidia sells a million Turing cards this year, most of the market will be on cards built for fp32 and rasterisation, and most games will be developed with that in mind.

    So it makes little sense for Nvidia to suddenly add RT and tensor cores purely for gaming, when only a tiny fraction of games will use them and only people on the latest hardware will benefit.

    Enthusiasts like us are a niche, especially when you consider it's MMOs and MOBAs which make up the lion's share of PC gaming revenue, not FPS games. Many of these games have cartoony graphics which scale down better to slower cards and are more CPU-limited, especially as most people still play at 1080p.

    It makes far more sense for Nvidia to develop these for higher-margin commercial markets first and then use the runts for us gamers.
    Last edited by CAT-THE-FIFTH; 20-09-2018 at 02:02 PM.

  3. #51
    Senior Member
    Join Date
    Dec 2013
    Posts
    3,526
    Thanks
    504
    Thanked
    468 times in 326 posts

    Re: Nvidia GeForce RTX 2080 Ti and RTX 2080

    Quote Originally Posted by DanceswithUnix View Post
    So AFAICS this is a consumer part, probably a Volta with the FP64 stripped out and some raytrace tech added, giving a slightly smaller die than GV100. Given Volta and Turing are both 12nm products, I wonder if Nvidia have done exactly the same commercial/consumer split as before, just staggering the release.
    From what I can tell the RT 'cores' are using the same mixed-precision 32-bit-wide ALUs as the rest of the silicon; whether they've divided those ALUs at the hardware level (using different sizes/allocations of registers, caches, and other features) or purely at the software level is another question.

    This Anandtech article talks about Tensor 'cores' in Volta, but I assume some of what it covers is transferable to how RT 'cores' are designed. If you really want to geek out, the same article also links to some research conducted by Citadel LLC into the design of Volta using microbenchmarking.

  4. #52
    Moosing about! CAT-THE-FIFTH's Avatar
    Join Date
    Aug 2006
    Location
    Not here
    Posts
    32,039
    Thanks
    3,910
    Thanked
    5,224 times in 4,015 posts
    • CAT-THE-FIFTH's system
      • Motherboard:
      • Less E-PEEN
      • CPU:
      • Massive E-PEEN
      • Memory:
      • RGB E-PEEN
      • Storage:
      • Not in any order
      • Graphics card(s):
      • EVEN BIGGER E-PEEN
      • PSU:
      • OVERSIZED
      • Case:
      • UNDERSIZED
      • Operating System:
      • DOS 6.22
      • Monitor(s):
      • NOT USUALLY ON....WHEN I POST
      • Internet:
      • FUNCTIONAL

    Re: Nvidia GeForce RTX 2080 Ti and RTX 2080

    https://www.nasdaq.com/article/what-...care-cm1013070

    Gaming is still growing, and live e-sports in particular are on the rise as an emerging spectator event. Ray-tracing GPUs could help bridge the gap between the real and the digital, while better graphics could help attract more users of video games over time.

    The really interesting use for ray tracing is show business, though. NVIDIA says the visual effects industry generates $250 billion a year. However, those familiar with visual effects know that ray tracing has been a part of the process for a while already.

    Today, digital artists take their time creating scenes and tap the computing power of a server or "rendering farm" to create special effects and computer-generated scenes. The process is expensive (think tens of thousands of dollars, or more, for just a scene), and it can take days or weeks to complete.

    NVIDIA's work on GPUs that can handle ray tracing could be a disruptor of that industry, saving filmmakers time and money. It could also open the door for artists to become entrepreneurs in the space once again, putting the heavy-duty computational powers of a server into a compact and affordable package. For investors, though, it means NVIDIA could have found yet another new application for its GPUs. NVIDIA is on pace to easily surpass $12 billion in revenue this year. Entering the $250 billion visual effects industry could go a long way towards unlocking more growth in the near future.

  5. #53
    Moosing about! CAT-THE-FIFTH's Avatar
    Join Date
    Aug 2006
    Location
    Not here
    Posts
    32,039
    Thanks
    3,910
    Thanked
    5,224 times in 4,015 posts
    • CAT-THE-FIFTH's system
      • Motherboard:
      • Less E-PEEN
      • CPU:
      • Massive E-PEEN
      • Memory:
      • RGB E-PEEN
      • Storage:
      • Not in any order
      • Graphics card(s):
      • EVEN BIGGER E-PEEN
      • PSU:
      • OVERSIZED
      • Case:
      • UNDERSIZED
      • Operating System:
      • DOS 6.22
      • Monitor(s):
      • NOT USUALLY ON....WHEN I POST
      • Internet:
      • FUNCTIONAL

    Re: Nvidia GeForce RTX 2080 Ti and RTX 2080

    People have noticed an interesting bug in the Bit-tech review.

    [TAA vs. DLSS comparison screenshots from the Bit-tech review - images not reproduced]

    Look at the blue car. The DLSS image looks blurrier too. A bug?

    Edit!!

    https://bit-tech.net/reviews/tech/gr...on-reviews/11/

    Hmm, not sure what they are saying now.

    In this instance, DLSS, despite being advertised as only having similar quality to TAA, was always the better method for image quality. You do have to look for the finer details, such as complicated foliage in the background, the creases of clothing, or hair, but in doing so you invariably find a better image with DLSS than with TAA (also confirmed by a few quick blind tests with nearby staff members). For example, sharpness would often be improved over TAA, which could sometimes suffer from a slight blur effect in comparison, and there were fewer artefacts on the DLSS side too. Occasionally, edges looked a touch jaggier with DLSS, but it remained the clear winner overall. There are some examples below: In each pair, the top images are captured using TAA and the bottom images with DLSS. We suggest opening them in different tabs or downloading and saving them, then zooming in to spot differences in the areas we mentioned.
    ?
    Last edited by CAT-THE-FIFTH; 20-09-2018 at 05:33 PM.

  6. #54
    Senior Member
    Join Date
    Dec 2013
    Posts
    3,526
    Thanks
    504
    Thanked
    468 times in 326 posts

    Re: Nvidia GeForce RTX 2080 Ti and RTX 2080

    And that's on a fixed-run benchmark that I assume they've had plenty of time to work on; it raises an interesting question.

    What happens if a game is modded/patched and new assets are added? Could you fool the NN into thinking a dog is a banana, like they did on BBC4's look at AI, or something equally weird? (A toy illustration below.)
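
    Purely as a toy illustration of the adversarial-example idea being alluded to (nothing to do with DLSS's actual, unpublished network - the classifier and numbers below are made up):

    Code:
    # Toy FGSM-style adversarial perturbation on a linear 'dog vs banana' classifier.
    import numpy as np

    rng = np.random.default_rng(0)
    x = rng.random(64)          # a fake 64-"pixel" input in [0, 1]
    w = rng.normal(size=64)     # fake learned weights
    b = 0.0

    def p_dog(v):
        """Sigmoid probability that the input is a 'dog'."""
        return 1.0 / (1.0 + np.exp(-(w @ v + b)))

    # For a linear model the gradient w.r.t. the input is just w, so nudge every
    # pixel slightly in the direction that moves the score away from its current class.
    epsilon = 0.2
    direction = np.sign(w) if p_dog(x) > 0.5 else -np.sign(w)
    x_adv = np.clip(x - epsilon * direction, 0.0, 1.0)

    print(f"original  'dog' probability: {p_dog(x):.3f}")
    print(f"perturbed 'dog' probability: {p_dog(x_adv):.3f}")
    # A small change to every pixel swings the score dramatically - the sort of
    # brittleness that new or modded assets could, in principle, expose.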

  7. #55
    Moosing about! CAT-THE-FIFTH's Avatar
    Join Date
    Aug 2006
    Location
    Not here
    Posts
    32,039
    Thanks
    3,910
    Thanked
    5,224 times in 4,015 posts
    • CAT-THE-FIFTH's system
      • Motherboard:
      • Less E-PEEN
      • CPU:
      • Massive E-PEEN
      • Memory:
      • RGB E-PEEN
      • Storage:
      • Not in any order
      • Graphics card(s):
      • EVEN BIGGER E-PEEN
      • PSU:
      • OVERSIZED
      • Case:
      • UNDERSIZED
      • Operating System:
      • DOS 6.22
      • Monitor(s):
      • NOT USUALLY ON....WHEN I POST
      • Internet:
      • FUNCTIONAL

    Re: Nvidia GeForce RTX 2080 Ti and RTX 2080

    Luckily someone on YouTube did a side-by-side comparison with an RTX 2080:

    https://youtu.be/Y_usUAXRnGg?t=91

    I think BT mixed up the images TBH.

    The first image is with DLSS and the second image is with TAA. The TAA image has more rust on the car, and the DLSS one seems to have less when the main car drives in, but looks OK when the characters get out of the car. The DLSS image has more apparent detail but worse edges. The BT review is using an RTX 2080 Ti, so I suspect it might be some kind of bug that is more apparent on the RTX 2080 Ti. The RTX 2080 Ti pushes more FPS than an RTX 2080, so it's putting more demands on the tensor cores to do the reconstruction work, and so there are more bugs.
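
    Some quick, speculative arithmetic of my own on why higher frame rates leave less room for a fixed-cost reconstruction pass (the per-pass cost below is an assumption, not a measured figure):

    Code:
    # The per-frame time budget shrinks as fps rises, so a fixed-cost DLSS pass
    # eats a growing share of each frame.
    RECONSTRUCTION_MS = 2.0   # assumed cost of the DLSS pass per frame (illustrative)

    for fps in (60, 80, 100, 120):
        frame_budget_ms = 1000.0 / fps
        share = RECONSTRUCTION_MS / frame_budget_ms
        print(f"{fps:>3} fps: {frame_budget_ms:5.2f} ms per frame, "
              f"{RECONSTRUCTION_MS:.1f} ms of reconstruction is {share:.0%} of it")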

    Also, on OcUK someone pointed this out:

    https://youtu.be/Y_usUAXRnGg?t=21

    The face has slightly different details too.
    Last edited by CAT-THE-FIFTH; 20-09-2018 at 05:59 PM.

  8. #56
    Senior Member
    Join Date
    Mar 2005
    Posts
    4,935
    Thanks
    171
    Thanked
    384 times in 311 posts
    • badass's system
      • Motherboard:
      • ASUS P8Z77-m pro
      • CPU:
      • Core i5 3570K
      • Memory:
      • 32GB
      • Storage:
      • 1TB Samsung 850 EVO, 2TB WD Green
      • Graphics card(s):
      • Radeon RX 580
      • PSU:
      • Corsair HX520W
      • Case:
      • Silverstone SG02-F
      • Operating System:
      • Windows 10 X64
      • Monitor(s):
      • Del U2311, LG226WTQ
      • Internet:
      • 80/20 FTTC

    Re: Nvidia GeForce RTX 2080 Ti and RTX 2080

    Quote Originally Posted by Tunnah View Post
    Genuine question: historically speaking, has a new product ever charged the consumer for the performance gain over the previous product before?

    I'm having a hard time thinking of anything. Normally the new product gives you a performance boost, for a generally acceptable price increase on the previous product. It's never "this is 40% faster, so it's 40% more expensive" (it's actually more than that in this case).

    I know we're past the good ol' days of getting the new gen at the price of the previous model (200 to 600 were the same price! Near enough anyway), but this price increase is disgusting.
    Yes. Every time. For at least 18 years. Newer generations of graphics cards have been creeping up in price at the top end for that long, even taking inflation into account.

    I first noticed this when the GeForce 2 Ultra was released for £400.

    It's just that internet forum wisdom for computer parts is about as good as it is for cars, i.e. mixed. Take the general impression that when a new range of graphics cards is released, the old ones miraculously drop in price; in reality the price has usually been dropping gradually since release.

    Secondly, on the price increase being 'disgusting': the 2000 series uses a more expensive process along with larger dies. Ultimately, though, they are simply charging what they believe the market will bear.
    It's not a charity, nor is it essential. It's a hugely expensive hobby.
    "In a perfect world... spammers would get caught, go to jail, and share a cell with many men who have enlarged their penises, taken Viagra and are looking for a new relationship."

  9. #57
    Moosing about! CAT-THE-FIFTH's Avatar
    Join Date
    Aug 2006
    Location
    Not here
    Posts
    32,039
    Thanks
    3,910
    Thanked
    5,224 times in 4,015 posts
    • CAT-THE-FIFTH's system
      • Motherboard:
      • Less E-PEEN
      • CPU:
      • Massive E-PEEN
      • Memory:
      • RGB E-PEEN
      • Storage:
      • Not in any order
      • Graphics card(s):
      • EVEN BIGGER E-PEEN
      • PSU:
      • OVERSIZED
      • Case:
      • UNDERSIZED
      • Operating System:
      • DOS 6.22
      • Monitor(s):
      • NOT USUALLY ON....WHEN I POST
      • Internet:
      • FUNCTIONAL

    Re: Nvidia GeForce RTX 2080 Ti and RTX 2080

    Quote Originally Posted by badass View Post
    Yes. Every time. For at least 18 years. Newer generations of graphics cards have been creeping up in price at the top end for that long, even taking inflation into account.

    I first noticed this when the GeForce 2 Ultra was released for £400.

    It's just that internet forum wisdom for computer parts is about as good as it is for cars, i.e. mixed. Take the general impression that when a new range of graphics cards is released, the old ones miraculously drop in price; in reality the price has usually been dropping gradually since release.

    Secondly, on the price increase being 'disgusting': the 2000 series uses a more expensive process along with larger dies. Ultimately, though, they are simply charging what they believe the market will bear.
    It's not a charity, nor is it essential. It's a hugely expensive hobby.
    Actually, 12nm is more or less the same as 16nm, but with lower leakage:

    https://en.wikichip.org/wiki/16_nm_lithography_process

    In late 2016 TSMC announced a "12nm" process (e.g. 12FFC) which uses similar design rules to the 16nm node but a tighter metal pitch, providing a slight density improvement. The enhanced process is said to feature lower leakage, better cost characteristics and perhaps a better name (vs. "14nm"). 12nm is expected to enter mass production in late 2017.
    TSMC lists 16nm/12nm as the same node level. If anything, one could argue that when Pascal was first released, 16nm was somewhat less mature and more congested, due to phone companies etc. wanting to make new chips on it.

  10. #58
    Registered+
    Join Date
    Nov 2015
    Posts
    29
    Thanks
    2
    Thanked
    3 times in 2 posts

    Re: Nvidia GeForce RTX 2080 Ti and RTX 2080

    Quote Originally Posted by kalniel View Post
    Quote Originally Posted by Spud1 View Post
    I get that people don't like the price increases, but we already know that Nvidia are not making much per sale on these based on the build cost (that's before you think about the R&D costs!).
    I thought margins had been continually increasing?
    Preposterous - the margins are not increasing. If anything, they're not making money, they're losing money. Each time you buy one of their cards, they're practically pushing money into your pockets. The more they sell, the less money they have; it's a sort of reverse sales process.

    The reason they're making more $$$ than Intel must be that they found a golden-egg-laying goose that they've put hard at work to make up for this charity of an operation. /:|

  11. Received thanks from:

    CAT-THE-FIFTH (21-09-2018),DanceswithUnix (21-09-2018)

  12. #59
    Senior Member watercooled's Avatar
    Join Date
    Jan 2009
    Posts
    11,478
    Thanks
    1,541
    Thanked
    1,029 times in 872 posts

    Re: Nvidia GeForce RTX 2080 Ti and RTX 2080

    I see plenty of people claiming margin reductions etc without any substantiation whatsoever? Speculation at WCCF doesn't count as that, as much as fanboys will latch on to any such nonsense as justification.

    12nm is not a 'more expensive node' vs 16nm - it's close to being the same thing, and given the maturity it will be significantly cheaper now than 16nm was on launch. Yes, it will cost more per die to produce vs Pascal given its much larger size, but let's not pretend it's close to being sold at a loss or Nvidia are being in any way charitable. Pascal will currently have huge margins.
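
    To put rough numbers on the die-size point (mine, not a quoted figure - the defect density and wafer cost are assumptions, while the die areas are the approximate published figures for GP102 and TU102):

    Code:
    # Dies per 300 mm wafer and cost per good die under a simple Poisson yield model.
    import math

    WAFER_DIAMETER_MM = 300.0
    DEFECTS_PER_CM2 = 0.1      # assumed
    WAFER_COST_USD = 8000.0    # assumed

    def dies_per_wafer(area_mm2):
        """Standard approximation that accounts for edge loss."""
        d = WAFER_DIAMETER_MM
        return int(math.pi * (d / 2) ** 2 / area_mm2
                   - math.pi * d / math.sqrt(2 * area_mm2))

    def poisson_yield(area_mm2):
        return math.exp(-DEFECTS_PER_CM2 * area_mm2 / 100.0)

    for name, area in (("GP102 ~471 mm^2", 471.0), ("TU102 ~754 mm^2", 754.0)):
        total = dies_per_wafer(area)
        good = total * poisson_yield(area)
        print(f"{name}: {total} candidates/wafer, ~{good:.0f} good, "
              f"~${WAFER_COST_USD / good:.0f} per good die")
    # The bigger die costs very roughly twice as much per good die here, but that is
    # still a long way from these cards being sold anywhere near cost.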

    And I don't think it's anything other than childish to dismiss criticism as people 'hating on it', especially in response to people making outright false statements or nonsensical value comparisons. It's currently a poor value upgrade over Pascal, and it's only really an upgrade at all if you're going with the 2080Ti given the 2080 is ≈ 1080Ti performance (and more expensive), especially given Pascal seems to overclock more readily.

    As I said originally, my main surprise is how meh it looks in reviews, I was expecting at least a few half-decent tech demos to sell the price increase to the more gullible of the beta-testers!

    Any speculation about the ray tracing stuff is just that at this point in time - speculation. If you're happy to take a ticket for that, fill your boots. I suspect most will wait until they know what they're actually buying.
    Last edited by watercooled; 20-09-2018 at 11:53 PM.

  13. Received thanks from:

    CAT-THE-FIFTH (21-09-2018)

  14. #60
    Moosing about! CAT-THE-FIFTH's Avatar
    Join Date
    Aug 2006
    Location
    Not here
    Posts
    32,039
    Thanks
    3,910
    Thanked
    5,224 times in 4,015 posts
    • CAT-THE-FIFTH's system
      • Motherboard:
      • Less E-PEEN
      • CPU:
      • Massive E-PEEN
      • Memory:
      • RGB E-PEEN
      • Storage:
      • Not in any order
      • Graphics card(s):
      • EVEN BIGGER E-PEEN
      • PSU:
      • OVERSIZED
      • Case:
      • UNDERSIZED
      • Operating System:
      • DOS 6.22
      • Monitor(s):
      • NOT USUALLY ON....WHEN I POST
      • Internet:
      • FUNCTIONAL

    Re: Nvidia GeForce RTX 2080 Ti and RTX 2080

    Quote Originally Posted by watercooled View Post
    I see plenty of people claiming margin reductions etc without any substantiation whatsoever? Speculation at WCCF doesn't count as that, as much as fanboys will latch on to any such nonsense as justification.

    12nm is not a 'more expensive node' vs 16nm - it's close to being the same thing, and given the maturity it will be significantly cheaper now than 16nm was on launch. Yes, it will cost more per die to produce vs Pascal given its much larger size, but let's not pretend it's close to being sold at a loss or Nvidia are being in any way charitable. Pascal will currently have huge margins.

    And I don't think it's anything other than childish to dismiss criticism as people 'hating on it', especially in response to people making outright false statements or nonsensical value comparisons. It's currently a poor value upgrade over Pascal, and it's only really an upgrade at all if you're going with the 2080Ti given the 2080 is ≈ 1080Ti performance (and more expensive), especially given Pascal seems to overclock more readily.

    As I said originally, my main surprise is how meh it looks in reviews, I was expecting at least a few half-decent tech demos to sell the price increase to the more gullible of the beta-testers!

    Any speculation about the ray tracing stuff is just that at this point in time - speculation. If you're happy to take a ticket for that, fill your boots. I suspect most will wait until they know what they're actually buying.
    You'd better not watch the DF review then: it's expensive, but...but...but phones cost more and it must cost more to make - all of course while ignoring Nvidia making 40% net margins (not gross) and making Intel look a bit lame in that regard. It also ignores the fact that some people use a phone as their primary device for all their computing, gaming and photographic needs, whereas a graphics card is somewhat more niche. Parts of the tech press have for too long justified the price increases at every generation for lots of tech, and seem more concerned about the welfare of companies than of consumers, so more and more less well-read people think it's "justified", just like with phones, or that microtransactions are normal, etc.

    Sadly we might as well get used to it, as it is going to get worse methinks, especially while credit is still easy to get. The best thing is to vote with your wallet. No doubt people will still throw money at these things, and as time progresses companies will push prices higher and higher and people will price themselves out of new cards, and so on. At some point things will get pushed too far and the money tree will end.
    Last edited by CAT-THE-FIFTH; 21-09-2018 at 01:52 AM.

  15. #61
    IQ: 1.42
    Join Date
    May 2007
    Location
    old trafford
    Posts
    1,340
    Thanks
    132
    Thanked
    94 times in 80 posts
    • Tunnah's system
      • Motherboard:
      • Asus somethingorother
      • CPU:
      • 3700X
      • Memory:
      • 16GB 3600
      • Storage:
      • Various SSDs, 90TB RAID6 HDDs
      • Graphics card(s):
      • 1080Ti
      • PSU:
      • Silverstone 650w
      • Case:
      • Lian-Li PC70B
      • Operating System:
      • Win10
      • Internet:
      • 40mbit Sky Fibre

    Re: Nvidia GeForce RTX 2080 Ti and RTX 2080

    Quote Originally Posted by CAT-THE-FIFTH View Post
    You'd better not watch the DF review then: it's expensive, but...but...but phones cost more and it must cost more to make - all of course while ignoring Nvidia making 40% net margins (not gross) and making Intel look a bit lame in that regard. It also ignores the fact that some people use a phone as their primary device for all their computing, gaming and photographic needs, whereas a graphics card is somewhat more niche. Parts of the tech press have for too long justified the price increases at every generation for lots of tech, and seem more concerned about the welfare of companies than of consumers, so more and more less well-read people think it's "justified", just like with phones, or that microtransactions are normal, etc.

    Sadly we might as well get used to it, as it is going to get worse methinks, especially while credit is still easy to get. The best thing is to vote with your wallet. No doubt people will still throw money at these things, and as time progresses companies will push prices higher and higher and people will price themselves out of new cards, and so on. At some point things will get pushed too far and the money tree will end.
    My favourite thing is that people believe a company would reduce their margins to give the consumer more of a bargain lol.

    If it cost more, you'd be paying more - that's a fact. And if it ACTUALLY cost more, you'd know about it due to the millions of press releases, tech talking points, and straight-up propaganda videos explaining why they have to charge you so much.

  16. Received thanks from:

    CAT-THE-FIFTH (21-09-2018)

  17. #62
    Moosing about! CAT-THE-FIFTH's Avatar
    Join Date
    Aug 2006
    Location
    Not here
    Posts
    32,039
    Thanks
    3,910
    Thanked
    5,224 times in 4,015 posts
    • CAT-THE-FIFTH's system
      • Motherboard:
      • Less E-PEEN
      • CPU:
      • Massive E-PEEN
      • Memory:
      • RGB E-PEEN
      • Storage:
      • Not in any order
      • Graphics card(s):
      • EVEN BIGGER E-PEEN
      • PSU:
      • OVERSIZED
      • Case:
      • UNDERSIZED
      • Operating System:
      • DOS 6.22
      • Monitor(s):
      • NOT USUALLY ON....WHEN I POST
      • Internet:
      • FUNCTIONAL

    Re: Nvidia GeForce RTX 2080 Ti and RTX 2080

    Quote Originally Posted by Tunnah View Post
    My favourite thing is that people believe a company would reduce their margins to give the consumer more of a bargain lol.

    If it cost more, you'd be paying more - that's a fact. And if it ACTUALLY cost more, you'd know about it due to the millions of press releases, tech talking points, and straight-up propaganda videos explaining why they have to charge you so much.
    It's a load of bollocks if you look at the same excuses being made years ago on tech forums and in the tech press. Companies seed these talking points to justify increasing margins - that costs are high, etc., and that prices need to go up or margins will be even lower.

    I argued with so many internet experts who told me I was wrong when I said margins would go up and so would revenue, to record levels.

    It's like the cost of flour going up 10% and the baker charging you 40% more for a loaf of bread - little white lies can be helpful. (Rough numbers in the sketch below.)
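
    Putting illustrative numbers on the baker analogy (all figures made up for the example):

    Code:
    # If flour is only part of the cost of a loaf, a 10% flour rise barely moves total
    # cost, so a 40% price rise mostly widens the margin.
    flour_cost = 0.30    # assumed flour cost per loaf
    other_costs = 0.70   # assumed labour, energy, rent, etc. per loaf
    price = 1.20         # assumed retail price per loaf

    def margin(p, cost):
        return (p - cost) / p

    before = margin(price, flour_cost + other_costs)
    after = margin(price * 1.40, flour_cost * 1.10 + other_costs)

    print(f"margin before: {before:.1%}")   # ~16.7%
    print(f"margin after:  {after:.1%}")    # ~38.7%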

    I was right over 5 years ago and I am still right now.

    Unless you work for a company or benefit from it financially, a consumer should not give a flying fig about maintaining its margins, beyond the company staying afloat and being profitable.

    But in the real world people are not even that charitable.

    Nobody in the real world cares what margins Tesco makes as long as they can afford to keep food on the table.

    Nobody cares about the margins of high street retailers when buying off the internet. This has meant so many jobs being lost, even though shops probably make worse margins: they have to employ more people, pay for premises, pay more taxes and rates, etc., while internet retailers like Amazon use legal loopholes to pay less tax. A shop cannot compete on price due to the higher fixed costs.

    Most consumers in the real world care less about companies and more about themselves.
    Last edited by CAT-THE-FIFTH; 21-09-2018 at 03:02 AM.

  18. Received thanks from:

    Strawb77 (21-09-2018)

  19. #63
    root Member DanceswithUnix's Avatar
    Join Date
    Jan 2006
    Location
    In the middle of a core dump
    Posts
    12,986
    Thanks
    781
    Thanked
    1,588 times in 1,343 posts
    • DanceswithUnix's system
      • Motherboard:
      • Asus X470-PRO
      • CPU:
      • 5900X
      • Memory:
      • 32GB 3200MHz ECC
      • Storage:
      • 2TB Linux, 2TB Games (Win 10)
      • Graphics card(s):
      • Asus Strix RX Vega 56
      • PSU:
      • 650W Corsair TX
      • Case:
      • Antec 300
      • Operating System:
      • Fedora 39 + Win 10 Pro 64 (yuk)
      • Monitor(s):
      • Benq XL2730Z 1440p + Iiyama 27" 1440p
      • Internet:
      • Zen 900Mb/900Mb (CityFibre FttP)

    Re: Nvidia GeForce RTX 2080 Ti and RTX 2080

    Quote Originally Posted by Corky34 View Post
    This Anandtech article talks about Tensor 'cores' in Volta, but I assume some of what it covers is transferable to how RT 'cores' are designed. If you really want to geek out, the same article also links to some research conducted by Citadel LLC into the design of Volta using microbenchmarking.
    Thanks, I'll have to dig that out.

    That article makes it sound like the Tensor cores are a dedicated function, which makes some sense. AI isn't that sensitive to calculation precision, but data sets can be huge and some calculations are bandwidth-limited, making fp16 twice as fast for a given bandwidth, and potentially faster still if you avoid paging training data on and off the card because your VRAM can store twice as much compared with fp32. So tensor operations really want fp16, but frankly nothing else does. Adding fp16 support to the fp32 and fp64 cores makes them more complex and potentially slows them down even when nothing uses fp16. I can see the reasoning for this being a dedicated unit.
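
    A quick sketch of that footprint/bandwidth argument (my example, assuming NumPy is to hand):

    Code:
    # The same tensor in fp32 vs fp16: half the bytes means half the memory traffic for
    # bandwidth-bound kernels, and twice as much training data resident in VRAM.
    import numpy as np

    shape = (64, 256, 56, 56)   # an arbitrary activation/weight tensor shape

    fp32_bytes = np.prod(shape) * np.dtype(np.float32).itemsize
    fp16_bytes = np.prod(shape) * np.dtype(np.float16).itemsize

    print(f"fp32: {fp32_bytes / 2**20:6.1f} MiB")
    print(f"fp16: {fp16_bytes / 2**20:6.1f} MiB ({fp32_bytes / fp16_bytes:.0f}x smaller)")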

    The AI anti-aliasing implies the tensor arithmetic isn't stomping on the shader performance too much; being outside the main fp32 pipe would probably help with that.

    The bit I don't get is the rumours that lower-end cards won't have AI or RT support. A mid-range card with half the bandwidth of a 2080 will have half the shaders serving a lower resolution with half the pixels, so RT support can be scaled in line with that. AI is a little trickier - it isn't so useful to have half a brain (I've worked with people like that) - but if the AI is designed to run on a CPU you could still have the opportunity to offload it.

  20. #64
    Senior Member
    Join Date
    Mar 2005
    Posts
    4,935
    Thanks
    171
    Thanked
    384 times in 311 posts
    • badass's system
      • Motherboard:
      • ASUS P8Z77-m pro
      • CPU:
      • Core i5 3570K
      • Memory:
      • 32GB
      • Storage:
      • 1TB Samsung 850 EVO, 2TB WD Green
      • Graphics card(s):
      • Radeon RX 580
      • PSU:
      • Corsair HX520W
      • Case:
      • Silverstone SG02-F
      • Operating System:
      • Windows 10 X64
      • Monitor(s):
      • Del U2311, LG226WTQ
      • Internet:
      • 80/20 FTTC

    Re: Nvidia GeForce RTX 2080 Ti and RTX 2080

    Quote Originally Posted by watercooled View Post
    I see plenty of people claiming margin reductions etc without any substantiation whatsoever? Speculation at WCCF doesn't count as that, as much as fanboys will latch on to any such nonsense as justification.
    Agreed. Nvidia's margins are clearly increasing. A simple look at their financial statements proves that.
    12nm is not a 'more expensive node' vs 16nm - it's close to being the same thing, and given the maturity it will be significantly cheaper now than 16nm was on launch. Yes, it will cost more per die to produce vs Pascal given its much larger size, but let's not pretend it's close to being sold at a loss or Nvidia are being in any way charitable. Pascal will currently have huge margins.
    Not sure I agree with that - simply because I think it's bad business to sell capacity on a higher-performance, lower-power, lower-area-per-transistor process for the same price. However, let's just say perhaps it does.
    And I don't think it's anything other than childish to dismiss criticism as people 'hating on it', especially in response to people making outright false statements or nonsensical value comparisons. It's currently a poor value upgrade over Pascal, and it's only really an upgrade at all if you're going with the 2080Ti given the 2080 is ≈ 1080Ti performance (and more expensive), especially given Pascal seems to overclock more readily.
    Remember, value is a personal thing. If a highly paid individual can pay a few hundred more for a device that saves 40% of their time, then it is exceptional value to them.
    Personally, I see any card costing over £300 as obscenely expensive, so I regard the RTX 2000 series as exceptionally poor value. But I can see how someone with lots of money and nothing better to spend it on may see the substantial performance jump for a similarly substantial increase in cost as good value - particularly if they simply must have the thing that produces the longest bars in a load of bar graphs.

    As I said originally, my main surprise is how meh it looks in reviews, I was expecting at least a few half-decent tech demos to sell the price increase to the more gullible of the beta-testers!

    Any speculation about the ray tracing stuff is just that at this point in time - speculation. If you're happy to take a ticket for that, fill your boots. I suspect most will wait until they know what they're actually buying.
    However, I have seen others call Nvidia's prices for these cards "disgusting" and make other such assertions. In reality it's just good business. The 2000 series is more costly to produce (looking at die size alone, notwithstanding possible increases in process cost). They have created a competitive moat by having the fastest cards money can buy, so they get to enjoy fat margins. If the market will bear increased prices (which it clearly will), they would be doing a disservice to their shareholders by not charging what they can.

    Gamers whinging about this need to learn that this is the real world. Imagined pricing curves that suit their worldview never happened. The only way for Nvidia's margins to take a hit is for demand to fall, either through competition or simply through reduced demand for the performance on offer at the price they charge.

    If Vega had held records for performance on release rather than trading blows with the GTX 1080, does anyone think the price would have been the same? Or more?
    "In a perfect world... spammers would get caught, go to jail, and share a cell with many men who have enlarged their penises, taken Viagra and are looking for a new relationship."

  21. Received thanks from:

    watercooled (21-09-2018)
