
Thread: Nvidia GTC 2022 Livestream

  1. #17
    ALT0153™ Rob_B's Avatar
    Join Date
    Jul 2006
    Posts
    6,370
    Thanks
    395
    Thanked
    782 times in 540 posts
    • Rob_B's system
      • Motherboard:
      • Biostar X370GTN
      • CPU:
      • 5600X
      • Memory:
      • 16GB @ 3200 (16-15-15)
      • Storage:
      • 1TB WD SN550
      • Graphics card(s):
      • MSI 1070 Armor OC
      • PSU:
      • Fractal Design ION SFX-L 650W
      • Case:
      • NZXT H1 v1

    Re: Nvidia GTC 2022 Livestream

    I'm still hopeful prices will fall on older models (at least a bit) but I'm prepared to wait or buy 2nd hand (thinking something like a 2070 Super would do me fine)

To be fair I'm likely not Nvidia's target demographic though, so I'm sure they really don't care what I think!

  2. #18
    Banhammer in peace PeterB kalniel's Avatar
    Join Date
    Aug 2005
    Posts
    30,924
    Thanks
    1,842
    Thanked
    3,352 times in 2,694 posts
    • kalniel's system
      • Motherboard:
      • Gigabyte Z390 Aorus Ultra
      • CPU:
      • Intel i9 9900k
      • Memory:
      • 32GB DDR4 3200 CL16
      • Storage:
      • 1TB Samsung 970Evo+ NVMe
      • Graphics card(s):
      • nVidia GTX 1060 6GB
      • PSU:
      • Seasonic 600W
      • Case:
      • Cooler Master HAF 912
      • Operating System:
      • Win 10 Pro x64
      • Monitor(s):
      • Dell S2721DGF
      • Internet:
      • rubbish

    Re: Nvidia GTC 2022 Livestream

    I haven't watched the stream yet, but very disappointed at the 4080 model ambiguity. It doesn't do card users any good, it makes recommending specifications for games much more complex, and it looks like they took what was going to be a 4070 and rebranded it to attain a higher price point, as if model number alone was the sole justification for pricing.

Did they confirm FE for all three? Rumour had it the 4080 12GB was not getting an FE.

    Save us AMD, you're our only hope...

  3. Received thanks from:

    AGTDenton (23-09-2022),Iota (24-09-2022)

  4. #19
    RIP Peterb ik9000's Avatar
    Join Date
    Nov 2009
    Posts
    7,531
    Thanks
    1,767
    Thanked
    1,340 times in 1,003 posts
    • ik9000's system
      • Motherboard:
      • Asus P7H55-M/USB3
      • CPU:
      • i7-870, Prolimatech Megahalems, 2x Akasa Apache 120mm
      • Memory:
      • 4x4GB Corsair Vengeance 2133 11-11-11-27
      • Storage:
      • 2x256GB Samsung 840-Pro, 1TB Seagate 7200.12, 1TB Seagate ES.2
      • Graphics card(s):
      • Gigabyte GTX 460 1GB SuperOverClocked
      • PSU:
      • NZXT Hale 90 750w
      • Case:
      • BitFenix Survivor + Bitfenix spectre LED fans, LG BluRay R/W optical drive
      • Operating System:
      • Windows 7 Professional
      • Monitor(s):
      • Dell U2414h, U2311h 1920x1080
      • Internet:
      • 200Mb/s Fibre and 4G wifi

    Re: Nvidia GTC 2022 Livestream

    Quote Originally Posted by kalniel View Post
    I haven't watched the stream yet, but very disappointed at the 4080 model ambiguity. It doesn't do card users any good, it makes recommending specifications for games much more complex, and it looks like they took what was going to be a 4070 and rebranded it to attain a higher price point, as if model number alone was the sole justification for pricing.

Did they confirm FE for all three? Rumour had it the 4080 12GB was not getting an FE.

    Save us AMD, you're our only hope...
    I don't think AMD will rescue the situation. Have you seen the prices being put forward for x670 motherboards?

  5. #20
    Banhammer in peace PeterB kalniel's Avatar
    Join Date
    Aug 2005
    Posts
    30,924
    Thanks
    1,842
    Thanked
    3,352 times in 2,694 posts

    Re: Nvidia GTC 2022 Livestream

    Quote Originally Posted by ik9000 View Post
    I don't think AMD will rescue the situation. Have you seen the prices being put forward for x670 motherboards?
    I haven't, doesn't sound good

  6. #21
    root Member DanceswithUnix's Avatar
    Join Date
    Jan 2006
    Location
    In the middle of a core dump
    Posts
    12,617
    Thanks
    741
    Thanked
    1,480 times in 1,248 posts
    • DanceswithUnix's system
      • Motherboard:
      • Asus X470-PRO
      • CPU:
      • 3700X
      • Memory:
      • 32GB 3200MHz ECC
      • Storage:
      • 1TB Linux, 2TB Games (Win 10)
      • Graphics card(s):
      • Asus Strix RX Vega 56
      • PSU:
      • 650W Corsair TX
      • Case:
      • Antec 300
      • Operating System:
      • Fedora 35 + Win 10 Pro 64 (yuk)
      • Monitor(s):
      • Benq XL2730Z 1440p + Iiyama 27" 1440p
      • Internet:
      • Zen 80Mb/20Mb VDSL

    Re: Nvidia GTC 2022 Livestream

    Quote Originally Posted by ik9000 View Post
    I don't think AMD will rescue the situation. Have you seen the prices being put forward for x670 motherboards?
    Tom on a Moore's Law is Dead video seemed to think that the AMD cards would be way cheaper to make than Nvidia's cards, so there is some hope.

In fact, if Nvidia is sat on a mountain of 3060 and 3070 silicon and there is nothing similar at AMD, then AMD might be able to come in earlier and cheaper than Nvidia, which would be good for us consumers.

Tom did say the reason the 4070 was renamed the 4080 12GB is because the prices to manufacture were out of control.

  7. #22
    Senior Member
    Join Date
    Aug 2016
    Posts
    3,247
    Thanks
    811
    Thanked
    784 times in 577 posts

    Re: Nvidia GTC 2022 Livestream

    Quote Originally Posted by ik9000 View Post
    Oh re the 4070 we do know its date and pricing. November £949. They've called it the 4080 12gb for some reason despite it having a different chip and bandwidth etc. That reason one might speculate is to try and justify charging nearly £1000 for a xx70 model that has traditionally been the "affordable but still high end" sweet spot, and is now laughably not affordable. And this at a time of economic uncertainty and belt tightening. Greedy clowns.
    While I take the point you're making, and yeah, I can see the "really a 4070" logic in it, I do wonder.

    Is it misleading? In my opinion, oh yeah. Kinda. I mean the CUDA core count etc .... but I wonder what percentage of buyers will firstly notice, and secondly, care?

Wait, let me explain what I mean. I'm fairly technically minded and still a bit woolly on which cores do what. I also absolutely do not care, not one jot, what they call the card. 4080 12GB, 4070, 4070 Ti .... hell, they can call it my Aunt Sally for all I care, but I do care about what performance I get and what it costs me. That, really, is about all I use the model numbers for - a broad guide to that.

And we still have a very incomplete picture of overall performance. Okay, so RT performance looks stunning but .... to what extent do I care about RT? My answer (YMMV) is that it's a factor for sure, and maybe starting to come into its own, but it's really not my main focus. And given that there's so much we don't yet know about the spec, and zero independent benchmarking, the jury is out for me on whether calling it a 4080 12GB or 4070 is fairer, because for me, that will depend on relative performance.

All things said, though, the naming does indeed look cynical and misleading, so far. If I had to guess, I'd say they're sitting on a large pile of 3070 Ti and below, and we aren't going to see 40xx cards below the 4080 12GB until they're no longer sitting on it.

It may very well be that, a year from now, a 4070 looks like a better match for my needs and inclination to size of budget, and that a 3070 or 3080 will look like a bad idea, with 20/20 hindsight. But, dammit, we won't know 'til we see the spec and/or 3rd party benchmarks. But I do note that base and boost clocks even on the 4080 12GB compared to the 3080/3090 etc have me asking myself questions.

    On the one hand, a 3060 would probably give me what I need (but not what I want). On the other hand, I'm tempted to just say "aw, hell" and get a 4090 when they ship. And up the rating on the PSU in my build spec. It's only money. Who really needs it?

Given that I keep saying my intention is to build this next PC and for it to be the last time I ever do it, then aside from cost and power draw, the 4090 does make some sense in that it gives me the maximum likely length of time before the system starts to creak at the seams.

So, back to your quote .... I do get what you mean by the 4080 12GB being the 4070 but, it really isn't, if looked at in terms of what performance we get. We don't (yet) know quite what that'll actually look like for the 4080 (either of them) or 4090, though we will for the latter pretty soon. It could well be months before we know for an actual 4070, so we don't know what an actual 4070 will look like.

It is really starting to make me reconsider ruling out AMD GPUs.

It's bleeping annoying - now that a variety of card choice actually exists and is sitting on retail shelves for the first time in yonks, they're all moving the bleeping goal posts. And I thought I was confused before? At least it didn't much matter then, because virtually anything was impossible to buy at a half-sensible price.
A lesson learned from PeterB about dignity in adversity, so Peter, In Memoriam, "Onwards and Upwards".

  8. #23
    Banhammer in peace PeterB kalniel's Avatar
    Join Date
    Aug 2005
    Posts
    30,924
    Thanks
    1,842
    Thanked
    3,352 times in 2,694 posts

    Re: Nvidia GTC 2022 Livestream

    From the specs, it looks like the 4080 16GB will be in a different league to the 4080 12GB. Something like up to 25% I think I saw. For a card with the same model designation that's huge. Especially if they release a 4070 with performance far closer to the 4080 12GB than the 4080 is to the, er, other 4080.

Of course, now they could really cripple the 4070 (a 160-bit memory bus and 8x PCIe lanes might go some way to achieving that) and tell us that's what we asked for.

    Hence my hope for AMD to deliver a relatively efficient, non-crippled card that's actually value for money/watt. But probably only if you're paying in dollars.

    Hopefully, AMD will go after the 3000 series residuals which Nvidia are deliberately tiering above.
    Last edited by kalniel; 22-09-2022 at 08:43 PM.

  9. #24
    root Member DanceswithUnix's Avatar
    Join Date
    Jan 2006
    Location
    In the middle of a core dump
    Posts
    12,617
    Thanks
    741
    Thanked
    1,480 times in 1,248 posts

    Re: Nvidia GTC 2022 Livestream

    Quote Originally Posted by Saracen999 View Post
    While I take the point you're making, and yeah, I can see the "really a 4070" logic in it, I do wonder.

    Is it misleading? In my opinion, oh yeah. Kinda. I mean the CUDA core count etc .... but I wonder what percentage of buyers will firstly notice, and secondly, care?
If you go back and look at a review of the GTX 1070, it was about 85% of the performance of a GTX 1080 for 75% of the price (which was about £400 vs £600, and back then we thought that was expensive). Well, the 4080 12GB looks to be 75% of the price of the 4080 16GB. Performance? The 12GB model has 79% of the shaders of the 16GB model, which is a bit low, but it also has slightly higher clocks, which with my crude calculation gets you back to about 83% of the performance.
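That crude shaders-times-clocks estimate can be written out in a few lines. The spec figures below (shader counts and boost clocks for the two 4080 models, and US launch prices) are the commonly reported launch numbers, assumed for illustration rather than taken from this thread:

```python
# Crude relative-performance estimate: shader count * boost clock.
# All spec figures are assumed launch numbers, not independently verified.
shaders_12gb, boost_12gb = 7680, 2.61   # 4080 12GB, boost in GHz (assumed)
shaders_16gb, boost_16gb = 9728, 2.51   # 4080 16GB, boost in GHz (assumed)

shader_ratio = shaders_12gb / shaders_16gb   # roughly 0.79
clock_ratio = boost_12gb / boost_16gb        # slightly above 1
perf_ratio = shader_ratio * clock_ratio      # crude performance estimate

price_ratio = 899 / 1199  # assumed US launch MSRPs, roughly 0.75

print(f"estimated performance: {perf_ratio:.0%} for {price_ratio:.0%} of the price")
```

With these assumed inputs the estimate lands a touch above 80%, in the same ballpark as the figure quoted above; the small gap is just rounding in the inputs, and of course shaders-times-clocks ignores memory bandwidth and cache entirely.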

    So yeah, to me the 4080 12GB is a re-label of the 4070. The Moore's Law is Dead explanation that Nvidia didn't bother engineering for cost because recently they haven't had to would explain that. To me that sounds like they messed up, but every time I've thought that Nvidia seem to sell their cards like hot cakes so I'm biting my lip this time. I just know that they won't be selling to me personally at that price.

PC Gamer are somewhat less enthusiastic, and place it more in line with a 4060 Ti relative to the 4090:

    https://www.pcgamer.com/nvidia-rtx-40-series-let-down/

  10. Received thanks from:

    kalniel (23-09-2022)

  11. #25
    Banhammer in peace PeterB kalniel's Avatar
    Join Date
    Aug 2005
    Posts
    30,924
    Thanks
    1,842
    Thanked
    3,352 times in 2,694 posts

    Re: Nvidia GTC 2022 Livestream

    Quote Originally Posted by DanceswithUnix View Post
    PC Gamer are somewhat less enthusiastic, and place it more in line with a 4060ti vs the 4090:

    https://www.pcgamer.com/nvidia-rtx-40-series-let-down/
Smart article. They're highlighting the memory bus in particular - I remember being concerned my 1060 only had a 192-bit bus, and now they're putting a 4080 on the same.

    I hope AMD have bamboozled us all, though as that article says, Nvidia seem pretty confident they're safe.

  12. #26
    Senior Member
    Join Date
    Aug 2016
    Posts
    3,247
    Thanks
    811
    Thanked
    784 times in 577 posts

    Re: Nvidia GTC 2022 Livestream

My point was that most of us here will be looking at specs and benchmarks to decide, and either won't put much weight behind the product number, or are already aware of the (potential) differences. But a lot of gamers will care only about gaming performance, and relative performance in relation to relative cost.

    Until we see actual cards, including 4070 etc, we can't know what that is.

For those that will look at specs etc, the model name is pretty meaningless beyond simple marketing, which is why I said they can call it Aunt Sally if they want. I don't care what the model number is.

There is nothing written anywhere that says because the difference in 10xx was xx% between two models that 20xx has to be the same percentage, or that the price differential needs to be the same. Nor between 20xx and 30xx or, of course, between 30xx and 40xx. Most (or all) here are obviously going to look at what, say, a 3080 costs compared to a 4070, and the relative performance, and compare the two. And for a while, it's relevant. I'm certainly going to be doing it re: price and performance of 40xx cards .... when we see what that really is. But as of this moment, all we have to go on - which while certainly suggestive is not definitive - is little more than manufacturer-provided ray-tracing performance and what we might infer from clock speeds etc, not real-world game performance, independently tested.

Being a cynic, it would not surprise me if a company trying to sell product (or build anticipation in advance of release) was to cherry-pick what they choose to emphasise. It is, after all, marketing. And as we all know full well, how well a given card performs depends on a wide range of factors, and is considerably more nuanced than simple graphs suggest. For a start, which game? What does that game really demand - clock speed, loads of memory, RT grunt, is it CPU or GPU bound? And so on.

Model numbers are pretty meaningless, except in a very broad sense, and comparing generations is especially so. nVidia are no doubt trying to position cards in a way that'll make sense a couple of years from now, bearing in mind their plans for releases not yet announced. Will there be a 4070 Ti or not? Maybe they've decided not to do that again. Or not. And if they do, what's the 4070 to 4070 Ti performance and price differential? And so on.

It's certainly valid for buyers right now and, if speculation is correct, for several months to come, where a given 30xx card overlaps 40xx in both price and performance. But as stocks of 30xx disappear, and they will, it'll become less and less relevant, and nVidia have to plan for that rather more than for 30xx to 40xx comparisons. They are, after all, different generations.

    We don't know anywhere near enough, yet, to really know what future cards there'll be, how they compare or at what price differences. We don't even know what models there will be. We can guess, and those guesses may end up being right, but they are guesses.

    What I know is that any current or future buyers will have to compare what's available when they buy, and as with most things and pretty much all tech stuff, what each step up the model range gets you, and what it costs you. That will depend as much on the buyer's budget and what they're trying to achieve as anything, and certainly more than somewhat artificial partial performance figures, especially inferred ones from incomplete official specs.

That's my point. Not that the comparison to a potential 4070 is wrong, but that we don't know yet what the future extent of the range will and won't look like, or even how nVidia will choose to segment their range.

What I don't like (and I mean "don't like" in big, neon flashing letters) is the cynicism implied by highlighting the RAM on the two current 4080 models, and kinda omitting to mention (on packaging) "oh yeah, significantly curtailed hardware too". It's misleading at best, and arguably deceitful.

    BUT .... what marketing isn't?

We might reasonably expect a much fuller hardware spec to be right there, on the box, just like the 12GB and 16GB are. Otherwise the inference many will draw, not least 'cos of the 8GB v 10GB cards in the current gen, is that that is the only difference. And it seems it isn't. But what matters to those that aren't looking at clock speeds, core counts, DLSS version and so on is merely how it performs in their games, and what the cost difference is. Given that we don't yet know actual gaming performance, let alone in different types of games, we can't judge whether those two cards are better classified as 4080 variants, or whether 4070 is more apt, because what was used last (current 30xx) isn't necessarily the best yardstick for next gen.

It may be, for instance, that 3070 (or Ti) and lower cards stay available for much longer than we or reviewers expect, and that 40xx are really aimed at those looking for ray-tracing. If something like that happens, it'll move the product-range goalposts and different pricing may be applied. Maybe RT will become much more of a "must have" as (and if) future support in gaming goes up, and we'll see a GPU market fragmentation between relatively expensive RT (40xx) cards, and more raster-oriented but economical 30xx-type cards. Or not.

My point is we just don't yet know what nVidia have in mind, or if they're doing something that changes the way they want to spread their product range. I'm not predicting they are doing what I've suggested - frankly it's unlikely. We just don't know what they have in mind, or why, which is why I don't think the 4070 name matters much YET. When they release one (assuming they do), then is the time for deciding on whether it makes sense because, by then, the yardstick may well be different to what was right for 30xx and before.


Just to be clear - I'm not any sort of nVidia fan. The way Gigabyte behaved over 'exploding' PSUs has them on my blacklist, but .... there's a range of alternatives for my mobo, and I'd already settled on Seasonic for PSU. That 12GB and 16GB move is cynical enough that the ONLY reason I'm still even considering nVidia GPUs is the very restricted range of alternatives, and even then, I'm back thinking about AMD options, which I wasn't prior to this farce.

  13. #27
    root Member DanceswithUnix's Avatar
    Join Date
    Jan 2006
    Location
    In the middle of a core dump
    Posts
    12,617
    Thanks
    741
    Thanked
    1,480 times in 1,248 posts

    Re: Nvidia GTC 2022 Livestream

    Quote Originally Posted by kalniel View Post
    Smart article. They're highlighting the mem bus in particular - I remember being concerned my 1060 only had a 192bit bus and now they're putting a 4080 on the same..

    I hope AMD have bamboozled us all, though as that article says, Nvidia seem pretty confident they're safe.
If Nvidia have followed AMD's recent lead and stuck a lot of cache on there, then the 192-bit bus may well not be a problem.

AMD's next gen seems like a complete wildcard right now. It sounds like they have gone for something like chiplets, but stacked. A whole new way of packaging is an aggressive, high-risk way forward, but if it goes well then that could be one heck of a win for AMD, with similar cost benefits to CPU chiplets: improved yield rates and having the memory controller on an older, cheaper process than the expensive compute units. Fingers crossed they got it right, eh?

  14. #28
    Senior Member AGTDenton's Avatar
    Join Date
    Jun 2009
    Location
    Bracknell
    Posts
    2,283
    Thanks
    724
    Thanked
    616 times in 417 posts
    • AGTDenton's system
      • Motherboard:
      • ASUS P6T7 WS Supercomputer
      • CPU:
      • Intel Core i7 980
      • Memory:
      • 24GB Corsair Dominator GT
      • Storage:
      • Samsung 860 Pro + HDDs
      • Graphics card(s):
      • Asus 1030
      • PSU:
      • Seasonic X-850W
      • Case:
      • Fractal Design R3
      • Operating System:
      • 10 Pro x64
      • Internet:
      • 70MB using BT line

    Re: Nvidia GTC 2022 Livestream

This was quite a good article: AMD and Intel, if they sort themselves out, do not need to compete with the 4090

    https://www.digitaltrends.com/comput...ertake-nvidia/

  15. #29
    Senior Member
    Join Date
    Aug 2016
    Posts
    3,247
    Thanks
    811
    Thanked
    784 times in 577 posts

    Re: Nvidia GTC 2022 Livestream

    Quote Originally Posted by DanceswithUnix View Post
If Nvidia have followed AMD's recent lead and stuck a lot of cache on there, then the 192-bit bus may well not be a problem.

AMD's next gen seems like a complete wildcard right now. It sounds like they have gone for something like chiplets, but stacked. A whole new way of packaging is an aggressive, high-risk way forward, but if it goes well then that could be one heck of a win for AMD, with similar cost benefits to CPU chiplets: improved yield rates and having the memory controller on an older, cheaper process than the expensive compute units. Fingers crossed they got it right, eh?
    It could indeed, and yup, fingers crossed.

    If anything they come up with is a strong competitor to team Green, it'll at least go some way to keeping Green 'honest', and maybe they can do to Green what Ryzen did to Intel. The competition needs to be in more than just hardware though. And nVidia seem to now have AV1. But even on 30xx, I had about half of one eye on video encoding - it isn't just a gaming card for me.

    That said, even the mobile versions of 5900X and 3080 on this laptop blow my previous machine clear out of the water and, truth be known, out of the solar system too. Not that my old machine was slow, but I suspect one of the two hamsters on the wheel may have croaked.
