Saracen999 (24-10-2022)
Well, good points.
To be honest, 1) is probaby more relevant to me than 2). I doubt I'm demanding enough of a user to much notice. On the other hand ... 3000 drivers are certainly much more mature than 4000. Though, I could make the same observation of 7000 v 5000 CPUs, 13th Gen v 12th or even 10th, 11th. I mean, .... ARC cards??
Possibly not a deal breaker, but it's always a concern of mine as someone who tends to hand down GPUs to others in the household, so they get long use around here. We are 2 years into support of the 3000 series at this point. Nvidia have a reputation of only really bothering to optimise drivers for their latest cards, but with a 3080 you probably have enough grunt that losing a few percent would go unnoticed. The current game-ready driver supports the 980 cards, which came out 8 years ago, so maybe they are making an effort to support cards further back these days. Still, you might have 6 years of support on a 3000 series, or you might have 3.
But then, from your comment near the start of this thread that you are on a Q6600-era CPU, that would put you on around a 9800GT GPU. The integrated graphics on a 7900X is about twice as powerful as that, and that's with some cores disabled, so maybe you can't go too far wrong. The 4090 is just nuts.
Yeah, noticed that, but it's not entirely clear yet why. Jayz just did a video on it and hypothesized that, at least for the user pictures he had, it might have been that the 12-pin connector had been bent a bit too tightly, stressing the cable and pulling it out a bit, meaning it wasn't a full-body (as it were) connection. And we all know that a dodgy connection => increased resistance, and if it's pulling a lot of current, that can translate into heat.
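To put some rough numbers on that dodgy-connection => heat chain, here's a quick back-of-envelope sketch in Python. The pin count matches the 12VHPWR plug's six 12V pins, but the current and contact-resistance figures are illustrative assumptions, not measurements:

```python
# Back-of-envelope: heat dissipated inside a power connector.
# P = I^2 * R per pin -- a poor contact raises R, and heat scales with it.
# All figures below are illustrative assumptions, not measured values.

def connector_heat_watts(total_current_a: float, num_pins: int,
                         contact_resistance_ohm: float) -> float:
    """Total heat generated across the connector's power pins."""
    current_per_pin = total_current_a / num_pins
    return num_pins * (current_per_pin ** 2) * contact_resistance_ohm

# A ~450 W card at 12 V draws roughly 37.5 A across six 12 V pins.
good = connector_heat_watts(37.5, 6, 0.005)  # ~5 mOhm: well-seated contact
bad = connector_heat_watts(37.5, 6, 0.05)    # ~50 mOhm: bent/partially seated

print(f"well seated:  {good:.1f} W of heat")  # ~1.2 W, spread over the plug
print(f"poor contact: {bad:.1f} W of heat")   # ~11.7 W, concentrated at the fault
```

A tenfold jump in contact resistance takes you from harmless warmth to enough localised heat to soften the connector housing, which would line up with the melting people are reporting.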
My personal feeling is that, for now, the evidence of actual card problems is a bit tenuous - it might turn out to be a real thing, but it might turn out to be some sort of user error.
While part of me says, usually while in a WTF mood, to just do it, it would, in fact, be pretty silly to go 4090. I don't have any real need for that kind of (performance) power, and the current draw is a right put-off .... as is the price. But maybe even more than that are the physical aspects of the card design, and the logistics of just getting something that humongous in the damn case.
So yeah, getting one is kind-of a whim, an impulse, but it'd be stupid to indulge it. A 3080 would do, and realistically a 3070 probably would too, and it may come down to how much on-card memory I'm prepared to settle for without incurring 4090-esque implications. If the (not unlaunched) 4080 was physically available, and benchmarked, then .... maybe.
TBH, I can't even remember what the GPU in that Q6600-era system is, but I'm 95% sure it was Sapphire, which IIRC implies Radeon. Whatever it is, anything current is gonna blow its socks off, BUT this system is more about a desktop version of this 'ere laptop which, albeit mobile variants, is 5900X, 3080 etc, so a better comparison would be with that than the Q6600-era machine, which is all but redundant. My day-to-day usage has shifted to this, and the data that was either on the Q6600 or my even older 'proper' server (believe it or not, dual Celeron 550 with a hardware SCSI RAID setup) has all been moved to the 4x12TB NAS. The size of that is down to digitising just about all my media, including LPs, C90 tapes and, of course, CDs, but a shedload of films and TV series on DVD too.
It's all change round here. I'm getting rid of most of the old machines entirely. I want the space back. I have some legacy hardware still that I don't want to lose (until it dies) but a fair bit got retired when I did. So, three "offices" in this house are getting trimmed to one, which will double up as a games room, tech toy room, etc .... and maybe convert the old now-empty boiler room into a (very) small server room, not just housing gardening tools!
Because it's quite clear what is happening. I argued with many people here about what AMD/Nvidia (not so much Intel, weirdly enough) were doing during the pandemic, jacking up pricing a lot. I said it was only going to go one way.
Now things have improved, they want to increase margins even more this year, so they are trying to push the hardware as much as possible - basically we are seeing overclocked CPUs and dGPUs being sold as reference. This has come at the expense of greater and greater power consumption and, in the case of dGPUs, Nvidia can take more business away from its own AIB partners. This is why EVGA decided not to bother anymore. Even the motherboards are overpriced, because they are designed to have to push a lot of power for the CPU.
On the lower end, Nvidia can push smaller-than-normal dies, overclocked to the edge. Considering the environment and the energy crisis in many countries, it really seems these companies are losing touch with reality.
Some of the PSU expectations are getting silly just to handle the transient power peaks!
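To illustrate why the PSU recommendations balloon, here's a minimal sketch of sizing for the worst-case millisecond spike rather than the average draw. The transient multipliers are ballpark figures of the kind reviewers have reported, not vendor specs:

```python
# Rough PSU sizing once transient spikes are factored in.
# Multipliers are ballpark assumptions, not vendor specifications.

def recommended_psu_watts(gpu_tdp_w: float, rest_of_system_w: float,
                          transient_multiplier: float,
                          headroom: float = 1.1) -> float:
    """Size the PSU for the worst-case millisecond spike, not average draw."""
    peak_w = gpu_tdp_w * transient_multiplier + rest_of_system_w
    return peak_w * headroom

# 3000-series transients were reported at roughly 2x TDP for milliseconds.
print(recommended_psu_watts(450, 250, 2.0))  # ~1265 W -- the "silly" number
print(recommended_psu_watts(450, 250, 1.3))  # ~919 W -- if spikes really are tamer
```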
Aren't transients looking a lot lower this gen (compared to the 3000 series at least)? I thought Hardware Unboxed gave pretty nice scores in that respect.
Evaluating every component in isolation is ruining the industry for CPUs. There are no games that won't run well on a 5700X, or a 5800X3D if you want to push the boat out - but if you measure the CPU in isolation, a 13th gen core or 7X00 chip will get 5% more at 1080p (and 120fps) so obviously any self-respecting gamer needs to spend the extra on the new platform (add a 360mm AIO and a £500 XTREME OC board so it never throttles of course - everyone knows if your CPU doesn't turbo to the max all the time you're an idiot that doesn't know how to build a PC).
Evaluating things in isolation can be useful, but only when you also step back and contextualise the results in the complete system. That means realising that 99.99% of people don't have 4090s, and with less than a 4090, at a sensible resolution and image-quality settings, all half-decent modern CPUs perform the same.
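The bottleneck point is easy to see with a toy model: to a first approximation, frame rate is capped by whichever of the CPU or GPU is slower. The figures below are made up purely for illustration:

```python
# Toy model: the slower of CPU and GPU sets the effective frame rate.

def effective_fps(cpu_fps: float, gpu_fps: float) -> float:
    """Crude first-order approximation of a game's frame rate."""
    return min(cpu_fps, gpu_fps)

# Isolated CPU test (4090 at 1080p): the 5% CPU gap is fully visible.
print(effective_fps(120, 500), "vs", effective_fps(126, 500))  # 120 vs 126

# Realistic rig (mid-range GPU, sensible settings): the gap vanishes.
print(effective_fps(120, 90), "vs", effective_fps(126, 90))    # 90 vs 90
```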