How about to save energy costs you max out your GPU and turn your central heating off ?
Let's hope the EU doesn't cotton on to this and cap the wattage like they did with vacuum cleaners, limited to 900 W and below 80 dB.
To be honest, vacuums are rubbish now and their cost has just risen too! Dysons are a joke.
Still, to cap them I guess there has to be stock first, so... the computer community should be fine.
I had a worse one: this Gigabyte HD 7950 (the most expensive card I ever bought, at ~£230 or so) had the voltage set at 1.25V (high for Tahiti, as the 7970 GHz editions used that voltage), but between the review and my card they had done a switcheroo and on mine you couldn't set the voltage anymore. AFAIR setting negative power limits also didn't work. What eventually did work was a custom BIOS. The card ran perfectly well at 1.05V and saved 20-25% power with no noticeable loss in performance. Been off Gigabyte ever since.
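For a bit of back-of-envelope intuition on why that undervolt saved so much (a rough sketch, not an exact model: dynamic power scales roughly with voltage squared at fixed clocks, and static and fan power mean real savings come in a bit lower):

```python
# Rough check: dynamic power scales roughly with V^2 at a fixed clock.
# Voltages are the stock and undervolted figures from the post above.
v_stock = 1.25
v_undervolt = 1.05

ratio = (v_undervolt / v_stock) ** 2
print(f"power ratio: {ratio:.2f}, ~{(1 - ratio) * 100:.0f}% dynamic power saved")
# -> power ratio: 0.71, ~29% dynamic power saved
```

That theoretical ~29% on the dynamic portion lines up plausibly with the 20-25% measured at the wall once you account for the parts of board power that don't scale with core voltage.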
My advice is to find an old, broken DC-01 that is being sold as GPO or dumped. These were built like tanks and are a pleasure to work on. Ours is still going strong having had multiple repairs. Parts are still available as well and, when repaired, they work as well as when they were new. I would avoid the newer Dysons, as they are made for easy assembly and are awful to open up. They have little plastic tabs on them which are so easy to break. The DC-01 has screws and just comes apart, with seriously simple components that could be swapped out for any suitable equivalent as there's no silly computer-controlled electronicals.
With a little work and some consumables (like filters), those DC-01s will outlast you with ease. Even the motors are designed to be refurbished.
As for this 600W connector: I remember when the Vega 64 was mocked for having a TBP of 250W. This may be acceptable for a small subset of extremely high end cards where the board partners want overclocking room and then some.
We recently got past this silly power nonsense and people were running higher end systems on 600W PSUs with plenty of room to spare. Now we're looking at needing 750W+ again? Is this because there's a desperate need to justify the extreme high end and so they're pushing the top 5% of binned parts to silly extremes? Or is it because they are wanting to use more of the chips and they're getting the lower quality chips to perform by throwing power at the problem?
I suppose you have to set power specs in firmware based on the poorest performing chips assigned to that SKU, so if you're extending from using 65% of the bell curve to 75%, you have to set all of the chips up so that extra (poorer) 10% will function reliably? And that means moar pixies.
Last edited by philehidiot; 12-10-2021 at 07:02 PM. Reason: I said that a vacuum cleaner has no electronics. I'm an idiot.
I assumed it was just trying to get decent framerates at 4K resolutions in modern games. Those ultrawide monitors are getting quite popular as well. If you can see a wider field, you have to render all the scenery, NPCs, etc. that occupy that space and shade all the extra pixels. If you can render 1080p with 150W, well, 4 times the pixels needing 600W doesn't seem impossible (though I would have thought it would scale better than that).
That's another reason I stuck with 1440p for now.
I went for 4K, knowing it'd scale perfectly into 1080P when the GPU got older.
From a productivity POV, I'm glad I did. From a "I'm not buying another GPU in this market", I'm really glad I did.
A year or so back I was regretting getting rid of the 1440P screen. Now I'm kinda happy I can drop down to 1080P and still get decent FPS on newer games. That being said, I've only just bought Doom (2016).