Quote:
We've seen the GTS 450 card from NVIDIA. Now see what it can do in multi-GPU SLI.
Nice read. NVIDIA definitely seems to have the upper hand in multi-GPU setups, but generally that only brings them to parity with AMD overall.
I've used both SLI and CrossFire in the past and I won't do it again because the issues are too annoying, but for anybody who is thinking about trying it out, NVIDIA isn't too bad.
That aside, the 450 is poor by itself.
£200 gets you a 5850 - which quite honestly spanks the GTS 450 in SLI.
I agree a 450 in SLI, or even the rumoured dual-chip 450, is a waste and just a means for NVIDIA to generate profit, which they desperately need. What I was hoping for, and which I believe would be a killer card, would be a dual-chip 460. Trouble is, NVIDIA would NEVER bring it out, as it would leave the 480 dead in the water!
Honestly, you wouldn't even manage two stock-clocked 460s on a single board, such is the power draw. As most of the 460s being reviewed are highly overclocked parts, I think most people would be very disappointed by a dual 460, which would probably carry an underclock like the 5970 has.
Sorry to be the pedantic one (again), but you keep calling COD:MW2 Call of Duty 2. If anything it should just be called Modern Warfare 2.
One question - why do none of the tests have any level of AA on them? I'm trying to do an apples-to-apples comparison with the 5830 / GTX 460 reviews.
Including the 5850 would have been logical - at the price of two 450s it sits above a 1GB GTX 460 anyway. The 5850 is its real competitor, I would have thought.
Hi,
The test subsystem is different for the mid-range setup than it is for the high-end cards. We use a more real-world rig that reflects the likely performance from, say, a £600 machine. Throwing in a Core i7 980X or 965 EE, while valid enough, makes it more artificial.
There's also little point in whacking the details right up - as used on the high-end rigs - so that a Radeon HD 5670 card produces 5.2fps at 1,920x1,080. This doesn't tell us much and limits the usefulness of evaluating multi-GPU systems, insofar as any scaling may be compromised by framebuffer issues.
So a compromise within the limitations - use AA and AF at lower resolutions, say 1,680x1,050 or so, perfectly `doable` on a mid-range machine.
GTX 460 EOL'd?
What's it being replaced by?
Full `power` GF104 core, as a good guess ;)
I have been largely a huge fan of NVIDIA. I built three PCs using NVIDIA chipsets, NVIDIA graphics and AMD CPUs, and was very happy with the results.
My thinking then was that their discrete GPUs should be optimised for their chipsets.
Unfortunately that is no longer the case, as Intel has refused to license NVIDIA to produce chipsets for Sandy Bridge. Furthermore, Sandy Bridge and AMD Fusion will eliminate the need for mid-range discrete graphics. The high end will still be there, but on a PCI bus, and SLI support for Intel Sandy Bridge has not been made clear.
So the only market that remains is legacy, and that is only good for maybe a year for the gamer and enthusiast user. And those folks are both probably waiting to build out this spring.
One other nail in the coffin for discrete graphics lies in the AMD-Intel agreement, whereby the PCI bus may only be supported for (now) five more years. Intel, out of the goodness of its heart (what heart?), may choose to extend this arrangement, but if they do choose to discontinue PCI they are still obliged by the agreement to provide x86 compiler support for the new bus to their competition. Is NVIDIA their competition now that NVIDIA chipsets are a thing of the past?
So the question is: is PCI still necessary, or is it doomed in five years? And just what kind of future remains for discrete graphics in a second-generation APU and graphics-enabled CPU world?
It is becoming clearer that NVIDIA and ARM are getting cosier.