How does NVIDIA's Kepler perform on three screens?
It seems most of the games cannot hit 30FPS on average with either card!
It looks like the dual GPU cards will be the only ones really capable of smooth gameplay using three monitors.
Eyefinity can easily be switched between 1, 2 or 3 screens with a single keypress after setting up an "add preset", which takes all of 15 seconds - so yep, the idle power draw is meaningless for 3 screens (although I did notice it was in the Nvidia reviewers' guide). I've read that it's much more fiddly with the GeForce cards, i.e. it has to be set up manually every time. Is this true?
All we need is for NV to bring that tech to the low-end market (which they won't be doing in this generation, of course). A discrete card, passively cooled, with 3x HDMI and capable of driving three monitors off passive dongles, could be quite appealing...
Out of interest, has anyone confirmed the following, though:
"The new 3D Vision Surround is said to work in conjunction with Adaptive V-Sync to ensure the center display has higher frame-rate (since it's at the focus of your central vision), at the expense of the frame-rates of the two side displays (since they're mostly at your peripheral vision). This ensures there's a balanced, high-performance experience with multi-monitor gaming setups."
Yeah, I read about that before. Without the marketing spiel it's basically saying: we cut the frame rate to the peripheral displays to make our card look better. I think it's a pretty bad idea for many of the games which use peripheral displays, as some have more movement on them than the main display (looking out of the side of a car, for example). But for some games it might be OK.
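To illustrate what that claim would mean in practice, here's a rough sketch in Python (purely my own illustration - the cap value and the scheduling are assumptions, not anything from NVIDIA's drivers):

[code]
# Illustrative sketch only -- not NVIDIA's actual driver logic.
# Shows the rumoured idea: the centre display gets the panel's full
# refresh rate, while the two side displays are capped lower to save
# GPU time for the screen at the focus of your vision.

REFRESH_HZ = 60     # panel refresh rate
SIDE_CAP_HZ = 30    # hypothetical cap for the peripheral displays

def displays_to_render(frame_index):
    """Return which of the three displays get a fresh frame this tick."""
    step = REFRESH_HZ // SIDE_CAP_HZ   # render the sides every Nth frame
    targets = ["centre"]
    if frame_index % step == 0:
        targets += ["left", "right"]
    return targets

for frame in range(4):
    print(frame, displays_to_render(frame))
# 0 ['centre', 'left', 'right']
# 1 ['centre']
# 2 ['centre', 'left', 'right']
# 3 ['centre']
[/code]

On a 60Hz setup that would leave the side panels visibly updating at half rate, which is exactly the problem with fast side-window movement.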
edit: Thanks Hexus for including the FPS/time graphs (might help to label the X-axis as 'time')
Is a CF/SLI bench coming for the same tests and resolutions soon?
I'm not sure I agree here, actually - unless I misunderstand you, AMD pretends there's only one big display, whereas NVIDIA uses the 'traditional' method of extending your desktop. So if you maximise a program, it gets split across three screens - text included - which sounds like a nightmare. Still having three discrete displays, on the other hand, lets you use Windows 7's display-management features (like Win key + right) to manage the space more effectively. There's no gaming penalty when following NVIDIA's path, but folk who want genuine three-screen real estate for 2D productivity are best served by AMD's tech, we feel.
Anyway, on AMD cards you can run an SLS (single large surface, better known as Eyefinity) which spans 2-6 screens. You can also use each screen individually by extending the displays for 2D work.
I have mine set up with Ctrl-Shift-Alt-1, -2 and -3 for 2D and Ctrl-Shift-Alt-E for Eyefinity. That's all it takes: a keypress, and it switches between modes.
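For anyone who hasn't seen the two modes side by side, here's a rough sketch of the difference (purely illustrative, using the obvious 3x 1080p case - the numbers aren't tied to any particular card):

[code]
# Rough sketch of the two modes described above (illustrative only).
# Extended mode: the OS sees three separate desktops.
# SLS / Eyefinity: the OS (and the game) sees one wide surface.

PANEL_W, PANEL_H, PANELS = 1920, 1080, 3

extended = [(PANEL_W, PANEL_H)] * PANELS   # three independent displays
sls = (PANEL_W * PANELS, PANEL_H)          # one spanned surface

def panel_viewports(surface_w, surface_h, panels):
    """Split the spanned surface into the per-panel regions that
    get scanned out to each monitor."""
    w = surface_w // panels
    return [(i * w, 0, w, surface_h) for i in range(panels)]

print(sls)                            # (5760, 1080)
print(panel_viewports(*sls, PANELS))
# [(0, 0, 1920, 1080), (1920, 0, 1920, 1080), (3840, 0, 1920, 1080)]
[/code]

The game only ever sees the spanned surface, which is why it renders across all three panels without needing to know they're separate monitors.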
Typo on page 5
"Hmm, one would expect the GTX 680 to lose more ground at 5,760x1,600." should say x1,080.
That aside, it would be nice to see these benchmarks without such high AA settings, as I'm guessing most users would drop the AA level to get a comfortable FPS.
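To put some rough numbers on why the AA level hurts so much at this resolution - back-of-envelope only, using MSAA sample count as a crude proxy for fill-rate and memory cost:

[code]
# Back-of-envelope: why dropping AA matters at 5,760x1,080.
# MSAA stores roughly N colour/depth samples per pixel, so the total
# sample count scales linearly with the AA level. Illustrative only.

pixels = 5760 * 1080   # ~6.2 million pixels across the three panels
for msaa in (1, 2, 4, 8):
    print(f"{msaa}x MSAA: {pixels * msaa / 1e6:.1f}M samples per frame")
# 1x MSAA: 6.2M samples per frame
# 2x MSAA: 12.4M samples per frame
# 4x MSAA: 24.9M samples per frame
# 8x MSAA: 49.8M samples per frame
[/code]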
Good review, and it's nice to see you guys using a line graph as well - extremely helpful! As others have mentioned, though, did you look into this new NVIDIA system? Because if it actually is in effect, it could be something like 60FPS for the main monitor and a 30FPS limit for the others... and if that's the case, then that is surely why AMD lost the edge in a lot of games.
I don't like that idea at all; NVIDIA's approach of cutting performance on the other displays is just bad - in some games you actually need all 3 monitors! If you are playing FPS games then you want it all to be 100% fluid for a nice surround experience, and I'm pretty sure I'd notice the drop in frames as the other panels would be lagging.
If you can disable it then that's OK, and I'd recommend ensuring it is disabled in benches.
As above, it would be good to have confirmation that NVIDIA's Adaptive V-Sync was definitely off!
Interesting results too, though I'd hazard a guess that most people with 3-screen setups have the money for at least SLI/CrossFire?
Good to see that 4K resolution will be doable with GPUs as they are now, mind... if the screen industry ever gets a move on with higher-res panels!
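Quick back-of-envelope on the 4K point (pixel counts only, assuming a 3840x2160 panel):

[code]
# Pixel-count comparison behind the 4K remark above (illustrative).
res_4k     = 3840 * 2160   # ~8.3 MP -- a common "4K" panel resolution
res_triple = 5760 * 1080   # ~6.2 MP -- three 1080p panels side by side

print(f"4K: {res_4k / 1e6:.1f} MP, triple 1080p: {res_triple / 1e6:.1f} MP")
print(f"4K is {res_4k / res_triple:.2f}x the pixels of triple 1080p")
# 4K: 8.3 MP, triple 1080p: 6.2 MP
# 4K is 1.33x the pixels of triple 1080p
[/code]

So 4K is only about a third more pixels than these triple-screen tests - the same ballpark, which is presumably the basis for the remark.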
To me both cards look totally awesome; on a single 24in 1920x1200 monitor either card will work for me. My question is: would there be any noticeable difference when running an AMD or Intel CPU?
My thinking is that there shouldn't be, but I wanted some reassurance please!
Good cards, but prohibitively expensive.