How does NVIDIA's Kepler perform on three screens?
It seems most of the games cannot hit 30FPS on average with either card!
It looks like the dual GPU cards will be the only ones really capable of smooth gameplay using three monitors.
Eyefinity can easily be switched between 1, 2 or 3 screens with a single keypress once you've added a preset, which takes all of 15 seconds to set up - so yep, the idle power draw is meaningless for 3 screens (although I did notice it was in the Nvidia reviewers' guide). I read that it's much more fiddly with the GeForce cards, i.e. it has to be set up manually every time. Is this true?
I don't think that's true, actually - a lot of three-screen usage is for productivity purposes, during which the graphics card will, to all intents and purposes, be idle. The real question for me is whether the AMD card has a similarly high power draw when the three screens are arranged as an extended desktop, or just when they're set as an Eyefinity surface (i.e. an apples-to-apples comparison). If Nvidia really do offer a significant power saving when the card is idle - i.e. for three-screen productivity usage - that's quite a selling point to the right market.
All we need is for NV to bring that tech to the low-end market (which they won't be doing in this generation, of course). A discrete card, passively cooled, with 3x HDMI and capable of driving three monitors off passive dongles, could be quite appealing...
Out of interest has anyone confirmed the following though:
http://www.techpowerup.com/162504/NV...-Detailed.html
"The new 3D Vision Surround is said to work in conjunction with Adaptive V-Sync to ensure the center display has higher frame-rate (since it's at the focus of your central vision), at the expense of the frame-rates of the two side displays (since they're mostly at your peripheral vision). This ensures there's a balanced, high-performance experience with multi-monitor gaming setups."
Yeah, I read about that before. Without the marketing spiel it's basically saying "we cut the framerate to the peripheral displays to make our card look better". I think it's a pretty bad idea for many of the games which use peripheral displays, as some have more movement on them than the main display (looking out of the side of a car, for example). But for some games it might be OK.
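Nobody outside Nvidia has confirmed exactly how (or whether) this works, but the quote boils down to the side displays skipping presents while the centre keeps the full rate. A minimal Python sketch of that idea - the 2:1 ratio and every name here are my assumptions, not Nvidia's actual algorithm:

Code:
# Hypothetical illustration of "centre display gets priority" frame pacing.
SIDE_DIVISOR = 2  # assumption: side displays only get a new frame every 2nd flip

def flip(display, frame_index):
    print(f"frame {frame_index}: presenting to {display}")

def present_frame(frame_index, displays=("left", "centre", "right")):
    for d in displays:
        if d == "centre":
            flip(d, frame_index)               # centre always updates (e.g. 60 FPS)
        elif frame_index % SIDE_DIVISOR == 0:
            flip(d, frame_index)               # sides effectively run at 30 FPS
        # otherwise a side panel just keeps showing its previous frame

for i in range(4):
    present_frame(i)

If it works anything like this, the side panels holding their previous frame is exactly why fast sideways movement (the car-window case above) would judder.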
edit: Thanks Hexus for including the FPS/time graphs (might help to label the X-axis as 'time')
Is a CF/SLI bench coming for the same tests and resolutions soon?
I'm not sure I agree here actually - unless I misunderstand you, AMD pretends there's only one big display, whereas NVIDIA uses the 'traditional' method of extending your display. So if you maximise a program, it'll get split over three screens - text included - which sounds like a nightmare. Whereas still having three discrete displays lets you use Windows 7's display-management features (like Win key + right) to manage the space more effectively. There's no gaming penalty when following NVIDIA's path, but folk who want genuine three-screen real estate for 2D productivity are best served by AMD's tech, we feel.
I'm not sure how it works on the 680, but I assumed it was the same as on AMD cards. I don't quite understand Tarinder's comment about AMD being better for 2D productivity either. Maybe that needs clearing up a bit.
Anyway, on AMD cards you can run an SLS (single large surface, better known as Eyefinity) which spans 2-6 screens. You can also use each screen individually by extending the displays for 2D work.
I have mine set up with Ctrl-Shift-Alt-1, -2 and -3 for 2D and Ctrl-Shift-Alt-E for Eyefinity. That's all it takes: one keypress and it switches between modes.
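For anyone unclear on the practical difference being discussed, a rough Python sketch of the geometry (panel sizes assumed to be 1920x1080; this is just an illustration, not vendor code):

Code:
PANEL_W, PANEL_H, PANELS = 1920, 1080, 3

def maximise_on_sls():
    # SLS/Eyefinity: Windows sees ONE 5760x1080 monitor, so a maximised
    # window spans all three panels (and the bezels).
    return (0, 0, PANEL_W * PANELS, PANEL_H)

def maximise_on_extended(panel_index):
    # Extended desktop: Windows sees three separate monitors, so a
    # maximised window fills only the panel it is on.
    return (panel_index * PANEL_W, 0, PANEL_W, PANEL_H)

print(maximise_on_sls())        # (0, 0, 5760, 1080)
print(maximise_on_extended(1))  # (1920, 0, 1920, 1080) - stays on the centre panel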
Batman : Crysis 2 ????
I'm sure I haven't played that one!
Typo on page 5
"Hmm, one would expect the GTX 680 to lose more ground at 5,760x1,600." should say x1,080.
That aside, it would be nice to see these benchmarks without such high AA settings, as I'm guessing most users would drop the AA level to get a comfortable FPS.
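For reference, the arithmetic behind the correction, plus a rough feel for why heavy AA hurts at this resolution (treating shading cost as simply proportional to pixel count, which is a simplification):

Code:
panels, w, h = 3, 1920, 1080
surround_w = panels * w            # 3 x 1920 = 5760
print(f"{surround_w}x{h}")         # 5760x1080 - the resolution actually tested

surround_px = surround_w * h       # ~6.2 million pixels
dell30_px   = 2560 * 1600          # ~4.1 million pixels on a single 30" panel
print(round(surround_px / dell30_px, 2))   # ~1.52x the pixels, before AA is added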
Good review, and it's nice to see you guys using a line graph as well - extremely helpful! As others have mentioned, though, did you look into this new Nvidia system? Because if it actually is in effect it could mean something like 60 FPS for the main monitor and a 30 FPS limit for any other monitor... if that's the case then that is surely why AMD has lost the edge in a lot of games.
I don't like that idea at all; Nvidia's approach of cutting performance on the other displays is just bad - in some games you actually need all three monitors! If you are playing FPS games then you want everything to be 100% fluid to get a proper surround experience, and I'm pretty sure I'd notice the drop in frames as the other panels would be lagging.
If you can disable it then that's OK, and I'd recommend ensuring it is disabled in benches.
As above, it would be good to have confirmation that Nvidia's Adaptive V-Sync was definitely off!
Interesting results too, though I'd hazard a guess that most people with 3-screen setups have the money for at least SLI/CrossFire?
Good to see 4K resolution will be doable with GPUs as they are now, mind... if the screen industry ever gets a move on with higher-res panels!
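A quick back-of-the-envelope comparison on that point (assuming "4K" means a single 3840x2160 UHD panel):

Code:
surround = 5760 * 1080   # ~6.2 million pixels across three 1080p panels
uhd_4k   = 3840 * 2160   # ~8.3 million pixels on a single 4K panel
print(round(uhd_4k / surround, 2))   # ~1.33 - only about a third more pixels than surround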
To me both cards look totally awesome; on a single 24in 1920x1200 monitor either card will work for me. My question is: would there be any noticeable difference when running an AMD or Intel CPU?
My thinking is that there shouldn't but wanted some reassurance please!
Good cards, but prohibitively expensive.