I wish they would benchmark at 1680x1050, seeing as a lot of gamers now use widescreen monitors.
"Reality is what it is, not what you want it to be." Frank Zappa. ----------- "The invisible and the non-existent look very much alike." Huang Po.----------- "A drowsy line of wasted time bathes my open mind", - Ride.
Interesting thing in that 2nd link:
1024x768:
GeForce 8600 GT 256MB: 40 fps
GeForce 7900 GS 256MB: 38 fps
An 8600 GT beating a 7900 GS.
Not entirely unexpected in that case. The 8600 does have superior shader hardware. Generally, I would still consider the 7900GS a better buy.
BioShock's activation nonsense and AA troubles are major turn-offs for me.
I looked into this and found that those pics are wrong. If you have DX10 hardware and run BioShock in DX9 with DX10 detail surfaces off, you still get those water ripples - running on XP, or on Vista with DX9-only hardware, it actually looks like:
Compared to:
TweakGuides sums it up:
"The second set of screenshots above highlights the difference in the details between DX9 and DX10 for water effects. The three screenshots all show water ripples from the character walking backwards through a puddle of water. The XP DX9 image shows generic 'splash marks' in the wake, while both Vista DX9 and Vista DX10 show accurate DX10 ripple physics in action.
Importantly, this highlights an oddity which currently occurs with BioShock: it appears that if your graphics card is DX10-capable, then BioShock uses full DX10 effects even when you have the 'DirectX 10 Detail Surfaces' set to Off. This explains why Antialiasing is not possible in Vista DX9 mode, while it is in XP DX9 mode. It also explains why performance comparisons between Vista DX9 and DX10 mode which demonstrate that DX10 has minimal image quality or performance impact (e.g. in this article) are not entirely correct, because under Vista BioShock will always run in DX10 mode if it detects a DX10 graphics card."
-Copyright © 2007 Koroush Ghazi
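The behaviour TweakGuides describes can be sketched as a tiny decision function. This is purely an inference from the observed screenshots and AA behaviour, not BioShock's actual code, and the function and parameter names are hypothetical:

```python
# Sketch of the render-path selection BioShock appears to use,
# inferred from observed behaviour (hypothetical names throughout).
def select_render_path(gpu_supports_dx10: bool, os_is_vista: bool,
                       dx10_detail_surfaces: bool) -> str:
    # The in-game "DirectX 10 Detail Surfaces" toggle apparently does
    # NOT control path selection: a DX10 card on Vista always gets the
    # DX10 path, which is why AA can't be enabled in "Vista DX9" mode.
    if gpu_supports_dx10 and os_is_vista:
        return "DX10"
    return "DX9"
```

So `select_render_path(True, True, False)` would still come back as "DX10", matching the Vista "DX9" screenshots that show full DX10 ripple physics.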
Also, I wouldn't feel sorry for 2900 XT users just yet, djfluff - the Vista driver just seems to suck for BioShock; hopefully it's fixed soon (like, before Crysis ^^). Here are some results from XP:
And XP compared to Vista DX9 and Vista DX10:
In my eyes it looks like the 2900 XT is finally showing its power - hopefully they can fix the drivers on Vista before we get more DX10 games such as World in Conflict and Crysis.
Last edited by Nemz0r; 29-08-2007 at 08:57 PM.
http://www.bit-tech.net/gaming/2007/..._performance/1
Bit-tech have their benchmark article up now as well.
Very interesting.
I don't understand why ripples caused by the player character are DX10-only. Morrowind managed this with PS 1.3 or so.
Maybe it would cause more of a performance hit without Shader Model 4.0. Add real-time reflection, multiple-source dynamic lighting/shadowing etc. to the ripple effect and it gets pretty GPU-intensive.
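To give a feel for the per-texel work those "detail surface" ripples imply, here's a minimal height-field ripple step (the classic two-buffer wave trick). This is just an illustrative sketch in plain Python - a real game would run something like this on the GPU in a pixel shader, plus the reflections and lighting on top:

```python
# Minimal height-field ripple solver: next frame's height at each
# texel is the neighbour average doubled, minus last frame's height,
# scaled by damping so ripples fade out. Illustrative only.
def ripple_step(prev, curr, damping=0.99):
    n = len(curr)
    nxt = [[0.0] * n for _ in range(n)]
    for y in range(n):
        for x in range(n):
            # Sum the four neighbours (wrapping at the grid edges).
            s = (curr[(y - 1) % n][x] + curr[(y + 1) % n][x] +
                 curr[y][(x - 1) % n] + curr[y][(x + 1) % n])
            nxt[y][x] = (s / 2.0 - prev[y][x]) * damping
    return nxt

size = 64
prev = [[0.0] * size for _ in range(size)]
curr = [[0.0] * size for _ in range(size)]
curr[32][32] = 1.0             # a footstep disturbs the water surface
nxt = ripple_step(prev, curr)  # the disturbance spreads outwards
```

Even this toy version touches every texel every frame; doing it per puddle with proper physics is exactly the kind of thing SM 4.0 hardware handles far more cheaply.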
Cool article! From my own personal experience, after upgrading my system today from an AMD Opteron 185 to a quad core, I've noticed the following:
A 320MB 8800 GTS at stock clocks is not enough for BioShock - even with a quad-core CPU it still dips into the 30s in shader-heavy places (e.g. the Frolics main hall area). Overclocking the card to 600/900 provides a major boost though, and that boost is amplified further by a beefy CPU - the two just love working together... fast!
With the card at 600/900 I get a constant 50-70fps average at 1680x1050, highest details, in DX10 mode. With the card at stock I average about 40-60, dropping to 30-40 in the shader-heavy areas.
If my card were a 640MB model I reckon it wouldn't dip below 40fps even at stock clocks. I can only imagine how a 9800-series card will play DX10 games if they really are 2x more powerful than an 8800 Ultra, as the rumours suggest!
Going by my experience above, I'd say even a Socket 939 dual-core AMD would easily manage a 40fps minimum at 1600x1200 in all areas, regardless of what's happening, as long as the graphics card is beefy (8800 GTX or above).
The DX9 path seems very efficient - I've got everything on max at 1680x1050 and it's super smooth. Shame AA doesn't seem to be working, as there's actually room for it in this game. No idea what the actual frame rate is, but it's enough to run v-sync and still never notice any slowdowns.
Lack of AA is very apparent... at the very beginning of the game I was admiring the lovely water, then looked up to see major jaggies on the tail of the plane =S
There also seems to be something not quite right that I can't put my finger on. I'm pulling 12.5k in 3DMark06 with a GTX @ 670/950... yet it still doesn't seem very smooth, regardless of what resolution I set.
Main PC: Asus Rampage IV Extreme / 3960X@4.5GHz / Antec H1200 Pro / 32GB DDR3-1866 Quad Channel / Sapphire Fury X / Areca 1680 / 850W EVGA SuperNOVA Gold 2 / Corsair 600T / 2x Dell 3007 / 4 x 250GB SSD + 2 x 80GB SSD / 4 x 1TB HDD (RAID 10) / Windows 10 Pro, Yosemite & Ubuntu
HTPC: AsRock Z77 Pro 4 / 3770K@4.2GHz / 24GB / GTX 1080 / SST-LC20 / Antec TP-550 / Hisense 65k5510 4K TV / HTC Vive / 2 x 240GB SSD + 12TB HDD Space / Race Seat / Logitech G29 / Win 10 Pro
HTPC2: Asus AM1I-A / 5150 / 4GB / Corsair Force 3 240GB / Silverstone SST-ML05B + ST30SF / Samsung UE60H6200 TV / Windows 10 Pro
Spare/Loaner: Gigabyte EX58-UD5 / i950 / 12GB / HD7870 / Corsair 300R / Silverpower 700W modular
NAS 1: HP N40L / 12GB ECC RAM / 2 x 3TB Arrays || NAS 2: Dell PowerEdge T110 II / 24GB ECC RAM / 2 x 3TB Hybrid arrays || Network:Buffalo WZR-1166DHP w/DD-WRT + HP ProCurve 1800-24G
Laptop: Dell Precision 5510 Printer: HP CP1515n || Phone: Huawei P30 || Other: Samsung Galaxy Tab 4 Pro 10.1 CM14 / Playstation 4 + G29 + 2TB Hybrid drive
Try turning v-sync on - it actually seemed to make it smoother for me. An illusion of course, but the impression is what counts.
I can't play without vsync - tearing is far more annoying to me than sometimes having slightly lower framerates. Maybe that's just me though.
No vsync is especially bad in this game though - try the very first section: when the first splicer you see attacks the top of your bathysphere, there's a lot of flickering from the electricity, and without vsync it looks absolutely horrible: half the screen light, half dark.