That's true of most PC components, and most technology in general.

Originally Posted by Deckard
What really makes me laugh is that competitively, most players knock all the bells and whistles off to give them a better view of what's going on in the game... yet here they are with £400 worth of card not even close to breaking a sweat.

Is that true? All the comparison shots I've seen in graphics card reviews seem to show that anisotropic filtering increases the detail level in the distance. That brings me on to another area where more graphics card power helps: increasing the draw distance. At the moment developers still have to fall back on fog effects (I think) or risk pop-up if the card doesn't have the power to render every single polygon in the scene. In Tribes 2 (which I'll have to keep using as an example, as it's the only game I play) a sniper 400 m away on a hill appears as just a pixel or two on my screen at 1280x960. I reckon someone playing at 640x480 might not be able to spot them, whereas I frequently do. If I were a better player I might actually be able to do something about them. Anyway, I think being able to cram more detail onto larger screens could well improve the gaming experience, quite aside from the debate over whether gorgeous-looking games are actually more fun to play. Having enjoyed the demo levels of Far Cry, I thought the eye candy did genuinely add to the atmosphere.
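To put some rough numbers on the sniper example (these are purely my own back-of-envelope assumptions, not anything measured in-game: a roughly 2 m tall player model, 400 m away, and a 90° horizontal field of view on a 4:3 screen), here's a quick Python sketch of how many pixels that model covers at each resolution:

```python
import math

def pixels_covered(object_height_m, distance_m, screen_height_px, vfov_deg):
    """Approximate on-screen height (in pixels) of an object at a given distance."""
    half_vfov = math.radians(vfov_deg) / 2
    # Total height of the view frustum at that distance.
    frustum_height_m = 2 * distance_m * math.tan(half_vfov)
    return screen_height_px * object_height_m / frustum_height_m

# Assumption: a 90 degree horizontal FOV on a 4:3 screen, which works out to
# roughly 74 degrees vertical.
vfov_deg = math.degrees(2 * math.atan(math.tan(math.radians(90) / 2) * 3 / 4))

for screen_height in (480, 960):
    px = pixels_covered(2.0, 400.0, screen_height, vfov_deg)
    print(f"{screen_height}-line screen: sniper covers about {px:.1f} pixels")

# Roughly 1.6 pixels at 640x480 versus 3.2 pixels at 1280x960 -- consistent
# with "just a pixel or two", and with the higher resolution making a distant
# sniper noticeably easier to spot.
```

Obviously the exact figures depend on the model height and FOV, but the ratio between the two resolutions is the interesting bit.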
As for the number of FPS the eye can actually 'see': it is true that (depending, I think, on the eye's distance from the screen) 50-90 Hz is enough to appear 'flicker free'. Films, BTW, run at 24 frames per second, but the projector flashes each frame two or three times so the flicker of the light isn't obvious. However, it is possible for the eye to notice images displayed for as little as 1/300th of a second, thanks to the persistence of vision effect. A higher refresh rate could therefore arguably help when gaming, although I imagine the law of diminishing returns sets in above 100 Hz.
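Just to put those refresh-rate numbers side by side (same caveat as above, this is only my own rough arithmetic using the 1/300 s figure), a short sketch:

```python
# How long a single refresh stays on screen at various rates, compared with
# the ~1/300 s figure quoted above for the briefest image the eye can register.
perception_threshold_ms = 1000 / 300  # about 3.3 ms

for hz in (24, 50, 60, 85, 100, 120):
    refresh_ms = 1000 / hz
    ratio = refresh_ms / perception_threshold_ms
    print(f"{hz:>3} Hz: each refresh lasts {refresh_ms:5.1f} ms "
          f"(~{ratio:.1f}x the 1/300 s threshold)")

# Even at 120 Hz a refresh lasts ~8.3 ms, still well above 3.3 ms, which gives
# a rough feel for why the returns might start to diminish past 100 Hz or so.
```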
Rich :¬)