I'm pretty sure I'm using the right cable, and Fraps does say 120, but I'll get another cable from Amazon so I can try a different one. Another thing: why haven't we seen any 2K TN monitors?
- No it won't. So long as the refresh is consistent, you wouldn't notice any difference. The response time (measured in milliseconds) of a screen may show a difference in performance, but the Hz cycles will merely make motion smoother. Technically the eye is only capable of tracking individual images up to 25 Hz (25 repetitions, or frames per second (FPS)). It's easy to ascertain how remarkably fast the refreshes of your screen are at 60 Hz: the screen is refreshing that fast all the time. Is it flickering? No. Yet it is still rendering the picture 60 times a second. 120Hz is useful for 3D because there are two images overlaid on the screen at the same time, so you have 60 Hz for each eye. If your screen has a bad response time (analogue vs DVI), you will see a blurring effect because of the lag from processing to display. The only link between flickering / jerky images and frame rate would be in graphic rendering in a game. The cause of flickering in this respect would be down to the abilities of the graphics card or the settings in the game. To get smooth gameplay, you need only set the graphics settings so that the frame rate can be rendered at a consistent rate.
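To put rough numbers on the refresh rates discussed above, here's a quick sketch. The figures are plain arithmetic from the Hz values in the post, not measurements:

```python
# Frame time for common refresh rates: one refresh every 1/Hz seconds.
def frame_time_ms(hz):
    """Time each frame is on screen, in milliseconds."""
    return 1000.0 / hz

for hz in (25, 60, 120):
    print(f"{hz} Hz -> {frame_time_ms(hz):.1f} ms per frame")

# For active-3D at 120 Hz the frames alternate between eyes,
# so each eye effectively gets 120 / 2 = 60 Hz.
per_eye = 120 / 2
print(f"120 Hz 3D -> {per_eye:.0f} Hz per eye")
```

So a 120Hz panel holds each frame for roughly half as long as a 60Hz one (about 8.3 ms vs 16.7 ms), and in 3D mode that budget is split evenly between the two eyes.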
As for using a TV as a monitor, do it man! A big screen is always more fun! You could always hide the damaged pixel with a permanent marker! Just don't do a massive dot - lol!
Last edited by xodianbarr; 13-03-2013 at 07:09 PM.
The human eye takes approximately 1/25th of a second to process an incoming image. A human can perceive 25 different images per second or perceive the same object 25 times in a second.
Read more: Speed of the Human Eye on Moving Objects | eHow.com http://www.ehow.com/facts_7712170_sp...#ixzz2NRg3PrOC
Haha, here we go again. If I had a quid for every time a forum has this argument... Sit down and run your game at 25fps, then 60fps, then 120, then 300 if you can. Then tell me you can't see a difference between them all.
Seeing is believing, my friend, so stop it with the linky stuff!
http://www.100fps.com/how_many_frame...humans_see.htm
Imagine yourself in a very dark room. You have been there for hours and it's totally black. Now light flashes right in front of you. Let's say as bright as the sun. Would you see it, when it's only 1/25th of a second? You surely would. 1/100th of a second? Yes. 1/200th of a second? Yes. Tests with Air Force pilots have shown that they could identify the plane in a picture that was flashed for only 1/220th of a second.
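That 1/220th of a second figure is easy to put next to the refresh rates being argued about. A rough comparison (plain arithmetic, not vision science):

```python
# How long was the pilots' flashed image visible, versus one frame
# at various refresh rates?
flash_s = 1 / 220          # ~4.5 ms flash the pilots could identify
for hz in (25, 60, 120):
    frame_s = 1 / hz
    print(f"{hz} Hz frame = {frame_s * 1000:.1f} ms "
          f"({frame_s / flash_s:.1f}x the flash duration)")

# A 1/220 s flash is shorter than even a single 120 Hz frame (~8.3 ms),
# which is the point: seeing a very brief image is not the same thing
# as a hard FPS ceiling on perception.
```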
xodianbarr is actually talking about the time it takes to process an image, which is a bit different to the plain old FPS argument. I think all the "the eye can only see XX FPS" trolls have mostly vanished from HEXUS due to repeated bashing.
The eye doesn't see in FPS. Everyone has an upper limit where further improvement becomes unnoticeable. Most people can see into the hundred-FPS range without any issue - although more often than not, they don't care. It's really quite simple.
I could start on 120Hz CRTs vs 120Hz TFTs if you'd like!
Couldn't have put it better myself, Agent...
@Azazl187, FPS is nothing to do with Hz refresh. If you watched something at 300 FPS you would miss most of it. But that has no bearing on the point I was making, which was that a 120 Hz TV is not a performance increase. A television using a digital connection, with a fast response time, is what makes a good TV, not just Hz. If 120 Hz TVs are so impressive, why haven't we all rushed out and bought them? Reason - because until 3D became available, it wasn't necessary.
@ Agent. If you want a proper reference you'll have to dig out a biology textbook. I learnt that at school, buddy! Anyway, it was a first-hand report. You can cite it, though obviously a stronger cite would be preferable.
At the risk of going a little off topic here, are we talking about monitors for gaming or TVs? The two are very different with regard to refresh rates...
At one point maybe, but now it seems it's biology and the input specs of the eye, lol. My 2 cents on it: we can perceive incalculably high resolutions, refresh rates, etc. For gaming, going from 60Hz to 120Hz, the difference is probably not dramatic enough for us to notice a massive change (an increased refresh rate is not the same as an increased resolution - it's harder to notice).
P.S. Going a little off topic, is there anywhere in London I can try 4K or even 2K gaming?
As was hinted at earlier, it depends on the refresh rate the monitor was designed for. This was best seen on the old CRTs, where the phosphor chosen - and hence how quickly the last frame faded - defined the refresh rate. In that old example, imagine phosphors designed for 120Hz: they'd need to fade very quickly, so if run at 60Hz there would be more visible flicker. However, if designed with a slow fade, the previous frame wouldn't have faded enough to avoid blurring into the next, and the benefit would be lost.
Sure, LCD screens don't work the same way, but don't forget that the final display electronics will be designed with a similar "fade" in the form of capacitors acting as buffers.
So I reckon a properly designed 120Hz monitor probably does benefit, whereas a cheaper attempt probably doesn't.
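The phosphor-fade trade-off described above can be sketched with a simple exponential-decay model. The time constants below are made up purely for illustration, not real phosphor specs:

```python
import math

def brightness_left(decay_tau_ms, frame_ms):
    """Fraction of the previous frame's brightness still glowing
    when the next refresh arrives, assuming exponential decay."""
    return math.exp(-frame_ms / decay_tau_ms)

fast = 3.0    # hypothetical fast phosphor, tau = 3 ms (suits 120 Hz)
slow = 12.0   # hypothetical slow phosphor, tau = 12 ms (suits 60 Hz)

for name, tau in (("fast", fast), ("slow", slow)):
    for hz in (60, 120):
        frame = 1000 / hz
        print(f"{name} phosphor at {hz} Hz: "
              f"{brightness_left(tau, frame):.0%} of last frame remains")

# Fast phosphor run at 60 Hz: almost nothing left between refreshes -> flicker.
# Slow phosphor run at 120 Hz: a lot left -> previous frame smears into the next.
```

With these toy numbers, the fast phosphor at 60Hz has essentially gone dark before the next refresh arrives (flicker), while the slow phosphor at 120Hz still holds around half its brightness (smearing) - exactly the two failure modes described above.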