Since the human eye is limited in how many images it can perceive per second, it ought to be unnecessary to play at more than 60 FPS. However, a higher FPS is still preferable, because the objects you see on screen will be positioned more precisely. If the graphics card generates 100 FPS, for instance, an object at a given place at a given time will be depicted more accurately, even though the screen only updates at 72 Hz.
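To put some rough numbers on that, here's a back-of-the-envelope sketch in Python (the 500 px/s object speed is a made-up value, and it assumes no vsync and constant frame times, so take it as an illustration only):

# Back-of-the-envelope sketch: how stale is the newest rendered frame
# when the (72 Hz) monitor actually refreshes? Assumes no vsync and
# constant frame times; the object speed is a made-up example value.
SPEED_PX_S = 500.0   # hypothetical object speed in pixels per second

for fps in (60, 100):
    frame_time = 1.0 / fps
    # Worst case: the most recent finished frame is almost one full frame old.
    error_px = SPEED_PX_S * frame_time
    print(f"{fps} fps: frame up to {frame_time * 1000:.1f} ms stale, "
          f"object drawn up to {error_px:.1f} px behind where it really is")

So at 100 FPS the frame the monitor grabs is at most 10 ms old instead of ~16.7 ms, which is the whole "more precise" argument in a nutshell.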
Which is what Gangsta was saying about playing against people online who have higher framerates.
Now, in movies, explosions are usually shot at speeds in excess of 100fps (because they're instantaneous), which is what I was saying about effects in games running at higher framerates.
Maybe 120fps is enough, maybe you will get headaches after 3 hours. Seeing frame by frame is simply not how the eye/brain system works; it works with a continuous flow of light/information. (Similar to the "red eye" effect from camera flashes: a flash is simply not how we normally see.) So there are still open questions. Maybe you need as much as 4000fps, maybe less, maybe more.
The fact is that the human eye perceives typical cinema film motion as fluid at about 18fps because of its motion blur. This also applies to games, in that it's unlikely you'll notice a difference between 30 and 60 fps, apart from where effects are concerned.
If you had a movie with 50 very sharp and crisp images per second, your eye would pick out individual details from time to time and you would get the feeling that the movie is stuttering.
72+ fps is referred to as the 'illusion of reality', but that will only apply to games when the visuals are of photographic quality.
35-37 fps has been shown to be the conscious threshold for depicting motion on a TV screen. After all, if TV looks smooth at 30fps, then that must be a hard-coded physiological limit, right?
There's a big difference in the way that TVs (be they SD or HD) and monitors display images. Blur is inherent to a TV because the picture is interlaced, meaning that 30 fps on a TV will look no different from 60 EXCEPT where effects are concerned (sparks, explosions, strobe lighting, etc.). That's why motion blur is coded into the software.
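For what it's worth, the cheapest way games fake that TV-style blur is to blend each new frame into a running average of the previous ones. A minimal sketch of the idea (the tiny 1x8 "screen", the sliding dot, and the 0.5 blend factor are made up for illustration, not taken from any particular engine):

import numpy as np

# Accumulation-buffer style motion blur: blend each new frame into a running
# average so fast-moving objects leave a fading smear, a bit like the blur
# you get for free on an interlaced TV.
ALPHA = 0.5                        # hypothetical blend factor (higher = less blur)

accum = np.zeros((1, 8, 3))        # tiny 1x8 "screen", just for illustration
for x in range(8):                 # a bright dot sliding across the screen
    frame = np.zeros((1, 8, 3))
    frame[0, x] = 1.0
    accum = ALPHA * frame + (1 - ALPHA) * accum   # exponential blend

print(accum[0, :, 0])              # brightest at the dot, fading trail behind it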
Like when something's running at 90 fps and something's running at a steady 30 fps, it doesn't feel like the 30 fps is only a third as fast. Displays/TVs render frames dynamically, so everything runs at the same speed, but more frames are packed into each timeframe (hence the literal term "ninety frames per second"). And it certainly doesn't mean you consciously register 90 fps.
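Quick numbers on that, just to make the point (nothing here comes from a real benchmark):

# Both rates show the same one second of game time, just sliced more finely.
for fps in (30, 90):
    frame_ms = 1000.0 / fps
    print(f"{fps} fps: each frame sits on screen for ~{frame_ms:.1f} ms")
# 30 fps -> ~33.3 ms per frame, 90 fps -> ~11.1 ms per frame.
# The game doesn't run three times faster at 90 fps; you just get three times
# as many samples of the same motion.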
Now, jtype is a games technology lecturer and I'm a games development student, ...you're going to teach us, are you?!
Oh, and loco, your air force pilots NEVER registered a plane displayed for 1/300th of a second at all; it was 1/220th of a second, and the tests weren't conclusive. Playing a game is different though, and it's not like your framerates can even exceed your refresh rate anyway (at least not with vsync on).
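On that last bit: with vsync on, a finished frame has to wait for the next refresh before it can be swapped in, so the effective framerate snaps down to the refresh rate divided by a whole number. A rough sketch of why (the render times and the 72 Hz default are arbitrary examples, and it assumes plain double buffering):

import math

# With double-buffered vsync, each frame waits for the next refresh boundary,
# so the effective framerate is refresh_hz divided by a whole number.
def vsynced_fps(render_ms, refresh_hz=72):
    refresh_ms = 1000.0 / refresh_hz
    refreshes_per_frame = math.ceil(render_ms / refresh_ms)
    return refresh_hz / refreshes_per_frame

print(vsynced_fps(10))   # renders faster than the 72 Hz refresh -> capped at 72 fps
print(vsynced_fps(15))   # just misses a refresh every frame -> drops to 36 fps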