Had a discussion with a friend recently about fps. He claims that because the human eye can only receive around 30 fps, there is no need to go over 40, say for games. Why would anyone want to play a game at over 80fps? Isn't it just a waste of power? (feels like this question is badly worded, hope I got the point through)
Why is this under Apple? In my opinion, a stable framerate is more important than pure numbers (once you're above 30). Even if it's stable, anything below 30 will still probably feel like 'lag' or whatever.
I have been gaming for some time, and the move from console to PC has opened my eyes. Console is limited to only 40fps, but when you are gaming at high detail with lots going on, anything below 40 seems laggy. Hell, on Battlefield 3, one of the most demanding games I own, if it drops below 40 I know it. My system usually runs in the high 60s anyway.
Oops, still getting used to Tapatalk :-/ don't even seem to be able to change it from here :-(
30 fps is generally recommended as a minimum framerate for smooth gameplay; frames arrive close enough together that the brain doesn't perceive them as separate images for most content. Higher framerates are beneficial for fast motion. Think of quickly aiming in an FPS: swing your gun across the screen in 1/30th of a second and it effectively teleports, because there are no frames in between to show the motion. 120Hz monitors with games running at 120fps, for example, are often claimed to provide a better experience in fast-paced games. (where's knuck at?)
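To put a rough number on that "teleporting" effect, here's a small sketch; the 180-degree flick and the 0.1-second duration are made-up numbers, purely for illustration:

```python
def degrees_per_frame(flick_degrees: float, flick_seconds: float, fps: int) -> float:
    """Angular distance the view covers between two consecutive frames."""
    frames_shown = flick_seconds * fps   # frames rendered during the flick
    return flick_degrees / frames_shown  # rotation shown per frame

# A 180-degree flick completed in a tenth of a second:
for fps in (30, 60, 120):
    step = degrees_per_frame(180, 0.1, fps)
    print(f"{fps:>3} fps: {step:.0f} degrees between frames")
```

At 30fps that flick is only three frames, each 60 degrees apart, which is why it reads as teleporting rather than motion; at 120fps you get twelve frames 15 degrees apart.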
I hardly ever seem to get any lag on my Xbox... If it's limited to 40fps, how do they keep it so stable? Does it always run at 40fps?
It's the other way around. Most people start perceiving 25 FPS as "not jerky", but that doesn't mean you can't notice higher FPS. 25FPS is just the baseline of what is considered a minimally usable framerate. http://en.wikipedia.org/wiki/Frame_rate
So what is the maximum fps rate then? Surely our brains can't register 100fps... *getting confused with biology now*
Well, the technical limitations these days are 60FPS (60Hz for standard LCD displays) and 120FPS (120Hz for the "3D" displays). As for biological limitations... I think you should ask a doctor about that.
I believe the eye can see at roughly 60fps. This is something that's been in my head for ages, most likely picked up on here, and possibly even at school: something to do with monitor flicker, the big strip lights, and 60Hz being the human eye's refresh rate... can't remember the story properly.
The eye can see more than 60fps, and the more FPS, the smoother the animation. What's important in a game, or anything moving, is constant FPS. If a game jumps between 25fps and 45fps, it will feel choppy. But if it runs at a steady 24fps, it will play properly and look fine. Don't believe me? Most movies at the theater run at 24fps; do you see the movie being choppy? Nope.

When it comes to 120-140Hz monitors, it comes down to: is it worth the premium? In other words, is it worth paying more, and sacrificing a better choice of monitors at the same price, for a faster-Hz one? And do you have the GPU and CPU power to run games at 120fps so you actually benefit from the 120Hz monitor? The difference is minute. You will probably only notice it if you have long experience with both, OR you have them side by side.
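The steady-vs-jumpy point can be sketched with made-up frame times (the specific numbers are just for illustration): an uneven 25-45fps run averages higher fps than a locked 24fps one, but its frame-to-frame delivery is far less even.

```python
from statistics import mean, pstdev

steady_24 = [1000 / 24] * 8           # ~41.7 ms between frames, every time
uneven = [1000 / 45, 1000 / 25] * 4   # alternating ~22.2 ms and 40 ms

for name, times in (("steady 24fps", steady_24), ("uneven 25-45fps", uneven)):
    avg_fps = 1000 / mean(times)
    print(f"{name}: avg {avg_fps:.1f} fps, frame-time jitter {pstdev(times):.1f} ms")
```

The uneven run averages around 32fps, yet its roughly 9 ms of frame-time jitter is what you perceive as choppiness, while the 24fps run has none at all.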
No, but part of FPS is perceived reaction time. At 30FPS it takes twice as long as at 60FPS for the input you've just applied to be transmitted back to your eyes as feedback. That's one way things can feel "smoother" or just "faster".

Another is that it's not just raw FPS. If you're getting microstutter, you can be achieving 50FPS on average, but the uneven delays between frames make it feel far choppier than that number suggests.

Another thing to note is that our eyes and minds grow accustomed to whatever they're looking at. If you're used to only seeing 30 or 40 FPS, you'll feel that 60 is "smoother"; similarly, most of us here with PCs are used to 60FPS, so drops below that, even momentary dips to 50, feel a lot more jerky, even if the frames are being delivered consistently.

And to reference the comment above: films don't feel jerky at 24FPS because we're simply watching. Limit your FPS in a game to 24 and see how disjointed and slow it feels. You need higher FPS, in part, because anything too low makes interaction feel detached and disjointed.
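A back-of-the-envelope version of the feedback-delay point, assuming (purely for illustration) that input sampled on one frame shows up on screen two frames later, and ignoring display and OS latency:

```python
def feedback_delay_ms(fps: int, frames_of_delay: int = 2) -> float:
    """Rough ms from applying an input to seeing its result on screen."""
    frame_time_ms = 1000 / fps
    return frame_time_ms * frames_of_delay

for fps in (30, 60, 120):
    print(f"{fps:>3} fps: ~{feedback_delay_ms(fps):.1f} ms input-to-screen")
```

Under those assumptions 30fps gives roughly 67 ms of feedback delay against roughly 33 ms at 60fps: exactly the "twice as long" effect, even though both framerates look smooth when you're only watching.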
You don't notice the input lag, though. You can see that when playing games on a console: most console games are 30fps, and the TV adds a huge amount of input lag on top. No one complains, and there's a crazy number of FPS (shooter) gamers on console; I wouldn't be surprised if there are more of them than on PC.
Depends on the game. For me:
- An MMO or an RPG only needs maybe 20-25 to be playable.
- Racing games need a minimum of 40-50.
- Shooters tend to feel better at 60+.
Your eyes may not be able to see at high fps, but you can definitely see and feel a difference.
You see and feel the difference because your eyes can perceive higher fps. It's your brain that does the work of filling things in to make what you see look smooth; the brain does a lot of stuff like that. That's why (I mean the part about the brain altering what the eyes see) you don't see a black spot where the optic nerve connects to your brain, even with one eye closed, why a spinning car wheel looks like it's turning the other direction past a certain speed, and why you see optical illusions.
I was just saying what I thought to be true. Though there might not be an actual cap. It might be a limitation from the developer.
To be clear: game consoles always have V-Sync on, so framerates snap to divisions of the display's refresh rate: 15, 30, 60, 120 and so on. Obviously no games run at 15fps or at 120fps and above; 120fps+ requires serious computing power, which is hard to reach even on PCs, and 15fps is just too slow.

In the case of the Xbox 360 and PS3:
-> The majority of 3D games run at 30fps @ 720p.
-> Many simple 2D games run at 60fps/30fps @ 1080p.
-> Many complex 2D games run at 30fps @ 1080p.
Of course there are exceptions, but the great majority of games run at 30fps.

In the case of the Wii U, with the current launch titles and what is known so far about them:
-> 30fps @ 1080p with 1x GamePad screen in use
-> 60fps @ 720p with 1x GamePad screen in use
Info from Nintendo, from bits and pieces here and there... but no games to demonstrate any of this yet, so take it with a grain of salt:
-> 60fps @ 1080p if no GamePad screen is used
-> 30fps @ 720p with 2x GamePad screens in use
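A minimal sketch of why V-Sync snaps framerate to those values, assuming the simple double-buffered case where a frame that misses a refresh has to wait for the next one:

```python
import math

def vsync_fps(render_ms: float, refresh_hz: int = 60) -> float:
    """Effective framerate when every frame must land on a refresh boundary."""
    refresh_ms = 1000 / refresh_hz
    # A frame occupies a whole number of refresh intervals, at least one.
    intervals = max(1, math.ceil(render_ms / refresh_ms))
    return refresh_hz / intervals

print(vsync_fps(10))   # fits in one ~16.7 ms interval -> 60.0
print(vsync_fps(20))   # needs two intervals           -> 30.0
print(vsync_fps(40))   # needs three intervals         -> 20.0
```

So a game that renders even slightly slower than the refresh interval drops straight from 60fps to 30fps rather than to, say, 55fps, which is why console framerates cluster on those divisor values.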
I can play at 30fps, but I'd rather have the game look a bit worse and get 60fps, because it's buttery smooth.
There is a huge difference between simply viewing something and interacting with it. For film, 24 or 29.97 fps is the normal standard, but that is passive viewing. If you were simply watching a game over someone's shoulder, 30fps might seem perfectly fine, but playing it is a different matter in terms of how responsive it feels. While these are really two separate issues, they are somewhat related in perception: a higher frame rate, with input polling that keeps pace with it, will feel smoother to the player.

There really isn't a top cap to fps. A GPU can go as fast as it can, way above the 60 or 120 fps that most monitors are capable of; I have some old games that run in excess of 300fps according to monitoring tools. It's totally imperceptible to me above 75ish, provided the display I'm using is even capable of that in the first place. While I can play a game at 30fps, I prefer closer to 60 for responsiveness.