Just a thought, probably rubbish, but here goes...
I read somewhere on a forum that perhaps frames were capped at about 32 fps - presumably, like in the BoB1 cardbase and BDG patches, to stop the stutters that plagued the original game.
I suffer from low fps which doesn't seem to change no matter what options I raise or lower - constantly stuck in the mid teens.
Could this limiter be affecting us low-fps'ers somehow, dividing the rate up incorrectly so our frames never rise?
Any comments?
Frame Limiter?
If the game were able to run at, say, 28 fps average, the limiter might reduce the frame rate to 27 or 26, but it makes the slow frames faster - IOW there is less stutter. It does not reduce 28 fps to, say, 14, so if you get fps that low, it will not be connected to the limiter. But as a test, I will look at making it optional. That will also show how much "reserve" those people have who hit the limit.
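To picture what that cap does: here's a minimal sketch (my own illustration, not the game's actual code) of a limiter that holds fast frames to the target interval while leaving slow frames untouched. The target of 32 fps is just the figure mentioned above.

```python
# Sketch of a simple frame limiter: fast frames are delayed until the
# target interval has elapsed, slow frames pass through unchanged.
# Illustrative only -- not BoB2's actual implementation.

TARGET_FPS = 32
FRAME_BUDGET = 1.0 / TARGET_FPS  # ~31.25 ms per frame


def limited_frame_time(render_time):
    """Effective frame time (in seconds) once the limiter is applied."""
    if render_time < FRAME_BUDGET:
        return FRAME_BUDGET      # limiter sleeps off the surplus
    return render_time           # slow frame: nothing to trim


# A stuttery sequence: mostly fast frames with one slow spike.
raw = [0.010, 0.012, 0.011, 0.070, 0.012]
smoothed = [limited_frame_time(t) for t in raw]

raw_fps = len(raw) / sum(raw)            # ~43.5 fps average
smoothed_fps = len(smoothed) / sum(smoothed)  # ~25.6 fps average
print(round(raw_fps, 1), round(smoothed_fps, 1))
```

Note what happens: the average drops a little, but the frame-to-frame spread collapses (31-70 ms instead of 10-70 ms), which is why it feels smoother. And if a machine can only render at 14 fps, every frame is already over budget, so the limiter never fires - matching the point that mid-teens fps can't be the limiter's doing.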
I must say the stutter is very reminiscent of the old video card timing problem, which the cardbase file solved with the FORALL_MIN =40 setting. Maybe it is another incarnation of that problem, because no matter what settings I change or decrease, the fps stays about the same, or rises only very slightly.
SD
I'm confused.
As far as I know, the human eye cannot detect anything over 24 or 25 fps, so why would anyone want anything over 32?
I limit all my games to about 30 fps if there is a setting - that allows me to run at 1600x1200, 60 Hz, with everything on max (usually).
What am I missing?
Rich
I've got 40 109's cornered over Berlin!
RichardIII wrote: "As far as I know, the human eye cannot detect anything over 24 or 25 FPS. So why would anyone want anything over 32?"

This could be a topic of long debate, but current thinking is that the *average* human eye is most comfortable at 85 fps/Hz or higher. Certainly, fluidity of motion is enhanced at these higher rates, and so I believe the human eye stops detecting choppiness of motion at about that rate. Not surprisingly, this is the refresh rate most popular on modern CRTs, as lower rates proved to give people a lot of headaches from the "flicker".
At any rate, I can definitely tell a difference between even 60 and 85 fps. Both refresh rate and fps should be at this number or higher.
I could prove my point to any of you if you really want, but it gets sort of complicated.
Aargh... I HAD a link to a paper on this very subject - kept because this conversation comes up often - but I'll be buggered if I can find it.
Also posted some thoughts on it here http://www.diesimfanboi.com/phpBB2/view ... ght=frames
Edit: aah, HERE it is http://amo.net/NT/02-21-01FPS.html
Err guys, don't mix up the refresh rate of your monitor with in-game frame rates!
The refresh rate of your monitor is a slightly different thing from the frame rate you get in games - two different things, really! You can get a super-smooth fps rate of 60 in a game while your monitor flickers at 60 Hz and strains your eyes. The average human is able to tell a difference in fps rates up to about 37; there was a large-scale study by Sony on this some time ago. That cinemas show 24 or 25 frames per second and you still see smooth motion is something different and has to do with the "motion blur" effect. On a computer, you need more than 25 frames!
A CRT monitor needs at least 85 Hz for a flicker-free picture, although I could tell a difference between 85 and 100 Hz. There is a reason why modern TV sets use 100 Hz technology. My old 17" Trinitron CRT was always set to at least 100 Hz. With my old Elsa Erasor card it could even run at 150 Hz - useless, but nice nonetheless.
A TFT display can run flicker-free at 60 Hz. Running a CRT at less than 85 Hz can ruin your eyes and cause headaches!
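One place where the two numbers *do* interact is vsync: with vsync on, a finished frame can only be shown on the next refresh tick, so the fps you see gets quantized to integer divisors of the refresh rate. A rough sketch of that effect (my own illustration, assuming a simple double-buffered vsync model, nothing from the game itself):

```python
import math


def vsync_fps(refresh_hz, render_time):
    """FPS you actually see with vsync on: each frame waits for the
    next refresh tick, so frame time rounds UP to a whole multiple
    of the refresh interval."""
    tick = 1.0 / refresh_hz
    ticks_needed = math.ceil(render_time / tick)
    return refresh_hz / ticks_needed


# A frame taking 17 ms just misses the ~16.7 ms tick of a 60 Hz
# display, so it waits for the next one and fps halves to 30.
print(vsync_fps(60, 0.017))   # -> 30.0
print(vsync_fps(60, 0.016))   # -> 60.0
print(vsync_fps(100, 0.017))  # -> 50.0
```

This is why a monitor running at a higher refresh rate can show a *higher* game frame rate for the exact same rendering load, even though refresh rate and frame rate are, as said above, two different things.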