Framerate/Game timing



HopeDagger
28-01-2004, 08:18 PM
I wasn't sure what this method is called, so I casually labeled it "game timing". :P

Anywho, I need a way to control the speed of all of the objects/events in the game that isn't based upon the framerate. Without it, things would be especially hectic in an online game, since players with a better framerate would be able to move faster, and so on.

How can I make everything move/occur at the same speed regardless of the framerate?

Thanks!

Alimonster
28-01-2004, 09:12 PM
You have to make sure you take time into account when doing your object movement. For example, if you had the following:

SomeObject.x := SomeObject.x + 2; // move it to the right

Now, that's all fine and dandy, with your object continually moving 2 pixels to the right each frame. However, if your frame rate isn't constant then you have to adjust the movement -- if a frame takes twice as long, say, you'd want to add 4 instead of 2 to the object.

Thus, accounting for the frame rate is simply a multiplication (the longer the frame takes, the more you want to move the object) based on the elapsed time.
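
For example (a minimal sketch; PixelsPerSecond and ElapsedSeconds are made-up names here, with ElapsedSeconds being the length of the last frame, worked out as described below):

SomeObject.x := SomeObject.x + PixelsPerSecond * ElapsedSeconds; // e.g. 120 pixels/sec, regardless of frame rate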

First, calculate how much time has passed since the last frame -- this is easy enough if you simply note the current ticks each frame using GetTickCount, timeGetTime (with appropriate init/deinit code if under NT/2000/XP, ask if interested) or QueryPerformanceCounter (keeping the old value handy, of course, for comparison). You grab a value each frame, compare it to the previously grabbed value, and do a subtraction. You'll probably want to convert the result to seconds rather than ticks -- if so, divide by 1000 for GetTickCount or timeGetTime, or by the QueryPerformanceFrequency value (store this in a variable rather than calling it every frame) for QueryPerformanceCounter.
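
As a rough sketch of that (assuming the timeGetTime route, so the MMSystem unit is needed; the NT/2000/XP resolution fix is covered further down the thread, and LastTick/GetElapsedSeconds are just names I've made up for illustration):

var
  LastTick: Cardinal = 0;

function GetElapsedSeconds: Double;
var
  ThisTick: Cardinal;
begin
  ThisTick := timeGetTime;
  if LastTick = 0 then
    Result := 0 // first call: no previous frame to compare against
  else
    Result := (ThisTick - LastTick) / 1000; // milliseconds -> seconds
  LastTick := ThisTick;
end;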

If you do this then you simply have to think of your object speeds in terms of "pixels per second" rather than "pixels per frame". Collision detection might become a bit more annoying, mind you, so you might think about limiting the maximum allowed time-since-last-frame in case of really bad glitches (e.g. hard disc thrashing causing bad jumps). You can also consider keeping lots of old values handy (maybe about 8 or so) and taking the average of them, rather than the immediate time for the last frame. This would possibly make things smoother, or maybe not (experiment a little).
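
The clamping part might look something like this (the 100 ms cap is an arbitrary figure, and GetElapsedSeconds is the hypothetical helper from the sketch above):

ElapsedSeconds := GetElapsedSeconds;
if ElapsedSeconds > 0.1 then
  ElapsedSeconds := 0.1; // never simulate more than 100 ms in one step, even after a glitch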

I don't know whether DelphiX's timer lets you get how long has elapsed since the last timer update (I'd assume it would). If not, here's the unit that I use (http://www.alistairkeys.co.uk/GameTimer.pas) (let me know if you find bugs). It's easy to use: just create a TGameTimer, call Update once per frame, then multiply all your object's movement values by the ElapsedTime property to get movement based on time.
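
In other words, usage is roughly this (Player, PlayerSpeed and GameRunning are made-up names for the sake of the example):

InitGameTimer; // from the same unit
Timer := TGameTimer.Create;
try
  while GameRunning do
  begin
    Timer.Update;
    Player.X := Player.X + PlayerSpeed * Timer.ElapsedTime; // PlayerSpeed in pixels per second
    // ...update everything else and draw the frame...
  end;
finally
  Timer.Free;
  KillGameTimer;
end;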

The net result of time-based movement should be that all computers give correct results, but slower ones give jerky movement (which will still be correct, just a little ugly). But that's not something you can account for, really, if the frame rate drops like that. All computers, though, should calculate their objects to be at the same position at a given time.

Useless Hacker
28-01-2004, 10:21 PM
[I moved this 'cos it didn't seem to have much to do with DelphiX.]
In the DelphiX timer's OnTimer event, you can use the LagCount parameter to get the frame time.
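
If I remember the DXTimer handler signature correctly, that looks something like this (LagCount being the number of milliseconds since the previous tick; Player and PlayerSpeed are made-up names):

procedure TForm1.DXTimer1Timer(Sender: TObject; LagCount: Integer);
begin
  Player.X := Player.X + PlayerSpeed * (LagCount / 1000); // PlayerSpeed in pixels per second
  // ...then draw the frame as usual
end;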

Pyro
27-02-2004, 06:07 PM
"with appropriate init/deinit code if under NT/2000/XP, ask if interested"

Interested :) I assume this is related to the timer only returning about 64 'ticks' per second, leaving me with roughly 15 ms resolution :/

Currently I have to fall back on the whole asm 'rdtsc' thing, which is way too accurate for my needs (and probably differs from PC to PC, as I understand it).

Legolas
28-02-2004, 04:59 PM
Alimonster,
it seems that all the files on your site have been deleted:

Forbidden
You don't have permission to access /GameTimer.pas on this server.


:cry:

Alimonster
28-02-2004, 10:22 PM
Legolas: yep, I found out about that a few days ago and haven't had time to do anything about it. :( My site was due for renewal on March 19th 2004 (i.e., a few weeks away), but it seems that the guy was a bit hasty in taking it offline (unless he b0rked a script or something).

But don't worry, I'll get a new version of the website up and running sometime soon (and it'll hopefully be php + database driven, so I don't have so much hassle). I'm working on that (got Apache, PHP and MySQL downloaded so I can build it on localhost). Anyway, if anyone wants any files then just private message me and I'll send 'em to you. Note that you can also see old versions of the site using archive.org (http://web.archive.org/web/*/www.alistairkeys.co.uk), though of course that's not up to date.

But anyway, this is beside the point. The timing fix for timeGetTime is pretty straightforward - you want three functions: timeGetDevCaps, timeBeginPeriod and timeEndPeriod. timeGetDevCaps gives you the minimum supported resolution of the timer, which you then pass to timeBeginPeriod (once, at the start of your program) and timeEndPeriod (once, at the end). This changes the default NT/2000/XP timing of something like 10 or 15 msecs precision down to, hopefully, 1 msec (what you want). So here's the example code, ripped directly from one of my units:

//------------------------------------------------------------------------------

var
  FUseTimeGetTime: Boolean = False; // unit-level: True = falling back to timeGetTime
  FTimeFactor: Double;              // multiply a tick delta by this to get seconds
  TicksPerSecond: Int64;
  DevCaps: TTimeCaps;

procedure InitGameTimer;
begin
  FUseTimeGetTime := (QueryPerformanceFrequency(TicksPerSecond) = False);

  // if we're using timeGetTime, we need to set its resolution to the best
  // possible - if we don't, it will be jerky on NT-based systems
  if FUseTimeGetTime then
  begin
    FTimeFactor := 0.001;
    FillChar(DevCaps, SizeOf(DevCaps), 0);
    TimeGetDevCaps(@DevCaps, SizeOf(DevCaps));
    TimeBeginPeriod(DevCaps.wPeriodMin);
  end
  else
  begin
    FTimeFactor := 1 / TicksPerSecond;
  end;

  // WriteLog is my own logging routine (not part of this snippet)
  if FUseTimeGetTime then WriteLog('** Using low performance timer! **')
  else WriteLog('** Using high performance timer! **');
end;

//------------------------------------------------------------------------------

procedure KillGameTimer;
begin
  if FUseTimeGetTime then
    TimeEndPeriod(DevCaps.wPeriodMin);
end;

//------------------------------------------------------------------------------

Alimonster
28-02-2004, 10:31 PM
And just for historical purposes, here's a version of GameTimer.pas that I'm using just now. I think I've got rid of the dependencies, but if not it should be easy enough anyway:

unit GameTimer;

interface

const
  TIME_AVERAGES_LENGTH = 8;

type
  TGameTimer = class
  private
    FLastTimes: array[0..TIME_AVERAGES_LENGTH - 1] of Double;
    FThisTime: Cardinal; // index in above array for current time

    FLastTick: Int64;
    FElapsedTime: Double;
  public
    constructor Create;
    destructor Destroy; override;

    procedure Reset;
    procedure Update;
    property ElapsedTime: Double read FElapsedTime;

    class function UsingHighPerformance: Boolean;
    class function GetTimeFactor: Double;
  end;

procedure InitGameTimer;
procedure KillGameTimer;

implementation

uses
  //misc,
  Windows,
  MMSystem;

//------------------------------------------------------------------------------

var
  FUseTimeGetTime: Boolean = False;
  FTimeFactor: Double;

//------------------------------------------------------------------------------

class function TGameTimer.UsingHighPerformance: Boolean;
begin
  // True when the high-resolution QueryPerformanceCounter path is in use
  Result := not FUseTimeGetTime;
end;

//------------------------------------------------------------------------------

class function TGameTimer.GetTimeFactor: Double;
begin
  Result := FTimeFactor;
end;

//------------------------------------------------------------------------------

constructor TGameTimer.Create;
begin
  inherited;
  Reset;
end;

//------------------------------------------------------------------------------

destructor TGameTimer.Destroy;
begin
  inherited;
end;

//------------------------------------------------------------------------------

procedure TGameTimer.Reset;
var
  i: Integer;
begin
  FThisTime := 0;

  for i := Low(FLastTimes) to High(FLastTimes) do
    FLastTimes[i] := 0;

  FElapsedTime := 0;

  if FUseTimeGetTime then
    FLastTick := TimeGetTime
  else
    QueryPerformanceCounter(FLastTick);
end;

//------------------------------------------------------------------------------

procedure TGameTimer.Update;
var
  ThisTick: Int64;
  i: Cardinal;
  RunningCount: Double;
begin
  if FUseTimeGetTime then
  begin
    ThisTick := TimeGetTime;
  end
  else
  begin
    // use high performance timer
    QueryPerformanceCounter(ThisTick);
  end;

  FLastTimes[FThisTime] := (ThisTick - FLastTick) * FTimeFactor;
  FLastTick := ThisTick;

  // wrap the index round the ring buffer (TIME_AVERAGES_LENGTH is a power of two)
  FThisTime := (FThisTime + 1) and (TIME_AVERAGES_LENGTH - 1);

  // ElapsedTime is the average of the last TIME_AVERAGES_LENGTH frame times
  RunningCount := FLastTimes[0];
  for i := 1 to High(FLastTimes) do
  begin
    RunningCount := RunningCount + FLastTimes[i];
  end;

  FElapsedTime := RunningCount / TIME_AVERAGES_LENGTH;
end;

//------------------------------------------------------------------------------

var
  TicksPerSecond: Int64;
  DevCaps: TTimeCaps;

procedure InitGameTimer;
begin
  FUseTimeGetTime := (QueryPerformanceFrequency(TicksPerSecond) = False);

  // if we're using timeGetTime, we need to set its resolution to the best
  // possible - if we don't, it will be jerky on NT-based systems
  if FUseTimeGetTime then
  begin
    FTimeFactor := 0.001;
    FillChar(DevCaps, SizeOf(DevCaps), 0);
    TimeGetDevCaps(@DevCaps, SizeOf(DevCaps));
    TimeBeginPeriod(DevCaps.wPeriodMin);
  end
  else
  begin
    FTimeFactor := 1 / TicksPerSecond;
  end;

  // if FUseTimeGetTime then WriteLog('** Using low performance timer! **')
  // else WriteLog('** Using high performance timer! **');
end;

//------------------------------------------------------------------------------

procedure KillGameTimer;
begin
  if FUseTimeGetTime then
    TimeEndPeriod(DevCaps.wPeriodMin);
end;

//------------------------------------------------------------------------------

end.

Useless Hacker
29-02-2004, 09:05 AM
Is there any benefit to setting timeBeginPeriod to the DevCaps.wPeriodMin rather than just using a value of 1?

Alimonster
29-02-2004, 10:01 AM
There's always a theoretical chance that a device won't have a minimum resolution of 1, in which case the function wouldn't be able to adjust the timer and you'd be left with the same old crappy default value. I don't think that's likely in practice, mind you, but timeGetDevCaps is the "correct" approach to this and it's not as if there's a lot of supporting code. ;)
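
If you did want a belt-and-braces compromise, a sketch along these lines would do it -- take whatever the device reports, and only fall back to asking for 1 ms if the query itself fails:

FillChar(DevCaps, SizeOf(DevCaps), 0);
if timeGetDevCaps(@DevCaps, SizeOf(DevCaps)) = TIMERR_NOERROR then
  timeBeginPeriod(DevCaps.wPeriodMin) // use whatever the device actually supports
else
  timeBeginPeriod(1);                 // query failed: just ask for 1 ms and hope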