I figured out the basics: I should multiply all movement by the LagCount variable, but that isn't enough.
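Roughly, what I mean by scaling movement with LagCount (the variable name is from my setup; the rest is a simplified sketch, with a made-up speed value):

```python
BALL_SPEED_X = 4  # pixels to add per frame at the designed framerate (made-up value)

def update_ball(x, lag_count):
    # LagCount tells me how many target-rate frames' worth of time
    # passed since the last update. Multiplying the per-frame step by
    # it makes the ball cover the right distance even when frames lag.
    return x + BALL_SPEED_X * lag_count

# With no lag (lag_count == 1) the ball moves 4 px per update;
# if two frames' worth of time passed (lag_count == 2) it moves 8 px.
```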

When I design the game, I must know how much to add to the X position to get smooth movement, and also for the ball to travel a specific distance in a specific amount of time. That means I have to assume a framerate, which I will use when syncing my game.
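For example, under an assumed framerate the per-frame step works out like this (the numbers are made up, just to show the arithmetic):

```python
def per_frame_step(distance_px, duration_s, assumed_fps):
    # Total frames that will elapse at the assumed framerate is
    # duration_s * assumed_fps; dividing the distance by that gives
    # the amount to add to X each frame.
    return distance_px / (duration_s * assumed_fps)

# e.g. to travel 300 px in 2.5 s at an assumed 30 fps,
# I would add 300 / (2.5 * 30) = 4 px per frame.
```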

What framerate should that be? If I set it to 30 fps and make my game play as it should, anything below 30 fps would still run OK, since the LagCount variable will take care of that. But then I am limiting my game to 30 fps, and a monster computer would not get any smoother motion at all, even if it is capable of rendering 150 fps. And that is not good.
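As I understand it, the usual way around this cap is to express speed in units per second and multiply by the measured frame time, instead of baking an assumed framerate into the per-frame step. A rough sketch (all names and values are mine, not from my actual code):

```python
BALL_SPEED = 120.0  # pixels per SECOND -- independent of any framerate

def update_ball(x, dt_seconds):
    # dt_seconds is the real time elapsed since the previous frame.
    # At 30 fps dt is about 0.033 s; at 150 fps about 0.0067 s.
    # The ball covers 120 px per second either way, but the faster
    # machine renders five times as many intermediate positions,
    # so it actually gets the smoother motion.
    return x + BALL_SPEED * dt_seconds
```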


I am just not sure what conditions I am supposed to assume when developing the application.



Edit: And one more strange thing: when I set the timer interval to 0 or 1, I get around 75 fps, but the LagCount varies. One timer tick it is 1, the next it is 5 or 6, then 15, then 1, then another run of 15... Is this normal?
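To see whether the jitter is in the timer itself or in my measuring code, I could log the raw time between ticks, something like this (a sketch; `perf_counter` is just Python's high-resolution clock, standing in for whatever timer the game uses):

```python
import time

def measure_tick_intervals(n_ticks):
    """Record the elapsed milliseconds between successive ticks of a
    busy loop, roughly how a 0/1 ms timer fires as fast as the system
    allows. Wildly varying values would point at timer granularity
    rather than at the game code."""
    intervals = []
    prev = time.perf_counter()
    for _ in range(n_ticks):
        now = time.perf_counter()
        intervals.append((now - prev) * 1000.0)  # milliseconds
        prev = now
    return intervals
```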