Why keep client and server ticks in sync?

3 comments, last by hplus0603 5 years, 10 months ago

Hi, I'm working on an FPS game and have had no end of problems trying to get my networking off the ground. Right now I'm using snapshot interpolation to hide lag for remote players. I've been reading basically everything available online about the subject, and the theory all makes sense, but when it comes to implementing it I get mired in weird problems and end up doubting the approach completely. I've been stuck on this for too long now and figured it was time to ask for help.

My current approach is to keep the client and server ticks synced by using a floating tick offset based on round-trip time (and influenced by the reported difference between the last acked tick and the current server tick). What I don't understand is why keeping the two in sync matters at all. It seems to me that no matter what, you should just be taking user input off the stack and applying it to the most recent server state, instead of trying to backtrack or wait until the correct tick. You should then send all clients this most recent state along with the most recent player command number, so the client can reapply its outstanding commands. At first blush it feels like keeping the two ticks synced is just a way to make resimulation time more predictable or something?
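Something like this is what I have in mind for the offset part (a rough sketch; the names, constants, and the 60 Hz tick rate are all made up, not from any particular engine):

```cpp
#include <algorithm>
#include <cstdint>

constexpr double kTickSeconds = 1.0 / 60.0;   // assumed 60 Hz simulation tick

struct TickSync {
    double offsetTicks = 0.0;   // how far ahead of the server we aim to run

    // Called whenever the server reports its current tick and we know the RTT.
    void OnServerUpdate(uint32_t serverTick, double rttSeconds, uint32_t localTick) {
        // Target: server tick + one-way latency in ticks + a small safety margin,
        // so our input for tick T arrives just before the server simulates T.
        double target = serverTick + (rttSeconds * 0.5) / kTickSeconds + 2.0;
        double error  = target - localTick;
        // Nudge the offset smoothly instead of snapping, to avoid visible jitter.
        offsetTicks += std::clamp(error * 0.1, -1.0, 1.0);
    }

    // The local clock then runs slightly fast or slow (e.g. scaling the tick
    // duration by a few percent) to drive offsetTicks toward zero.
};
```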

Also, when you're keeping the tick synced on client and server, are you storing game snapshots on the client based on its local state, or only storing snapshots for frames the server has sent you? If you're only storing snapshots that the server has sent you, it also feels weird to have a tick number on the client at all.

What I keep going back to in my mind is that the server has a tick number, and the client just has a list of recent snapshots received from the server and a list of yet-to-be-acknowledged commands. The client processes local input and sends it to the server with its local command number. The server receives input as quickly as it can (with a small jitter buffer just to keep the inputs spaced out), processes the user input received that tick, advances the simulation, and sends the most recent game state to all clients along with their individual command numbers. Clients receive the update, reapply any commands greater than the command number received, and so on. I don't know if it's just another way of saying basically the same thing, but to my brain this makes more sense.
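In code, the client side of that flow would look something like this (hypothetical types; Simulate() stands in for whatever your movement code is):

```cpp
#include <cstdint>
#include <deque>

struct Command   { uint32_t number; /* buttons, look angles, ... */ };
struct GameState { /* player position, velocity, ... */ };

// Assumed to exist: advances one player state by one command's worth of input.
GameState Simulate(const GameState& s, const Command& c);

struct Predictor {
    std::deque<Command> pending;   // sent but not yet acknowledged
    GameState predicted;

    void OnServerState(const GameState& authoritative, uint32_t lastAckedCommand) {
        // Drop everything the server has already consumed.
        while (!pending.empty() && pending.front().number <= lastAckedCommand)
            pending.pop_front();

        // Re-predict forward from the authoritative state.
        predicted = authoritative;
        for (const Command& c : pending)
            predicted = Simulate(predicted, c);
    }
};
```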

Does that make sense? Am I over thinking this?

 


You need to keep the rate of progress in sync, else the client may run faster or slower than the server.

You typically also want to keep the same tick number sequence, because that allows the game logic to say things like "this bomb will explode on tick X" and have everyone see it at the same time, with the same physical effects (modulo the local player, assuming it's predicted ahead of server time).
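For instance (a toy sketch; the fuse length and the 60 Hz tick rate are just assumptions):

```cpp
#include <cstdint>

constexpr uint32_t kBombFuseTicks = 3 * 60;   // 3 seconds at an assumed 60 Hz

struct Bomb {
    uint32_t explodeTick;
};

Bomb PlantBomb(uint32_t currentTick) {
    // The server replicates explodeTick; every client that shares the same
    // tick sequence detonates the bomb on exactly the same simulation step,
    // rather than at a wall-clock time each machine measures differently.
    return Bomb{ currentTick + kBombFuseTicks };
}

bool ShouldExplode(const Bomb& b, uint32_t currentTick) {
    return currentTick >= b.explodeTick;
}
```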

 

enum Bool { True, False, FileNotFound };

If you want simulation to be consistent, then the client and server need to run SIMULATION at the same rate.

RENDERING doesn't need to be run at the same rate as simulation.

enum Bool { True, False, FileNotFound };

I would not think it's common to run the server at a low simulation rate, because you'll get more problems with objects tunneling through walls, extremely large collision forces, and so forth. The networking rate may be 20 Hz, but simulation is almost always run much faster (60 Hz is common). This means you'll stuff data for three simulation ticks into a single network packet. The rendering code can then use interpolation to run faster than 60 Hz. Another option is to run physics at, say, 240 Hz, and step many times between each rendering frame.
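A minimal sketch of that arrangement, assuming a fixed 60 Hz step and placeholder functions for the timing, simulation, networking, and rendering hooks:

```cpp
#include <cstdint>

constexpr double kDt = 1.0 / 60.0;

// Assumed hooks; stand-ins for your engine's actual code.
double Now();                        // seconds, monotonic
void StepSimulation(double dt);
void SendSnapshot(uint64_t tick);
void Render(double alpha);           // alpha in [0,1): blend between last two states

void GameLoop() {
    double accumulator = 0.0;
    uint64_t tick = 0;
    double previousTime = Now();

    for (;;) {
        double t = Now();
        accumulator += t - previousTime;
        previousTime = t;

        while (accumulator >= kDt) {   // step simulation in fixed-size ticks
            StepSimulation(kDt);
            if (++tick % 3 == 0)       // three 60 Hz sim ticks per 20 Hz packet
                SendSnapshot(tick);
            accumulator -= kDt;
        }
        // Rendering interpolates between the two most recent simulated states.
        Render(accumulator / kDt);
    }
}
```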

There is an alternative, used by Unreal Engine, which is variable frame time; it simulates with as large or as small a timestep as your computer can keep up with. (You can set a maximum step size, so it will run multiple sub-steps if the frame interval is too big.) However, on the server there are no graphics, so it will typically simulate quite small ticks at a time. Variable frame rates end up causing jittery physics, and you obviously can't use lock-step / deterministic simulation this way.
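A sketch of the clamped, sub-stepped variant (this mirrors the idea described above, not Unreal's actual API):

```cpp
// Assumed simulation entry point.
void StepSimulation(double dt);

constexpr double kMaxStep = 1.0 / 30.0;   // assumed maximum step size

void AdvanceVariable(double frameDt) {
    // If the frame took longer than the maximum step, split it into several
    // smaller steps so the physics never integrates over a huge interval.
    while (frameDt > 0.0) {
        double step = (frameDt > kMaxStep) ? kMaxStep : frameDt;
        StepSimulation(step);
        frameDt -= step;
    }
}
```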

Regarding "missing" inputs, your input API will typically deliver a key-down event and a key-up event; what you do if you get both of those events before you simulate physics again, is up to you.

enum Bool { True, False, FileNotFound };
