Top questions to ask before you launch a multiplayer game

With so many moving parts, it’s easy to miss a step when launching a multiplayer game. Here’s our checklist to help you cover it all.

Whether you’re a seasoned vet or an indie developer working on your first title, setting up multiplayer for your game is always complicated. It can be all too easy to miss a step and leave yourself in the lurch. Hell, even pilots have a checklist before takeoff.

So we thought we’d put together the top questions you should be asking yourself. That way, you can be sure you’ve covered the essentials before you launch your multiplayer game.

Should I use client-server or peer-to-peer?

We’ve talked about the difference between client-server and peer-to-peer before, so we won’t get into the nuts and bolts right now. But we recommend going for client-server (a server hosts the game, rather than one of the players) if you’re making a competitive game, need to take input from lots of players at once, have heavy computing needs, or have more than four players in a match.

Either way, make sure you decide which style of hosting you want to use early on. Otherwise, you might run into problems further down the line.

Where should I host my servers?

The distance between your server and your players is really going to impact your latency. So make sure you look at where your players are based and estimate how much traffic you’re likely to get from each location.

Preorders are one way to estimate how popular your game is in each region, and they can also give you a good idea of the range of locations you might need. Start broad – what percentage of your players are in Europe, North America, Africa and Asia?

You want to keep breaking them down until your regions are roughly equal. If 5% of your players come from Asia and another 5% from Africa, but 40% each from North America and Europe, you’ll probably want to chunk those bigger regions out into 5% buckets.
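
Here’s a rough sketch of that kind of estimate, turning preorder counts into region buckets (all numbers and region names are made up for illustration):

```python
# Rough sketch: split preorder counts into regions of roughly equal size.
# All figures and region names here are made up for illustration.
preorders = {
    "Asia": 5_000,
    "Africa": 5_000,
    "North America": 40_000,
    "Europe": 40_000,
    "Oceania": 10_000,
}

total = sum(preorders.values())
target_share = 0.05  # aim for buckets of roughly 5% of players each

for region, count in preorders.items():
    share = count / total
    # A region much bigger than the target probably needs splitting into
    # sub-regions (e.g. US East / US West, or individual countries).
    buckets = max(1, round(share / target_share))
    print(f"{region}: {share:.0%} of players -> split into ~{buckets} sub-region(s)")
```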

How much redundancy do I need?

Does it matter if your servers go down? If you’re making a competitive first-person shooter, keeping them up is probably essential. If you’re making a turn-based RPG, it’s maybe more of a nice-to-have, and a bit of downtime wouldn’t be too disastrous.

Think about the worst-case scenario. How likely is it you’ll get surges of players in a region? How reliable are the providers in that location? How many players would it affect? From that, you can decide how much redundancy you might need for each region.
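
As a back-of-the-envelope sketch, you might size a region something like this; every input here is an assumption you’d replace with your own estimates:

```python
import math

# Back-of-the-envelope redundancy sizing for one region. Every input is an
# assumption to be replaced with your own estimates and measurements.
peak_sessions = 800        # expected concurrent sessions at peak
surge_factor = 1.5         # how big a surge you want to absorb
sessions_per_server = 20   # from your resource profile (covered below)
spare_servers = 2          # headroom for a machine or provider failure

needed = math.ceil(peak_sessions * surge_factor / sessions_per_server)
print(f"Provision ~{needed + spare_servers} servers in this region "
      f"({needed} for the surge, {spare_servers} spare)")
```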

What’s the maximum latency I’ll allow?

You want the lowest latency possible, obviously. But realistically, you can’t expect every player to be sitting next to a data centre, plugged straight into the hosting machine.

For an intense, competitive game, you’ll ideally want a latency of around 30ms. But maybe you’ll stretch to 60ms as an absolute maximum. With that in mind, you can plot out how many servers you’ll need in a region to get that coverage. The lower the latency, the more servers you’ll need.

But maybe latency isn’t so important to you. For example, a turn-based game could drift into the triple digits, and probably still feel pretty responsive. Test it out and note down at what point it starts feeling frustrating as a player.
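
If it helps, here’s a very rough sanity check of whether a data centre is close enough for a given latency budget; the speed-of-light-in-fibre figure is a standard approximation, and the routing overhead is an assumption:

```python
# Very rough sanity check: signals in fibre travel at roughly 200,000 km/s,
# and real routes add overhead, so treat this as an estimate, not a guarantee.
def estimated_rtt_ms(distance_km: float, overhead_ms: float = 10.0) -> float:
    # Round trip (there and back) converted to milliseconds, plus routing slack.
    return (2 * distance_km / 200_000) * 1000 + overhead_ms

for distance in (500, 1_500, 3_000):
    rtt = estimated_rtt_ms(distance)
    verdict = "fine for a 30ms budget" if rtt <= 30 else "needs a closer region"
    print(f"{distance:>5} km away -> ~{rtt:.0f}ms round trip ({verdict})")
```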

How many players do I want in a session?

Are you making a battle royale with a hundred players in a single match or a more streamlined 5v5 sports game? Are you going to have different modes?

The number of players you have is going to directly impact how much computing power you need on the server side. More players, more power, more cost. There are ways you can mitigate that – such as getting the player-side image to do some of the heavy lifting. Knowing how many players you’re expecting will help you determine how much you need to optimise your game.
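
As a very rough illustration of why player count matters, here’s a sketch of how per-session bandwidth could grow if the server sends every player an update about every other player each tick; all the per-player numbers are assumptions:

```python
# Rough illustration of how per-session load can grow with player count, if
# the server broadcasts every player's state to every other player each tick.
# The tick rate and bytes-per-update figures are made-up assumptions.
def session_bandwidth_kbps(players: int, update_hz: int = 30,
                           bytes_per_update: int = 60) -> float:
    # Each tick, every player receives an update about every other player,
    # so traffic grows roughly with the square of the player count.
    bytes_per_second = players * (players - 1) * bytes_per_update * update_hz
    return bytes_per_second * 8 / 1000

for count in (10, 50, 100):
    print(f"{count:>3} players -> ~{session_bandwidth_kbps(count):,.0f} kbps per session")
```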

Which leads us neatly to our next question.

What’s the resource profile for my game?

Your resource profile is how much computing power the server needs to actually run your game. The bigger the resource profile, the fewer sessions you can host on a single server.
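
A minimal way to think about it, with made-up numbers: measure the CPU and memory one session needs, then see how many sessions fit on one machine, letting the tighter limit win.

```python
import math

# Minimal resource-profile sketch with made-up numbers. Measure your own
# game's per-session CPU and memory, then see how many sessions fit on one
# machine; whichever limit is tighter decides the answer.
cpu_cores_per_session = 0.5
memory_mb_per_session = 750

server_cores = 16
server_memory_mb = 32_000

by_cpu = math.floor(server_cores / cpu_cores_per_session)
by_memory = math.floor(server_memory_mb / memory_mb_per_session)
sessions_per_server = min(by_cpu, by_memory)

print(f"~{sessions_per_server} sessions per server "
      f"(CPU allows {by_cpu}, memory allows {by_memory})")
```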

If you’re making an auto-chess game, where the outcome is always predetermined based on what creatures the player picks, maybe you could do the calculations separately on each player’s machine. As long as there aren’t any random values, it’d run identically for every player.

But in other cases, the server might need to take on that role. In first-person shooters, this is a common problem. Which machine decides whether the shot landed? The shooter’s? It’d certainly feel fair for the shooter, but the victim might have already rounded the corner on their own screen. In which case, you might want the server to decide.

What else might you be able to delegate to the player’s image? And what absolutely needs to happen on the server? Physics calculations or anything tied into a random number generator, for example, might need to happen centrally to make sure it displays the same for each player.
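
To make that concrete, here’s a minimal sketch of keeping authority on the server, with the positions and the random number generator living centrally; real implementations usually add lag compensation on top, and all the names and numbers here are illustrative:

```python
import random

# Minimal sketch of server authority: the server holds the single source of
# truth for positions and the random number generator, so every client sees
# the same outcome. Names and numbers are illustrative only.
class ServerWorld:
    def __init__(self, seed: int):
        self.rng = random.Random(seed)   # one central RNG, never on clients
        self.positions = {}              # player_id -> (x, y)

    def validate_shot(self, shooter_id: str, target_id: str, max_range: float) -> bool:
        # Decide the hit from the server's own state, not the shooter's claim.
        sx, sy = self.positions[shooter_id]
        tx, ty = self.positions[target_id]
        distance = ((sx - tx) ** 2 + (sy - ty) ** 2) ** 0.5
        if distance > max_range:
            return False
        # A centralised random roll (e.g. weapon spread or a critical hit).
        return self.rng.random() < 0.9

world = ServerWorld(seed=42)
world.positions = {"shooter": (0.0, 0.0), "victim": (3.0, 4.0)}
print(world.validate_shot("shooter", "victim", max_range=10.0))
```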

Have I stress-tested my game?

During your testing process, you’ll want to see what happens if you overload your servers. If you can, try and get a large group to pound the servers at the same time.

If you don’t have real people, you could always use bots. Get a thousand virtual machines to all attempt to join a match at the same time, from the same location. What happens? Does every instance get into the match? And how long are the wait times?

With that knowledge, you can figure out where your bottlenecks are. For example, maybe your matchmaker just can’t filter through the requests quickly enough. In this case, you could set up multiple instances of the matchmaker to get through the queue quicker.
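
If you want a starting point for the bot approach, here’s a rough sketch that fires a burst of simultaneous join requests and records how long each one waits; the matchmaker URL and payload are placeholders for whatever your own service exposes:

```python
import json
import time
from concurrent.futures import ThreadPoolExecutor
from urllib import error, request

# Rough load-test sketch: hammer a matchmaker with simultaneous join requests
# and record the wait times. The URL and payload are placeholders.
MATCHMAKER_URL = "https://matchmaker.example.com/join"  # hypothetical endpoint
BOTS = 1_000

def join_as_bot(bot_id: int) -> float:
    payload = json.dumps({"player": f"bot-{bot_id}"}).encode()
    req = request.Request(MATCHMAKER_URL, data=payload,
                          headers={"Content-Type": "application/json"})
    start = time.monotonic()
    try:
        with request.urlopen(req, timeout=30):
            pass
    except error.URLError:
        return float("inf")  # treat a failure as never getting into a match
    return time.monotonic() - start

with ThreadPoolExecutor(max_workers=BOTS) as pool:
    waits = list(pool.map(join_as_bot, range(BOTS)))

joined = [w for w in waits if w != float("inf")]
if joined:
    print(f"{len(joined)}/{BOTS} bots got in; worst wait {max(joined):.1f}s")
else:
    print("No bot made it into a match")
```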

How will I roll out new versions?

As a general rule of thumb, never hard-code anything that might change. Make sure that you can easily configure a variable on a server somewhere, rather than needing to roll out a completely new version.

In fact, both Brian Jordan from Doborog and Stephen Clayburn from Bungie mentioned this in our fireside chat at Devcom earlier this year.
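
In practice, that can be as simple as reading tunables from the environment or a remote config endpoint instead of baking them into the build; the URL and keys below are placeholders:

```python
import json
import os
from urllib import request

# Sketch of loading tunable values at runtime instead of hard-coding them.
# The config URL and the keys are placeholders for your own setup.
CONFIG_URL = os.environ.get("GAME_CONFIG_URL", "https://config.example.com/live.json")

def load_config() -> dict:
    defaults = {"max_players": 10, "round_time_seconds": 300}
    try:
        with request.urlopen(CONFIG_URL, timeout=5) as resp:
            defaults.update(json.load(resp))
    except Exception:
        pass  # fall back to the shipped defaults if the config service is unreachable
    return defaults

config = load_config()
print(config["max_players"], config["round_time_seconds"])
```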

Eventually, you’ll always have a new version, though. So how will you deal with that? What if the update comes out while someone is playing? Do you boot them out of the game to go update? Or do you keep a few older versions running and let them peter out over time? For example, you might get the player image to tell your orchestrator which version it’s running. The orchestrator can then have the latest two versions of your game image prepared, essentially treating it like two separate games.
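
Here’s a sketch of what that routing might look like, treating each supported version as its own fleet; the structures and names are illustrative, not a real orchestrator API:

```python
# Sketch of version-aware routing: keep the latest two versions running and
# send players to a fleet matching their client. Names are illustrative only.
SUPPORTED_VERSIONS = ["1.4.0", "1.5.0"]   # the two most recent versions

fleets = {
    "1.4.0": ["server-a", "server-b"],              # left to peter out over time
    "1.5.0": ["server-c", "server-d", "server-e"],  # current version
}

def route_player(client_version: str) -> str:
    if client_version not in SUPPORTED_VERSIONS:
        raise ValueError("Client too old: prompt the player to update")
    # Treat each version like a separate game and pick a server from its fleet.
    return fleets[client_version][0]

print(route_player("1.4.0"))
```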

We’re actually working on a feature right now that lets you put a server in a ‘condemned’ state. Alive, but dying. It won’t accept new players, but it won’t fully switch off until all the players have left. This is particularly useful if you have a mode that is more “drop-in, drop-out”, like a dungeon instance.

How will I report my metrics?

You’ll want to make sure you can grab data after a match. But how exactly will you do it? Will you create a file on the client machine and download it after the match? Will you stream live data to a separate server or third-party tool?
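
Either way works; here’s a rough sketch of both options side by side, with the file name and the analytics endpoint as placeholders:

```python
import json
import time
from urllib import request

# Two rough options for getting match data out: drop a file to collect later,
# or post it straight to an analytics endpoint. Both the file name and the
# URL are placeholders.
def report_match(stats: dict) -> None:
    stats["reported_at"] = time.time()
    payload = json.dumps(stats).encode()

    # Option 1: write a file on the machine and pull it after the match.
    with open(f"match-{stats['match_id']}.json", "wb") as f:
        f.write(payload)

    # Option 2: stream it to a separate server or third-party tool.
    req = request.Request("https://metrics.example.com/matches", data=payload,
                          headers={"Content-Type": "application/json"})
    request.urlopen(req, timeout=10)

report_match({"match_id": "abc123", "duration_s": 542, "winner": "blue"})
```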

What other services will I need?

Matchmakers. Cross-platform services. Player databases. You might have multiple backend systems that you need to request information from. How are you going to do that?

Ideally, you’ll want to simplify the process as much as possible and make sure that your different services can understand each other’s data. Keep your variables consistent between those third-party tools so you can easily share the data.
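
One lightweight way to do that is to define the record shape once and reuse it everywhere; the fields below are illustrative:

```python
from dataclasses import dataclass, asdict

# Sketch of a single shared player record, so the matchmaker, player database
# and cross-platform services all agree on field names. Fields are illustrative.
@dataclass
class PlayerRecord:
    player_id: str       # the same key in every service
    platform: str        # e.g. "pc", "xbox", "playstation"
    region: str
    skill_rating: int

record = PlayerRecord("p-123", "pc", "eu-west", 1450)
payload = asdict(record)   # identical field names in every payload you send
print(payload)
```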

Do I need help figuring out my infrastructure?

Getting together providers, figuring out your architecture and deciding how many servers you’re going to need can be a daunting task. But if you need a hand, we’re here to help. Just get in touch and we’ll work with you to decide on the best way to set up your multiplayer infrastructure, linking it all together with our orchestrator.