High level question: dedicated multiplayer servers with database

Hey guys!

This is my first post here, and I’d like to start a high-level discussion about dedicated servers and using a database with those servers.

I’m currently taking the Unity Multiplayer (Mirror): Intermediate C# & Networking course - which is pretty nice - but there are some things it doesn’t cover that I’d like to ask about in a general way.

Let’s look at different scenarios:

  • RTS games (in my understanding they work like this):
    • Client in menu: displays all the information a player has: name, level, gold, shop, etc… It gets this information from the server, which in turn gets it from a database. So when the game loads, the server queries all the data it needs and displays it on the client.

      • Question 1: let’s say you’re running a successful game with around 10,000 daily players. Wouldn’t the queries just to display that information pretty much eat up every resource the database has? If we account for the additional queries that change player data, wouldn’t it melt down the database server?

      • Solution 1: have more databases and sync them regularly, but as far as I can tell that would create even bigger problems, no?

      • Solution 2: upgrade the database hardware to handle more requests, which is pricey.

    • Client in game: basically the same as in the menu, but it displays the units, resources, etc. the player has in the current match. This data is kept on the server that handles the match, because it wouldn’t make sense to persist every unit’s position, gold amount, building position, and so on. The exception would be a replay feature, in which case everything is saved every second (?). If there is no replay feature, the game data is only saved after each match, to display how many units you produced, how many resources you gathered… right?

      • Question 2: is every match handled by a dedicated server, and if so, how were those deployed before containerization? Nowadays you can just deploy a container with your server whenever a player wants to play a match, right? Did they use one server to handle multiple matches to be more efficient?

For instance, let’s look at Clash Royale, a mobile RTS / tower-defense game. How do they handle such a massive player base? Obviously they are not an indie or even a small company, so they have plenty of resources to burn, but I’m interested in how such a massive amount of data is handled flawlessly.

  • Simulation games (like Hay Day or FarmVille or Idle Heroes)

    • These probably work very similarly to an RTS game, but the menu and game phases are merged together. So let’s say that whenever the game starts, the server queries your data from the database and loads it in, so it doesn’t have to make a query every time you want to do something. Say you’d like to build a house that costs X gold: since the server already has your gold balance, it checks whether you can afford it, and if you can, it deducts the gold from your balance and saves the new balance, along with the built house’s data, to the database.

      • Question 3: with this thinking, every small change (building, spending gold, moving a building, etc.) means calling the database. This leads back to my first question, but in this case it’s even more noticeable. How do you handle so many requests?

      • Solution 1: save data less frequently, but this would defeat the whole point of saving everything into a database, because you’d lose data if the server died, right?

      • Solution 2: save less data to the database, but that has the same problem as above.

  • Question 4: how do these servers work? One server obviously handles multiple clients in this case, I’m pretty sure. Do they use a load balancer server that receives every request and relays it to other servers? If so, how do these load balancer servers not burn up, since they get every request?

So hopefully I’ve managed to communicate my high-level questions. I think they are pretty interesting, and I suspect a lot of people are as interested as I am but, like me, have only a basic understanding of these topics. I’m really hoping we can open up a discussion about these areas.

I’m waiting for YOUR interaction! :wink:

Hi there, lots of great questions here!

The first thing I would like to do is explain how most companies work with servers. They use a service (usually provided by Amazon, Microsoft or Google) to rent server time. These companies have servers all over the world (close to potential users), available for use. If you have a fixed amount of server usage, you would just rent a fixed number of servers (or, if your scale is small enough, you could run your own physical server). But most applications, like websites, web apps or games, have a constantly changing amount of server usage. So what you do instead is pay for a variable amount of server usage. The provider then supplies servers as needed by traffic and geographical requirements, “spinning up” servers as needed to provide the necessary response time and compute power to handle the load. You may know all this already, but it is a good primer.

For example, you may have 5 servers running to handle your minimum amount of traffic. Maybe at 7:30pm in your region, traffic goes up because more users are free to use your application. Your provider might then provide another 5 servers to handle the extra traffic so you have 10 servers running.
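That scaling decision can be sketched as a simple threshold rule. This is a toy illustration, not any provider's real algorithm, and the per-server capacity is a made-up number:

```python
# Toy autoscaling rule, assuming each server comfortably handles
# ~100 concurrent users (an invented capacity for illustration).
USERS_PER_SERVER = 100
MIN_SERVERS = 5          # always keep a baseline running

def servers_needed(concurrent_users: int) -> int:
    """Return how many servers to run for the given load."""
    by_load = -(-concurrent_users // USERS_PER_SERVER)  # ceiling division
    return max(MIN_SERVERS, by_load)

print(servers_needed(300))   # quiet hours: stays at the minimum of 5
print(servers_needed(950))   # 7:30pm spike: scales up to 10
```

Real providers add hysteresis and cooldown periods so servers aren't constantly started and stopped, but the core idea is the same.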

Question 1: in this case, what you need to look at is how many queries occur at the same time, and how much data each query sends and receives. Those two factors are what bottleneck your servers. If you have 10,000 daily players, the chance that they are all querying simultaneously when logging in is actually relatively low. So what you should do is compare your expected usage against what the servers can handle, and estimate your expected costs from that.
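To make that concrete, here is a back-of-envelope estimate. Every number below is an assumption for illustration, not a benchmark:

```python
# Rough load estimate for the "10,000 daily players" scenario.
daily_players = 10_000
logins_per_player = 3      # assumed sessions per player per day
queries_per_login = 5      # e.g. profile, level, gold, shop, friends
peak_fraction = 0.10       # assume 10% of daily traffic in the busiest hour

daily_queries = daily_players * logins_per_player * queries_per_login
peak_qps = daily_queries * peak_fraction / 3600   # queries/sec at peak

print(f"{daily_queries} queries/day, ~{peak_qps:.1f} queries/sec at peak")
```

With these numbers you land around 150,000 queries a day but only a handful per second at peak, and even a modest database server can handle hundreds of simple queries per second. The point of the exercise is that "lots of players" does not automatically mean "melted database" - it depends on queries per second, not totals.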

The solution is really a combination of your two solutions. If you have, or expect to have, many users logging in at the same time and all making database calls, your server provider can scale up the available servers to handle the extra calls. By using a large provider with many servers available, and renting them only as needed for spikes in traffic, you can keep your costs down. You may also want multiple databases (usually separated by geographical region). Multiple database servers give faster response times, since players are closer to them. And since most users are mainly interested in data from their own region, the delay required to sync the databases is acceptable. There are pros and cons to this, of course, depending on your specific application.

Question 2: yes, one server can handle a number of matches, depending on how powerful the server is. You (or your game server provider) can check the servers for available resources and deploy new game instances to servers that have enough. The server is probably not calling the database during the match as you suggest; database updates can be done at the end of the match. Real-time data is handled by the game server itself, which only calls the database when it needs new information. This is because the round trip to the database would introduce a large delay into the game loop (lots of lag), which isn’t suitable for real-time games. (This behaviour probably varies wildly depending on the type of game.)
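The "keep match state in memory, write once at the end" pattern can be sketched like this. All class and method names here are invented for illustration:

```python
# Minimal sketch: a match server keeps real-time state in memory and
# touches the database only once, when the match ends.
class FakeDb:
    """Stand-in for a real database."""
    def __init__(self):
        self.rows = []

    def save_match_result(self, player, stats):
        self.rows.append((player, stats))

class Match:
    def __init__(self, players):
        # Per-player stats live in memory for the whole match.
        self.stats = {p: {"units_built": 0, "gold_gathered": 0} for p in players}

    def on_unit_built(self, player):
        self.stats[player]["units_built"] += 1         # no DB call here

    def on_gold_gathered(self, player, amount):
        self.stats[player]["gold_gathered"] += amount  # no DB call here

    def end(self, db):
        # One batched write per player instead of a query per in-game event.
        for player, stats in self.stats.items():
            db.save_match_result(player, stats)

db = FakeDb()
match = Match(["alice", "bob"])
match.on_unit_built("alice")
match.on_gold_gathered("alice", 50)
match.end(db)
print(db.rows)   # one row per player, written in a single flush
```

A replay feature would change this: then you would stream events to storage during the match, which is exactly why replays are a real cost decision.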

Question 3: again, depending on your application, the game server is probably not calling the database very often. The game would probably fetch the cost of the house from the database when the server is initialized, and would only save your gold balance back to the database when you trigger a cloud save (e.g. Save & Quit, or an auto-save).
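Here is a hedged sketch of that "load once at login, save once at quit" idea, using the house-building example from the question. All names are invented, and the fake database counts queries so the savings are visible:

```python
# Sketch: the server loads a player's data once, serves gameplay from
# memory (with server-authoritative checks), and writes back only on save.
class FakeDb:
    """Stand-in for a real database; counts queries for illustration."""
    def __init__(self):
        self.gold = {"alice": 100}
        self.queries = 0

    def load_gold(self, player_id):
        self.queries += 1
        return self.gold[player_id]

    def save_gold(self, player_id, amount):
        self.queries += 1
        self.gold[player_id] = amount

class PlayerSession:
    def __init__(self, db, player_id):
        self.db = db
        self.player_id = player_id
        self.gold = db.load_gold(player_id)   # one query at login
        self.dirty = False

    def build_house(self, cost):
        if self.gold < cost:                  # server-authoritative check
            return False
        self.gold -= cost                     # in-memory only, no query
        self.dirty = True
        return True

    def save(self):
        if self.dirty:                        # one query at Save & Quit
            self.db.save_gold(self.player_id, self.gold)
            self.dirty = False

db = FakeDb()
session = PlayerSession(db, "alice")
session.build_house(30)
session.build_house(30)
session.save()
print(db.gold["alice"], db.queries)   # 40 gold left, only 2 queries total
```

Dozens of in-game actions collapse into two database queries. The trade-off is the one you already spotted: anything changed after the last save is lost if the server dies, so real games pick an auto-save interval that balances durability against query volume.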

Again your solutions are correct. You would only save when necessary. Depending on your application you would balance convenience with costs. Minimizing the amount of data going to and from the database would also be an important part of optimizing your game.

Question 4: this definitely depends on your exact server technology. You have game servers that can handle a certain number of game instances, and web servers that can host databases and handle a certain number of web requests. Again, when the servers can’t handle the traffic, you either need more servers or have to wait for resources to free up. As for why the load balancer itself doesn’t burn up: it does very little work per request (it just decides where to forward traffic), so one balancer can front many busy backends, and large deployments run several balancers anyway.
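A toy round-robin load balancer makes the "very little work per request" point concrete. This is a hypothetical sketch, not any real product's behaviour:

```python
# Toy round-robin load balancer: it only decides which backend gets each
# request, so its per-request cost is tiny compared with the game or web
# servers that actually do the processing.
import itertools

class LoadBalancer:
    def __init__(self, backends):
        self._cycle = itertools.cycle(backends)

    def route(self, request):
        return next(self._cycle)   # cheap: just pick the next server in rotation

lb = LoadBalancer(["game-1", "game-2", "game-3"])
routed = [lb.route(f"req-{i}") for i in range(6)]
print(routed)   # each backend receives two of the six requests
```

Real balancers use smarter strategies (least connections, health checks, geographic routing), but the asymmetry is the same: forwarding is cheap, serving is expensive.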

This is my understanding of how much of this technology works. I am by no means an expert, and there are many ways to implement the same solutions. For games, I would read up on AWS (Amazon Web Services) or UGS (Unity Gaming Services) to understand how these companies host game servers. Our new Unity Netcode course covers creating dedicated Linux servers with UGS and might help you see how they run.
