Games with only a few players (say 8 or fewer) can simply send all updates to all other players. In these types of games, most players are usually grouped together anyway, and even if they are not, the overall bandwidth is low enough that it’s not a big deal. The “wasted” bandwidth isn’t necessarily a bad thing: it makes programming easier, it’s more accurate, and it lets you do things such as game-level radars easily.
Games with more players, such as your average shooter (32-64), should only send updates to players that are nearby. The usual way to do this is to base send frequency on relevance: a player farther away gets fewer sends, while a player I am facing gets more frequent sends. This is an O(n^2) problem, because for every player I have to determine the distance and orientation to every other player. In the worst case of 64 players, that is 4,096 operations. However, since this runs at most once per packet send interval, it’s not a big deal.
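As an illustration of that relevance-based scheduling, here is a minimal sketch in C++; the structures, constants, and thresholds are my own assumptions, not code from any particular engine.

```cpp
#include <cmath>

struct Vec2 { float x, y; };

struct Player {
    Vec2 pos;
    Vec2 facing;   // unit vector the player is looking along
};

// How often (in milliseconds) 'viewer' should receive updates about 'other'.
// Farther away -> larger interval; roughly in front of the viewer -> smaller.
float SendIntervalMs(const Player& viewer, const Player& other)
{
    float dx = other.pos.x - viewer.pos.x;
    float dy = other.pos.y - viewer.pos.y;
    float dist = std::sqrt(dx * dx + dy * dy);

    // Base interval grows with distance, capped at once per second (assumed numbers).
    float interval = 50.0f + dist * 2.0f;
    if (interval > 1000.0f) interval = 1000.0f;

    // If 'other' is within roughly 60 degrees of where the viewer is facing, send twice as often.
    if (dist > 0.0f) {
        float dot = (dx * viewer.facing.x + dy * viewer.facing.y) / dist;
        if (dot > 0.5f) interval *= 0.5f;
    }
    return interval;
}
```

Calling something like this for every viewer/other pair is exactly the O(n^2) pass described above.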
MMOGs such as mine, with many more players (500 on one server), face the same O(n^2) scenario as above, but now there are simply too many players: done the straightforward way, that would be 250,000 operations. In my particular case, while I do have a maximum send interval, I cheat on it in many ways, so in essence I need to recalculate this every frame.
The solution I came up with is to split the world up into a grid. Every update, all objects go into one or more cells of the grid, depending on how big they are. This is an O(n) operation, and a pretty fast O(n). Then, for each player, I query the grid for the nearby neighbors. The results go into a sorted list, which I use both to get the list of all nearby neighbors and to check whether a particular object is nearby.
Total cost is O(n) for the grid sectorization, plus O(n) to go through each player and gather its sorted list of nearby neighbors. Inserting into a sorted list is slower than appending, but since there are on average only about 10 items it’s not a big deal.
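Here is a minimal sketch of the grid sectorization and neighbor query described above, assuming a fixed-size square world; the names, cell size, and grid dimensions are illustrative rather than the actual server code.

```cpp
#include <algorithm>
#include <vector>

constexpr float kCellSize = 100.0f;  // world units per grid cell (assumed)
constexpr int   kGridDim  = 256;     // grid is kGridDim x kGridDim cells (assumed)

struct Object { int id; float x, y, radius; };

struct Grid {
    // One bucket of object pointers per cell; rebuilt every update.
    std::vector<std::vector<const Object*>> cells;

    Grid() : cells(kGridDim * kGridDim) {}

    static int CellIndex(int cx, int cy) { return cy * kGridDim + cx; }

    void Clear() { for (auto& c : cells) c.clear(); }

    // O(n) over all objects: each one goes into every cell its bounds overlap.
    void Insert(const Object& o) {
        int minX = std::max(0, int((o.x - o.radius) / kCellSize));
        int maxX = std::min(kGridDim - 1, int((o.x + o.radius) / kCellSize));
        int minY = std::max(0, int((o.y - o.radius) / kCellSize));
        int maxY = std::min(kGridDim - 1, int((o.y + o.radius) / kCellSize));
        for (int cy = minY; cy <= maxY; ++cy)
            for (int cx = minX; cx <= maxX; ++cx)
                cells[CellIndex(cx, cy)].push_back(&o);
    }

    // Gather everything in the 3x3 block of cells around (x, y).
    void QueryNearby(float x, float y, std::vector<const Object*>& out) const {
        int cx = int(x / kCellSize), cy = int(y / kCellSize);
        for (int dy = -1; dy <= 1; ++dy)
            for (int dx = -1; dx <= 1; ++dx) {
                int nx = cx + dx, ny = cy + dy;
                if (nx < 0 || ny < 0 || nx >= kGridDim || ny >= kGridDim) continue;
                const auto& cell = cells[CellIndex(nx, ny)];
                out.insert(out.end(), cell.begin(), cell.end());
            }
        // An object straddling cells can show up twice, so sort and de-duplicate.
        std::sort(out.begin(), out.end());
        out.erase(std::unique(out.begin(), out.end()), out.end());
    }
};
```

Each update the grid is cleared, every object is re-inserted, and then each player calls QueryNearby once.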
An “operation” is a bit misleading since different things that I do have different costs. But for the sake of argument…
500 players each with 10 objects nearby =
500 operations to place the players into the grid +
500 players each inserting their ~10 nearby objects into a sorted list. Each insert costs roughly O(log2(m) + m), where m is the number of objects already in the list, so the 10 inserts per player cost about Sum(1…10) = 55 operations (see the sketch below).
500 + 500*55 = 28,000. It’s a lot of operations, but still far less than 250,000, and scales linearly.
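For reference, this is the kind of sorted insert the Sum(1…10) above is counting (a sketch only): finding the position is a binary search, and the shift to make room is what grows with the list size.

```cpp
#include <algorithm>
#include <vector>

// Insert an object id into a list kept sorted, skipping duplicates.
// Cost per call: O(log m) for the search plus O(m) for the shift,
// so filling a 10-entry list costs roughly 1 + 2 + ... + 10 = 55 moves.
void InsertSorted(std::vector<int>& nearbyIds, int id)
{
    auto pos = std::lower_bound(nearbyIds.begin(), nearbyIds.end(), id);
    if (pos == nearbyIds.end() || *pos != id)
        nearbyIds.insert(pos, id);
}
```

Checking whether a particular object is nearby is then just a binary search (std::binary_search) on the same list.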
I further decrease this by exploiting the fact that the network nearby-neighbor list doesn’t have to be perfectly accurate. It’s good enough to update it every 100 milliseconds instead of every tick. So every 100 milliseconds I do the full-blown update; on the other ticks I just cull out deleted pointers and reuse the previous list.
Assuming 33-millisecond ticks, the full 28,000-operation update only runs on roughly every third tick, so this averages out to about 9,300 operations per tick.
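Putting the timing together (and reusing the hypothetical Object, Grid, and Player types from the sketches above), the amortized update could look roughly like this; IsDeleted() stands in for whatever object-lifetime check the server actually uses.

```cpp
#include <algorithm>
#include <cstdint>
#include <vector>

bool IsDeleted(const Object* o);  // hypothetical hook into object lifetime tracking

struct NeighborCache {
    std::vector<const Object*> nearby;   // result of the last full query
    uint64_t lastFullUpdateMs = 0;
};

void UpdateNeighbors(NeighborCache& cache, const Grid& grid,
                     const Player& player, uint64_t nowMs)
{
    if (nowMs - cache.lastFullUpdateMs >= 100) {
        // Full-blown update: requery the grid (the ~28,000-operation pass).
        cache.nearby.clear();
        grid.QueryNearby(player.pos.x, player.pos.y, cache.nearby);
        cache.lastFullUpdateMs = nowMs;
    } else {
        // Cheap tick: just cull pointers to objects deleted since the last full update.
        cache.nearby.erase(
            std::remove_if(cache.nearby.begin(), cache.nearby.end(), IsDeleted),
            cache.nearby.end());
    }
}
```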
2 replies on “MMOG nearby neighbor network culling”
Woot!
This was the subject of my final thesis at university. Unfortunately I haven’t translated it to English yet (it is written in Portuguese). The title was “Client/Server for coordinate based multiuser systems”, and we adopted the same technique to select which clients we had to send updates to.
In a hypothetical situation where the clients were distributed homogeneously in space, we calculated that the algorithm performed only about 2.5% of the comparisons of the brute-force comparison algorithm.
The prototype code is pretty much unreadable, but it is available at http://www.mytos.com.br/files/code/articles/tridrac.7z (yep, I use 7-zip), and the book is here: http://www.mytos.com.br/files/code/articles/book_pt.pdf
The interesting part is that the other day I was reading “Game Programming Gems 2” and saw an article called “Direct Access Quadtree Lookup” by Matt Pritchard. It explains how to access the quadtree level where an object is located directly, without traversing all the levels above it. That could be very nice to use when you have objects of different sizes (in my case I was considering only point objects with no area).
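As a rough sketch of the general idea only (not Pritchard’s actual code): the quadtree level can be derived directly from the object’s size, and the cell coordinates at that level from its position, with no traversal of the upper levels.

```cpp
#include <algorithm>
#include <cmath>

constexpr float kWorldSize = 4096.0f;  // side length of a square world (assumed)
constexpr int   kMaxLevel  = 8;        // deepest quadtree level (assumed)

// Deepest level whose cells are still big enough to contain the object whole.
int LevelForSize(float objectSize)
{
    if (objectSize <= 0.0f) return kMaxLevel;  // point objects go to the deepest level
    int level = int(std::floor(std::log2(kWorldSize / objectSize)));
    return std::clamp(level, 0, kMaxLevel);
}

// Cell coordinates of a position at a given level (cells per side = 2^level).
void CellAtLevel(float x, float y, int level, int& cx, int& cy)
{
    float cellSize = kWorldSize / float(1 << level);
    cx = int(x / cellSize);
    cy = int(y / cellSize);
}
```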
I have a similar grid approach in mind, with entities stored in a hash map (each grid cell is identified by a hash). But I haven’t implemented it yet.
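A minimal sketch of that hashed-grid idea, assuming integer cell coordinates packed into a single key; the names and cell size are my own.

```cpp
#include <cstdint>
#include <unordered_map>
#include <vector>

constexpr float kHashCellSize = 100.0f;  // world units per cell (assumed)

struct Entity { int id; float x, y; };

// Pack the signed cell coordinates into one 64-bit key for the hash map.
uint64_t CellKey(float x, float y)
{
    int32_t cx = int32_t(x / kHashCellSize);
    int32_t cy = int32_t(y / kHashCellSize);
    return (uint64_t(uint32_t(cx)) << 32) | uint64_t(uint32_t(cy));
}

// Only occupied cells exist in the map, so memory scales with entities, not world size.
using SpatialHash = std::unordered_map<uint64_t, std::vector<const Entity*>>;

void Insert(SpatialHash& grid, const Entity& e)
{
    grid[CellKey(e.x, e.y)].push_back(&e);
}
```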