On 8/8/13 5:06 PM, Krysk wrote:
Hi Krysk

I had similar design constraints when trying to match up to four human or
computer players for a card game. The first, monolithic approach gave me a
better feeling for how long matchmaking actually takes: today we seldom see
more than ten tables in the matchmaking process, while up to two thousand
users are playing cards. Matchmaking isn't the fun part of the game, so users
get it over with pretty fast.

After observing user behavior for more than a year, I spread the game logic
out to separate game servers, with a central matchmaking process maintaining
all the metadata, doing the load balancing for the game servers, and handling
the broadcasting of status and activity information to players. The metadata
stored and passed around is the usual stuff: game skill level, likeability,
friends, blocked users, number of games played, and so on. The data is kept
in a MySQL DB, fetched at login, and passed around with the player instance.
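
A minimal sketch of that fetch-at-login step, assuming twisted.enterprise.adbapi
and an invented table/column layout (the real schema isn't shown here):

    from twisted.enterprise import adbapi

    # Hypothetical schema and credentials; only the pattern matters.
    dbpool = adbapi.ConnectionPool("MySQLdb", db="cardgame",
                                   user="game", passwd="secret")

    class Player(object):
        """Metadata fetched once at login and passed around with the session."""
        def __init__(self, player_id, skill, friends, blocked, games_played):
            self.player_id = player_id
            self.skill = skill
            self.friends = friends
            self.blocked = blocked
            self.games_played = games_played

    def load_player(player_id):
        d = dbpool.runQuery(
            "SELECT skill, friends, blocked, games_played"
            " FROM players WHERE id = %s", (player_id,))
        return d.addCallback(lambda rows: Player(player_id, *rows[0]))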

So far this scheme balances very well; if we ever needed to handle a lot more
users, I would move the matchmaking process to a dedicated machine.

The whole setup, for more than 50k games played to completion per day (about
13 minutes average play time per game), is handled by a single-processor,
8-core machine with 24GB of RAM; we usually run no more than 5-6 game logic
server processes. The machine is well balanced and extremely stable; no
runaway situation has been observed since deploying the system two years ago.
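
A back-of-the-envelope check of that load, using only the figures above:

    games_per_day = 50000
    avg_minutes = 13.0
    # game-minutes per day, spread over the 24*60 minutes of a day:
    concurrent_tables = games_per_day * avg_minutes / (24 * 60)  # ~450 tables in play
    tables_per_process = concurrent_tables / 6                   # ~75 per game server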

The bottleneck I foresee in our case is the 100MB/s connection we have at the
hosting center; currently we are only allowed one interface.

For me, avoiding the sharing of metadata during matchmaking was crucial. I
didn't fear the sharing itself so much as the latency induced by sharing
metadata among processes or machines, because that added latency means a lot
more incongruous things happening in the user's experience. On-screen
matchmaking with manually selected partners already puts quite a strain on
the imagination of the average user; with added latency on clicks and
answers, users shy away from matchmaking and start playing alone or with the
much more easily selected computer players.

HTH, Werner

That sounds similar to the approach I was planning to take. That does leave
the question of how you manage events, though. For instance, when the central
server figures out an appropriate match, how do you pass the relevant data
about the match (players, etc.) to the game server that's going to run it?

I imagine you either put that data in MySQL and have all the servers poll it
periodically, or you have some kind of direct notification system that the
servers listen to. Can you clarify?

Hi Krysk

Nah, storing information in the DB and polling was out of the question; the system is built around twisted.spread (pb), with the game servers being detached processes. That allows killing/restarting the matchmaking process, with the game servers then reattaching to the freshly started matchmaking controller. With this it becomes possible to do hot upgrades while the system is running.
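
To make the reattaching concrete, here is a rough sketch (not the actual code) of a game server registering itself with the controller over pb and re-registering whenever the controller comes back; host, port, and method names are invented:

    from twisted.internet import reactor
    from twisted.spread import pb

    class GameServer(pb.Referenceable):
        """One detached game-logic process; the controller holds a reference to it."""
        def __init__(self):
            self.tables = {}

        def remote_createTable(self, table_id, game_type, rule_set):
            # the controller picked this server as the least-used one
            self.tables[table_id] = {"game_type": game_type,
                                     "rule_set": rule_set,
                                     "seats": [None] * 4}
            return table_id

    def attach(server):
        factory = pb.PBClientFactory()
        reactor.connectTCP("matchmaker.example.org", 8800, factory)
        d = factory.getRootObject()

        def registered(root):
            # if the controller is killed/restarted, simply attach again
            root.notifyOnDisconnect(
                lambda _: reactor.callLater(5, attach, server))
            return root.callRemote("registerGameServer", server)

        d.addCallback(registered)
        d.addErrback(lambda _: reactor.callLater(5, attach, server))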

The matchmaking controller has an observer interface that allows a user who wants to play to register for the general state-change broadcasts. Besides that, it has an interface for gathering more information about another particular user, using their player id to query the DB when desired. Another interface allows a user to poke a table owner with a request to play; this request is then passed on to the table owner.
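
Sketched as a pb root object (again with invented method names), those three interfaces could look roughly like this:

    from twisted.spread import pb

    class MatchMaker(pb.Root):
        def __init__(self, dbpool):
            self.dbpool = dbpool     # a twisted.enterprise.adbapi.ConnectionPool
            self.observers = []      # RemoteReferences of registered players
            self.open_tables = {}    # table_id -> {"owner": ref, "info": {...}}

        def remote_register(self, observer):
            # observer interface: sign up for the general state-change broadcasts
            self.observers.append(observer)
            return [t["info"] for t in self.open_tables.values()]

        def remote_getPlayerInfo(self, player_id):
            # on-demand lookup of another user's metadata in the DB
            return self.dbpool.runQuery(
                "SELECT skill, games_played FROM players WHERE id = %s",
                (player_id,))

        def remote_requestSeat(self, table_id, player_id):
            # poke the table owner; the owner answers the request itself
            owner = self.open_tables[table_id]["owner"]
            return owner.callRemote("seatRequested", table_id, player_id)

        def broadcast(self, event):
            # a real implementation would also prune dead references here
            for obs in self.observers:
                obs.callRemote("stateChanged", event)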

The sequence of match making is as follows:

- a table with a certain game type and rule set is created via the controller on the least-used game server; the so-called 'owner' of the table is seated at that table

- the controller keeps a reference to that particular table, with the game server informing the controller of state changes, like the owner leaving the table (in that case the human player is replaced by a computer player that picks up the cards, whether in the matchmaking or the game-playing phase)

- other users observing the 'open tables' are now able to ask for a seat

- this request can be honored or turned down by the table 'owner'; if honored, the asking user gets the game server coordinates (IP, port, table id, seat position, ...) and then has a window of about 30 seconds to sit down at the table; if that does not happen, the seat is reopened (see the sketch after this list)

- if the table is full (all four seats taken, by either computer or human players), the matchmaking controller detaches the table from the 'open tables' and moves it into the playing-table list, with only minimal statistical information now available (table chat remains private among the four players, although it would be easy to tap into it). The table is now under the control of the game server alone, until the users decide to leave it. If a table ends up with only computer players, it is automatically killed and the matchmaking controller is signaled.
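
As a concrete illustration of the 30-second window mentioned above (my sketch, with invented names), a granted seat can simply be held with reactor.callLater and released again if the player never shows up:

    from twisted.internet import reactor
    from twisted.spread import pb

    class Table(pb.Referenceable):
        SEAT_HOLD = 30   # seconds a granted seat stays reserved

        def __init__(self, table_id):
            self.table_id = table_id
            self.seats = [None, None, None, None]
            self.holds = {}              # seat index -> IDelayedCall

        def hold_seat(self, seat, player_id):
            # reserve the seat, release it again if the player never sits down
            self.seats[seat] = ("held", player_id)
            self.holds[seat] = reactor.callLater(
                self.SEAT_HOLD, self.release_seat, seat)

        def release_seat(self, seat):
            self.seats[seat] = None
            self.holds.pop(seat, None)

        def remote_sitDown(self, seat, player_id):
            if self.seats[seat] == ("held", player_id):
                self.holds.pop(seat).cancel()     # arrived inside the window
                self.seats[seat] = ("seated", player_id)
                return True
            return False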

Great care was taken in the implementation to ensure that all information is pushed to those who need to know; there is no polling at all. All the information is kept in fairly complex objects that can be passed via pb or kept in sync.
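
The "push, never poll" part maps quite naturally onto pb.Cacheable/pb.RemoteCache, where the server-side object pushes every change to the client-side copies; a minimal illustration (not the actual classes):

    from twisted.spread import pb

    class OpenTableList(pb.Cacheable):
        """Server side: the authoritative list of open tables."""
        def __init__(self):
            self.tables = {}
            self.observers = []

        def getStateToCacheAndObserveFor(self, perspective, observer):
            self.observers.append(observer)
            return self.tables                    # initial state, sent once

        def stoppedObserving(self, perspective, observer):
            self.observers.remove(observer)

        def addTable(self, table_id, info):
            self.tables[table_id] = info
            for obs in self.observers:
                obs.callRemote("addTable", table_id, info)   # push, no polling

    class OpenTableListCache(pb.RemoteCache):
        """Client side: kept in sync automatically by the observe_* calls."""
        def setCopyableState(self, state):
            self.tables = state

        def observe_addTable(self, table_id, info):
            self.tables[table_id] = info

    pb.setUnjellyableForClass(OpenTableList, OpenTableListCache)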

A single matchmaking controller orchestrates the game setup; there can be any number of game servers on different machines.

The game servers themselves also offer a pb interface to the actual LivePage (Athena/Nevow) webservers; there can be any number of webservers on different machines.
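
On the webserver side nothing special is needed either; the Athena/Nevow process is just one more pb client of a game server (a minimal, assumed sketch):

    from twisted.internet import reactor
    from twisted.spread import pb

    def connect_to_game_server(host, port):
        # returns a Deferred that fires with the game server's root object,
        # which the LivePage code can then callRemote() on
        factory = pb.PBClientFactory()
        reactor.connectTCP(host, port, factory)
        return factory.getRootObject()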

With all this in place, a dynamically scaling system becomes reality, with the additional advantage that hot swapping of all the pieces of the system is possible.

Hope this clarifies some corners, Werner





