As part of my studies I was required to write two bachelor theses: the first purely theoretical, the second with a practical component. I was certain I wanted to write something about games, but I found the game-related topics you can write a scientific paper about rather limited. For the most part you can write about computer graphics or AI. I would have liked to write about some gameplay-related topic but wasn’t able to find something I could treat on a scientific level. So I decided to write about something related to AI. But I had no idea what subcategories exist in the AI field. So I searched for a bit to find something which interested me and was small enough that I could write one or two theses about it. And I found the field of player prediction.
The term player prediction describes a collection of techniques which aim to predict the behavior of a human player. And I thought the same as many people do when hearing about this for the first time: wow, this sounds like awesome magic. How could an AI know what I will do? But it’s not magic. It’s mostly statistics and probability theory. And I wondered: okay, how well does that stuff really work? Because often when you read about exciting stuff, it proves to be not as good as promised.
So I started reading up on those techniques and tried to understand how they work. In the end I decided to write about three techniques in my first bachelor thesis: Player Modeling, N-Grams and Bayesian Networks.
In most games the AI cannot directly observe everything about the player, so it has to reason under uncertainty. Bayesian Networks enable the AI to do exactly that. They take previous observations as input and generate the expected state of the game world, or a part of it. Of course the developer could simply give the AI all the information it needs, but that would be the same as letting the AI cheat. Besides their applications in video games, Bayesian networks are used for medical diagnosis, weather forecasting and the reconstruction of traffic accidents, among other things.
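To give a feeling for the kind of reasoning involved, here is a minimal sketch of Bayesian inference in a toy game scenario. The scenario and all the numbers are my own illustration, not taken from the thesis: the AI cannot see whether the player is low on health, but it can observe whether the player retreats, and retreating is assumed to be more likely at low health.

```python
# Prior: how often players are at low health in general (assumed value)
p_low_health = 0.3

# Likelihoods: probability of observing a retreat in each hidden state
p_retreat_given_low = 0.8
p_retreat_given_high = 0.1

# Bayes' rule: posterior probability of low health given an observed retreat
p_retreat = (p_retreat_given_low * p_low_health
             + p_retreat_given_high * (1 - p_low_health))
p_low_given_retreat = p_retreat_given_low * p_low_health / p_retreat

print(round(p_low_given_retreat, 3))  # 0.774
```

A single observation already shifts the AI’s belief from 30% to about 77%; a full Bayesian network chains many such conditional probabilities together over several variables.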
If you want to know more about the technical/math stuff behind that technique you might want to read my first bachelor thesis.
Player modeling describes a technique where a model of the player is constructed and used to predict their future behavior. Bayesian Networks and N-Grams both focus on predicting very specific gameplay decisions the player will make. This can be achieved by player modeling as well and is called action modeling. But besides that, it’s also possible to model player preferences, for example in which situations the player will behave aggressively. This gives player modeling a broader scope than N-Grams or Bayesian networks. Pieter Spronck is a researcher who specializes in that area, and if you are interested in it I recommend reading some of his work.
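A very simple preference model can be built just by counting. The sketch below is my own illustration (the situation and style names are made up, and this is not code from Spronck’s work): the model tallies which play style the player chooses in each situation and predicts the most frequent one.

```python
from collections import Counter, defaultdict

class PlayerModel:
    """Toy preference model: per-situation frequency counts of play styles."""

    def __init__(self):
        # situation -> Counter of observed styles in that situation
        self.observations = defaultdict(Counter)

    def observe(self, situation, style):
        """Record that the player chose `style` in `situation`."""
        self.observations[situation][style] += 1

    def predict(self, situation):
        """Return the most frequently observed style, or None if unseen."""
        counts = self.observations[situation]
        if not counts:
            return None
        return counts.most_common(1)[0][0]

model = PlayerModel()
model.observe("outnumbered", "aggressive")
model.observe("outnumbered", "aggressive")
model.observe("outnumbered", "defensive")
print(model.predict("outnumbered"))  # aggressive
```

Real player models are of course richer than a frequency table, but the core idea of accumulating observations per context and querying them later stays the same.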
N-Grams are a technique used to analyze sequences and to predict future sequences based on the stored probabilities. In computer games these sequences are the actions of the player. For example, in an RPG the actions could be ranged attack (R), melee attack (M) and heal self (H). The player would fight an NPC and choose the actions: RRM RRM RRH RRM RR. Now for a human it’s clear that the player is more likely to do a melee attack (M) next than a heal self (H). N-Grams enable an AI to make such predictions. There were suggestions in the literature indicating that it’s possible to make an AI in a fighting game practically unbeatable with this approach. I first thought: “meh, sounds like bullshit” but found out in my bachelor thesis that this could very well be true, depending on the game.
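To make the idea concrete, here is a minimal N-Gram predictor sketch in Python (my own illustration, not the actual code from the Blocky game): it stores how often each action follows a window of the previous N−1 actions and predicts the most frequent continuation. Fed the RRM example sequence from above, it predicts M.

```python
from collections import Counter, defaultdict

class NGramPredictor:
    """Toy N-Gram predictor over single-character player actions."""

    def __init__(self, n=3):
        self.n = n
        # (previous n-1 actions) -> Counter of actions that followed
        self.counts = defaultdict(Counter)
        self.history = []

    def observe(self, action):
        """Record an action and update the frequency table."""
        self.history.append(action)
        if len(self.history) >= self.n:
            window = tuple(self.history[-self.n:-1])
            self.counts[window][action] += 1

    def predict(self):
        """Predict the next action from the current window, or None."""
        window = tuple(self.history[-(self.n - 1):])
        counts = self.counts[window]
        if not counts:
            return None
        return counts.most_common(1)[0][0]

predictor = NGramPredictor(n=3)
for action in "RRMRRMRRHRRMRR":
    predictor.observe(action)
print(predictor.predict())  # M
```

After the sequence RRM RRM RRH RRM RR, the window (R, R) has been followed by M three times and H once, so the predictor answers M, just like a human observer would.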
In my second bachelor thesis I focused on N-Grams and implemented them in a Unity3D minigame. The game, Blocky, is a 5-way stone-paper-scissors game with colored blocks. Pretty boring to be honest, but it served its purpose, which was to find out how well N-Grams really work. And they worked very well: I tested with 18 human participants and none of them were able to beat the N-Grams AI. A more detailed discussion can be found in my second thesis. The performance was also okay-ish, and I think N-Grams can be used in a state-of-the-art video game, as long as the number of possible actions isn’t too large.
You can try to beat the N-Grams AI yourself if you want to. If you are unsure about how to play, read up on it in my thesis. If you want to browse the source code of the Blocky game, including the N-Grams predictor, you can do so on GitHub: https://github.com/mwebi/Blocky