25 February 2009

Language and Games

Found at Paperpools:

Suppose I grow up in a family where people obsessively play Hearts. We switch around between different versions of the game - sometimes we play Black Maria, where the Queen of Spades costs you 13 points and you pass on three cards to the left before you begin play, sometimes we invent twists of our own. I also have four friends: A lives in a family of chess fanatics, B lives in a family of bridge fanatics, C lives in a family of go fanatics, D lives in a family of poker fanatics.

What I see at once is something remarkable. Languages are translatable, more or less; it may be more or less tricky, but it's intelligible to speak of Chinese being translated into Turkish AND Arabic AND English. Games are not translatable. Chess is a game for two players with complete information; you can't "explain" what's going on in a chess game in terms of bridge, which is a game for two sets of partners with imperfect information, a mixture of skill and chance which depends on skilful sharing of information between partners. And you can't "explain" either in terms of poker, which is a game for an indeterminate number of players, a mixture of skill and chance in which sharing of information between players would in fact be collusion and outlawed. A game is intelligible on its own terms - which means, paradoxically, that you can play a game with someone whose language you don't know, provided you both know the rules of the game.

You don't understand a game in terms of some other game, you understand it by learning to play it - but the more games you play, the more you will understand about the radical otherness of games.


Read the entire post and comments at Paperpools.

Hat Tip: Andrew Gelman at Statistical Modeling, Causal Inference, and Social Science, whose post adds some insightful and mathy comments of its own.

23 February 2009

Exploding D10

Since I seem to be thinking a lot about dice: the Classic Battletech RPG (formerly MechWarrior 3rd edition) uses an unusual dice method, sometimes referred to as an "exploding d10". The game uses two 10-sided dice for skill rolls, but the result is a little more than just the sum of the two dice: if either die rolls a "10", then 10 is added to your sum AND you roll that die again. If you get a bit of luck and roll consecutive tens, you can end up with quite a large number (your roll "explodes" upwards). What makes this unusual is that it generates a right-skewed result and is theoretically open-ended, meaning you might keep rolling forever. That isn't a practical concern, though: not only is it very unlikely, but for these skill rolls you only care about reaching some finite target number, and there is no need to keep rolling once your total exceeds it.
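If you just want to see the rule in action, here is a quick Monte Carlo sketch of the procedure in Python (the function names are mine, purely for illustration, not anything from the game books):

```python
import random

def roll_d10x():
    """Roll one exploding d10: a 10 adds 10 to the total and is rolled again."""
    total = 0
    while True:
        face = random.randint(1, 10)
        if face < 10:
            return total + face
        total += 10

def skill_roll():
    """A Classic Battletech RPG style skill roll: the sum of two exploding d10s."""
    return roll_d10x() + roll_d10x()

# Rough check of the shape of the distribution:
rolls = [skill_roll() for _ in range(100_000)]
print(sum(rolls) / len(rolls))   # average skill roll, roughly 12.2
```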

This business of rolling a "10" is the part that is different, so let's set it aside for a moment and consider what happens if we don't roll any 10s. No matter how many tens we might roll, we always end up adding the equivalent of two 9-sided dice to however many tens we got. That makes this part easy: for a single "d9" the chance of any outcome is 1-in-9, and the sum of two of them follows a discrete triangular distribution very similar to that of (the sum of) two six-sided dice, which I wrote about last week.

Going back to the roll of a "10" on a single d10: there is a 1-in-10 chance of getting a ten. If you do, you roll again, and again, until you stop getting tens. The total number of tens rolled on a single die follows what is known as the geometric distribution (with p = 0.10). This is essentially like flipping a very unbalanced coin that has a 10% chance of coming up heads and a 90% chance of tails: we keep flipping until "tails" comes up and count the total number of heads we get along the way.

[Table: Probability of an exploding 1d10]

Rolling an exploding d10 (let's call it d10X) is really a two-stage process: first we flip the unbalanced coin and multiply the number of "heads" by ten, and second we add to that a roll of our 9-sided die. Note that it is not possible to roll a multiple of ten, because whenever we get a ten (or several) we always add the d9 result to it. I cut this (and the following) table off at 31 because it didn't seem useful to take the probabilities out farther than that.
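Here is the same two-stage calculation as a short Python sketch (again, the function name is just something I made up). Every total of the form 10n + k, with k from 1 to 9, has probability 0.1^n times 0.1:

```python
def d10x_pmf(cutoff=31):
    """Exact probabilities for a single exploding d10 (d10X), truncated at `cutoff`.

    A total of 10*n + k (k = 1..9) means n tens followed by a final roll of k,
    which happens with probability 0.1**n * 0.1.
    """
    pmf = {}
    n_tens = 0
    while 10 * n_tens + 1 <= cutoff:
        for final in range(1, 10):          # the non-ten roll, uniform on 1-9
            total = 10 * n_tens + final
            if total <= cutoff:
                pmf[total] = 0.1 ** n_tens * 0.1
        n_tens += 1
    return pmf

print(d10x_pmf())   # {1: 0.1, ..., 9: 0.1, 11: 0.01, ..., 31: 0.0001}
```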





[Table: Probability of exploding 2d10]



At this point I'm going to skip the gruesome arithmetic and get on to the result when we roll two d10X and add them together.
There is no trick to the calculation this time: I set up a spreadsheet of all combinations that sum from 2 up to 31 and tallied the results.
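For anyone who would rather not squint at spreadsheet cells, the same tally looks roughly like this in Python (it reuses the d10x_pmf sketch from above, and is slightly approximate because both tables are cut off at 31):

```python
def two_d10x_pmf(cutoff=31):
    """Tally the sum of two exploding d10s by combining every pair of
    single-die totals, weighting each pair by its probability.
    Relies on d10x_pmf() from the earlier sketch."""
    single = d10x_pmf(cutoff)
    pmf = {}
    for a, pa in single.items():
        for b, pb in single.items():
            s = a + b
            if s <= cutoff:
                pmf[s] = pmf.get(s, 0.0) + pa * pb
    return dict(sorted(pmf.items()))

print(two_d10x_pmf())   # note the dips at 19 and 29
```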








[Chart: Probability of exploding 2d10]
Finally (finally!) here is the right-skewed distribution I mentioned back at the beginning of all this. Note the odd little dip in probability at 19, which results because it is not possible to roll a 10 on a single d10X. The dip repeats at 29 (and 39, 49, etc.), but is much harder to see.

20 February 2009

Ad Astra on the Move

[Image: Ad Astra Games, Squadron Strike]
Ad Astra Games is relocating, with offices in Minnesota and Ken Burnside (President/Creative-guy) in Milwaukee. I met Ken last week while he was running a Squadron Strike demo at my local gaming haunt.



[Image: Ad Astra Games, Attack Vector: Tactical]



If you are not familiar with Ad Astra Games, they are the makers of games based in real physics and engineering, such as the award-winning Attack Vector: Tactical, Saganami Island Tactical Simulator (SITS), and Birds of Prey (which I now own and hope to review soon).


[Image: Ad Astra Games, Birds of Prey]
I expect Wisconsin gamers will benefit from having Ken in the area for demos, and I might sign up for his Squadron Strike campaign, time permitting.


[Image: Ad Astra Games, Saganami Island Tactical Simulator (SITS)]

19 February 2009

Clue, and Minesweeping

No, not Minesweeper, minesweeping.

Game Provides 'Clue' to Remote Sensing




This sort of makes sense; an efficient strategy to gather information in a game should be essentially the same as the strategy for finding real hidden objects. [Clue Image]

[Hat Tip to Purple Pawn]

16 February 2009

Dice Distributions

There is an interesting post over at The Endeavor about using dice to approximate a Normal distribution. John Cook points out that determining the distribution of 5 dice by brute force would be a lot of work, and uses a generating function for the calculation of the exact distribution. John's blog is always a good read, and his mathematics elegant.

Not me though, I'm going to use brute force.

I once figured out a fairly simple way to calculate the probability distribution of the sum of any number of dice. That was at least 25 years ago, though, and until John's post jogged my memory I had forgotten all about it. The only problem was that I didn't recall the trick. I've been playing with this off and on all day, but I finally rediscovered how it works.

First some simple stuff. If I roll one die (and it is fair) then I expect an equal probability of each face landing up, for a 1-in-6 chance each.

[Image: Dice distribution, 1d6]

Summarizing the probabilities in a table, it looks like this.



Now a little more complicated. Rolling two six-sided dice, there are 36 possible outcomes (36 = 6^2 = six squared). There is more than one way to roll every outcome except 2 and 12, so we have to count them up carefully. The photo to the right shows all possible outcomes from 2 to 7. I ran out of dice before I could do 8 through 12, but I don't really need to, because I know the distribution is symmetric: the chance of a 2 is the same as the chance of a 12, 3 the same as 11, likewise 4 and 10, 5 and 9, 6 and 8.

[Image: Dice distribution, 2d6]
And the summary. Note the probabilities go up from 1-in-36 (at 2) to 6-in-36 (at 7) and back down again. Because this literally makes a triangle if you graph it, it is sometimes known as a "discrete triangular distribution".


[Table: Dice distribution, 3d6]
Even my dice-maniac friends (except maybe Rick) don't have enough dice to demonstrate all the possibilities for three 6-sided dice, but here is a summary table for the probabilities of each roll. (And Rick, if you are reading this, that would require 648 dice.)


[Image: Dice distributions]



And now for the brute force. I set up a spreadsheet to tally up the frequencies (total number of combinations for each outcome).
Note there is some similarity here to Pascal's Triangle, especially if you rearrange this table a bit.
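In code rather than a spreadsheet, the brute force looks something like this (a Python sketch, just for illustration):

```python
from itertools import product

def sum_frequencies(n_dice, sides=6):
    """Count how many of the sides**n_dice equally likely rolls give each sum."""
    freq = {}
    for roll in product(range(1, sides + 1), repeat=n_dice):
        s = sum(roll)
        freq[s] = freq.get(s, 0) + 1
    return freq

# 3d6: 216 combinations, frequencies 1, 3, 6, 10, 15, 21, 25, 27, 27, 25, ...
print(sum_frequencies(3))
```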



[Image: Dice distributions]





Rearranging the table was a good idea. The gray-shaded cells turn out to correspond exactly to Pascal's Triangle, and I spent a lot of time trying to rediscover my "trick" on this basis. Each gray-shaded cell is the sum of the numbers in the cells immediately above and to the left (use zero if there is no neighbor above or to the left). That sum doesn't work for the rows below the first six, though. Pascal's Triangle seems like a dead end, but I suspect there is some deeper relation I may have missed.



[Image: Dice distributions]


Below the first six rows, I noticed that the difference between the sum of the green-shaded cells and the yellow-shaded cell is always the value in the cell one-left and six up (red-shaded). That is: add the green, subtract the red, and it is equal to the yellow.

[Image: Dice distributions]
This is the trick I had forgotten, and it appears to work no matter how far I extend the table to sums of greater numbers of dice.
Twenty-five years ago I'm sure I never gave it such a thorough test, and it's rather nice to know that I got it right way back then.




I have this in a spreadsheet, but it isn't pretty to look at, which is the downside of the brute-force approach. The method could also easily be adjusted to calculate probabilities for sums of other-sided dice (4, 8, 10, 12, 20, etc.) as well. Perhaps I will follow up on that.
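For the record, I believe the shaded-cell trick amounts to the recurrence coded below (a Python sketch; I'm reconstructing the offsets from the general formula T(n, s) = T(n, s-1) + T(n-1, s-1) - T(n-1, s-1-sides), so the indices here may not line up exactly with the colors in my table). It already takes the number of sides as a parameter, so it covers other-sided dice too:

```python
def sum_counts(n_dice, sides=6):
    """Number of ways to roll each sum with n_dice dice of `sides` sides,
    using the running-total recurrence
        T(n, s) = T(n, s-1) + T(n-1, s-1) - T(n-1, s-1-sides)
    instead of enumerating every combination."""
    max_sum = n_dice * sides
    # T[n][s] = number of ways n dice sum to s; T[0][0] = 1 (the empty roll)
    T = [[0] * (max_sum + 1) for _ in range(n_dice + 1)]
    T[0][0] = 1
    for n in range(1, n_dice + 1):
        for s in range(1, max_sum + 1):
            T[n][s] = T[n][s - 1] + T[n - 1][s - 1]
            if s - 1 - sides >= 0:
                T[n][s] -= T[n - 1][s - 1 - sides]
    return {s: T[n_dice][s] for s in range(n_dice, max_sum + 1)}

print(sum_counts(3))            # matches the brute-force 3d6 tally
print(sum_counts(5, sides=6))   # the 5d6 case from John Cook's post
```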

Update: See Dice Distributions Revisited.

15 February 2009

Custom Dice

[Image: Singapore Longshoremen group emblem]
Our local Battletech group got organized and ordered some custom dice from Chessex. I say "we" organized, but Jeremy did all the work (Thanks Jeremy!). That's our group emblem to the right. [original artist/source unknown]





These dice have a single custom side, with the six pips replaced by our mascot, a hungry shark. They really turned out nice.









Shiny new dice in hand, I proceeded to try them, and in their debut game ...

... I had absolutely terrible luck.

("Jeremy, I'd like to have a word with you about these dice.")

Maybe they just need more practice. :-)

10 February 2009

Forestry Mechs, Almost.

If only they had legs.


No trees die, but this thing is kind of cool. Watch it climb the hill.


A toy skidder!

09 February 2009

Have Your Cake, and Dragon Too

[from Geekologie] Could you eat a cake like this? It just looks too good to cut up (but they do).

08 February 2009

Game Theory Week: The Homicidal Chauffeur

To most gamers, and probably even most mathematicians, the games in Game Theory don't actually have much to do with the games we like to play for fun. The Curmudgeon Gamer put it very aptly:
In case you aren't aware, "Game Theory" is a bait-and-switch ruse right up there with "Greenland". Somehow they managed to take the field of strategic game-playing and restrict it only to games no one would ever want to play. (Apparently there was some analysis of actual games in there at the beginning, but that was swiftly excised, lest anyone actually enjoy themselves.) [emphasis added]
A fair criticism, too: I looked long and hard in the literature without finding any real mention of the mathematical exploration of games. It turns out I wasn't looking in the right place, but one day a very crafty lady (my wife, naturally) made a good find and got me a nice birthday present: an old book by some guy named Rufus.

Rufus Isaacs gave mathematical formulation to real problems such as games of Pursuit and Evasion, and in doing so he opened up a new field of mathematics. His book Differential Games is a classic (available at Amazon.com for under $13!). It probably helps to have a good math background to follow along, because it makes heavy use of trigonometry, calculus, and differential equations, but it's written as a discussion and introduction to the problems rather than as a textbook. You can read the first chapter and most of the second on Google Books.

Isaacs did a lot of classified work at the RAND Corporation, and many of the problems he discusses are versions of pursuit and evasion taken from real military considerations, such as intercepting a bomber before it can reach a target area. These are not simple problems, but it's not too hard to see that this sort of problem is not so different from playing a game that represents a real military situation.
[Image Google Books]

One of the most basic problems in differential games is that of a vehicle trying to intercept a moving target, and Isaacs called this problem (with some humor intended) the "Homicidal Chauffeur". You can find a nice description and short discussion of The Homicidal Chauffeur at Reasonable Deviations.
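Just to give a flavor of the setup - this is not Isaacs's analysis, which is done with differential equations rather than simulation - here is a crude discrete-time Python sketch. The speeds, turn limit, and both players' strategies are made up and deliberately naive:

```python
import math

def chauffeur_step(car, pedestrian, dt=0.1,
                   car_speed=1.0, max_turn=1.0, ped_speed=0.3):
    """One time step of a toy homicidal-chauffeur chase.

    car = (x, y, heading); pedestrian = (x, y).
    The car moves at a fixed speed and can only change heading at a bounded
    rate (i.e. it has a minimum turning radius); the pedestrian is slower but
    can move in any direction.  Strategies here are naive: the car turns as
    hard as it can toward the pedestrian, the pedestrian runs straight away.
    """
    cx, cy, heading = car
    px, py = pedestrian

    # Car: steer toward the pedestrian, limited by the turn-rate bound.
    desired = math.atan2(py - cy, px - cx)
    turn = (desired - heading + math.pi) % (2 * math.pi) - math.pi
    turn = max(-max_turn * dt, min(max_turn * dt, turn))
    heading += turn
    cx += car_speed * dt * math.cos(heading)
    cy += car_speed * dt * math.sin(heading)

    # Pedestrian: run directly away from the car's new position.
    away = math.atan2(py - cy, px - cx)
    px += ped_speed * dt * math.cos(away)
    py += ped_speed * dt * math.sin(away)

    return (cx, cy, heading), (px, py)

car, ped = (0.0, 0.0, 0.0), (3.0, 2.0)
for _ in range(200):
    car, ped = chauffeur_step(car, ped)
print(math.hypot(ped[0] - car[0], ped[1] - car[1]))   # how close did the car get?
```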

Differential games go a long way toward taking game theory in the direction of games we know and love to play. It's NOT a simple topic, but we already knew that games are far from simple. It's not surprising - and perhaps even gratifying - to learn that the games we play are so fantastically complex. I've actually worked through a few simple examples of discrete differential games that occur in boardgames (well, Battletech anyway). Those examples will probably find their way into blog posts eventually, but I have some other ground to cover before I tackle differential games for myself.
[Image Google Books]

07 February 2009

Game Theory Week: GAMES!

I am time-challenged today, so I offer a list of suggested browsing on game theory.

From Game Theory.net:

A Chronology of Game Theory

Combinatorial Game Theory (LOTS of links!)

Steve LaValle's home page, Rapidly-exploring Random Trees, and the RRT Gallery. Not exactly game theory, but the applications are just so cool. (Hint: start with the gallery!)

More online(?) games from Dave Rusin.

06 February 2009

Game Theory Week: Pop Culture

Oops. Now I find this post never got published. I'd like to say I'm not always such a space cadet, but it might not be true. Anyway, here it is, one week late:

Game Theory Week: Game Theory in Pop Culture

The Man with No Name says ...
You see in this world there's two kinds of people my friend; those with loaded guns, and those who dig. You dig.


[The build up is slow, so if you are in a hurry skip forward to 4:45]

GameTheory.net has a collection of game theory in film, or at least scenes in films that can be described in terms of game theory. The scene above with Uncle Clint is explained here. There are a number of good reviews here, including The Princess Bride, which is one of my all-time favorites.

05 February 2009

Game Theory Week: All Mixed Up

Today you get my home-cooked ramblings on game theory, and I'll be straying a bit from game theory as such to an application that interests me. If you really want to learn more about mixed strategies, try the links.

Mixed strategies in game theory fascinate me. The basic premise of a mixed strategy is that sometimes there is no Nash equilibrium in pure strategies - no single choice of strategy for each of the two players that is the best either can do against the other, absent cooperation or coordination. In these cases there is still an equilibrium, but it exists as a random mix between two (or maybe more) choices for each player.

Bluffing in poker is a good example of a mixed strategy. Bluff too often and other players will be more likely to call your bluff. Bluff too rarely and other players will recognize when you are playing a strong hand. (It's more complicated than that, because poker involves a lot of signaling too.) Play with the right mix of bluffing and straight play, choose between them randomly without tipping other players to your choice, and you will be playing at a Nash equilibrium.
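Here is a toy illustration in Python, not real poker (real poker is far richer). It is just the standard recipe for finding the mixed equilibrium of a 2x2 zero-sum game by making the opponent indifferent between their two options; the payoff numbers are invented:

```python
def mixed_equilibrium_2x2(a, b, c, d):
    """Row player's equilibrium mix in a 2x2 zero-sum game with payoffs
    (to the row player):
                col L   col R
        row T     a       b
        row B     c       d
    Chooses p = P(play T) so the column player is indifferent between L and R:
        p*a + (1-p)*c = p*b + (1-p)*d
    Only meaningful when the game has no pure-strategy saddle point."""
    return (d - c) / ((a - c) + (d - b))

# Invented bluffing game: T = bluff, B = play it straight;
# L = opponent calls, R = opponent folds.
print(mixed_equilibrium_2x2(-2.0, 1.0, 1.0, 0.0))   # bluff 25% of the time
```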

I have a proposition, an agenda for this blog that I am working towards, and I have most of the pieces in place to start discussing it seriously: I think mixed strategies can help describe how people actually play games, and how well they play them.

Typical game theory examples are usually very simple: there are generally two or three strategies, and one of them might be the best (optimal). Games we like to play are not so simple. A player of a boardgame (or card game, or a sport) might have multiple strategies and multiple moves for each piece on the board, and the total number of choices might be very large. Finding the best move might not be a trivial problem (a topic for another day).

Here is the obvious part: more skillful players will be better at picking out good moves or strategies. A perfect player would always make the best available choice, but less skillful players will tend to make less-than-optimal choices, possibly because they have imperfect knowledge of how the game works. Information clearly plays an important role, so let's consider an unskilled player, someone new to a game who knows nothing at all and essentially chooses strategies at random. This might also be a computer program that literally plays randomly. (Note: some early versions of computer games such as chess played essentially random games like this.) [Image: young Battletech players at GENCON 2006]

Now the less obvious part: I'm departing from the concept of an optimal mixed strategy to random play that is very far from optimal. In my own notes I've been referring to this as a "random strategy", though perhaps "blind strategy" is more apt, since the player uses no information about the game. This sort of random strategy might be a yardstick for gauging the ability of non-random players; a baseline for comparison. Note that it is possible to play worse than random, but to do so requires information about the game and consistently bad choices, and most human players can probably do better than that even on their first try at a new game. [Image by Holger Meinhardt]

Information leads to opportunities for good choices, and good choices lead to good play. Player skill might be interpreted as the ability to take advantage of the available information and choose better strategies. So here is the conclusion of this ramble: information is a statistically measurable quantity, so it should be possible to consider how much information is available to a player and how well they are able to make use of it (how much better than random they do). What I'm working towards here is (I think) a statistical measure called a likelihood ratio. A trivial example might be a game with two choices, one winning and the other losing, where the player must use some information to guess the winning choice. The random (blind) player will win 50% of the time, but the informed (yet imperfect) player should win more often than that, let's say 75% of the time. The likelihood ratio would then be 0.75/0.50 = 1.5.
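That trivial example is easy to check by simulation. Here is a quick Python sketch; the 75%-reliable "hint" is a made-up stand-in for whatever information the player actually has:

```python
import random

def win_rate(choose, trials=100_000):
    """Fraction of two-choice games won by a player using strategy `choose`,
    where choose(hint) returns a guess and `hint` is noisy information about
    which of the two options (0 or 1) is the winner."""
    wins = 0
    for _ in range(trials):
        winner = random.randint(0, 1)
        hint = winner if random.random() < 0.75 else 1 - winner  # 75% reliable
        if choose(hint) == winner:
            wins += 1
    return wins / trials

blind = win_rate(lambda hint: random.randint(0, 1))   # ignores the hint
informed = win_rate(lambda hint: hint)                # trusts the hint
print(informed / blind)                               # about 0.75 / 0.50 = 1.5
```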

04 February 2009

Game Theory Week: Getting Schooled

If you want some real education in game theory, then Open Yale Courses has just the thing for you: Game Theory with Professor Ben Polak.

[Image: Open Yale Courses, Game Theory with Professor Ben Polak]
This complete class is available online. Lectures are available in video and MP3 format, blackboard notes in PDF, and text transcripts too (see the links in the left-hand sidebar). This is game theory for the math-phobic, though it does assume a passing familiarity with calculus for some of the problems. The lectures generally focus on talking through the theory, not the math. I'm working my way through these and learning a lot. Polak is an excellent teacher and very enthusiastic about what he does.

I intentionally haven't posted direct links to this material here, because the material is too deep for casual browsing. I've been (slowly) working my way through these at the rate of a few per week. The lectures run about 90 minutes, and I find it's nice to have the blackboard notes and transcript handy to help understand the tricky bits. I'm really looking forward to some of the more advanced topics.

03 February 2009

Game Theory Week: Barry Nalebuff

Barry Nalebuff (1 2) is a Professor ... excuse me ... the Milton Steinbach Professor at the Yale School of Management. He is an expert in business strategy and game theory, the author of a number of books, and is rumored to run a 9.86-second 100-meter dash (that last one is a rumor I am starting). He also has some video comments on BigThink. In this first one he talks a bit about the history of game theory:









[Video]


Next up, Dr. Nalebuff tells us how game theory can be applied to relationships:









[Video]


Barry Nalebuff on Game Theory and Poker (this one is for Brandon):









[Video]


More video ideas from Barry Nalebuff can be found on BigThink.

02 February 2009

Game Theory Week: That's Cheating!

Michael Shermer, champion skeptic and former competitive bicycle racer, has an article in the January issue of Scientific American:

A recent comment by Jason about "cheat Scrabble" reminded me of this SciAm article that gives a nice explanation of why people cheat in terms of game theory, and sparked my idea for this series of posts (Thanks Jason!). Here are a few notes I grabbed from the key concepts:
  • Game theory highlights why it is rational for professional cyclists to dope: the drugs are extremely effective as well as difficult or impossible to detect; the payoffs for success are high; and as more riders use them, a “clean” rider may become so noncompetitive that he or she risks being cut from the team.
Roughly speaking, when the payoff for winning far exceeds the penalty for cheating, "clean" athletes are pushed into cheating in order to keep up with those who cheat. It is quite a different situation among tabletop gamers, though there are rumors of players being banned from convention events, so it is not unheard of. It is far more common that people have selective memories about which rules get applied and which dice rolls get modifiers, gaining a small advantage at a critical moment. In games with complicated rule sets like the ones I generally play, this sort of mistake might happen to anyone and is not generally classified as cheating, but making too many mistakes makes gaming buddies suspicious, and so there is a social penalty for incorrect play.
  • The game theory analysis of cycling can readily be extended to other sports. The results show quantitatively how governing bodies and antidoping agencies can most effectively target efforts to clean up their sports.
Game theory can show how the problem of cheating might be solved: by changing the payoffs so that honest play gains the athlete (or player) the greater benefit. Social groups are actually pretty good at enforcing "good behavior", but professional cycling (and maybe some other sports) has some problems to fix.

01 February 2009

Game Theory Week: Intro

So far in these ramblings about the mathematics of games, my approach has been to consider simpler versions of the games themselves. There is a reason for this: I have been avoiding the difficult problem of player choice. In every post so far, choice is either reduced to something nearly trivial (Scrabble --> Nim), removed entirely from consideration (Monopoly --> random walk), or not present in the first place (Candyland --> Markov chain). By simplifying choice out of these games my task becomes much easier, but I'm also missing out on the best part of most games, which is the interaction between players seeking the winning strategy.

The time has come for that to change. To get at this problem I need a new tool, and that tool is Game Theory. The only problem is, I don't actually know that much about it ("Dammit Jim, I'm a Statistician, not an Economist!"). I do know some things; I have been reading and working to educate myself on the topic, but it is not my area of expertise and I feel a bit under-confident writing about it. Fortunately, I don't have to be an expert. As a blogger I can point out some interesting uses of game theory and share the good sources I am learning from.

Over the next week or so I will have a series of posts on game theory and related topics. My task will be to try to link game theory back to the actual games people play. I will be leaning heavily on the work of others, but I hope to contribute some of my own thoughts as well.