One way to look at information is as a measure of uncertainty in the game, or at least the uncertainty in the outcome of a single move that is part of a larger game (the information in an entire game, start to finish, is a matter for another day). Consider the game of Chess, which has nothing random about it. There is no information at all in a chess move; you just make your move, perhaps taking another piece, and it always works. Suppose now we change Chess so that it requires an attack roll if you want to take another piece, with a 50% chance of success (put the piece back where it started if the attack fails). Now there is uncertainty with each move (1 bit of entropy) and the outcome of any move is far from certain. We might change this to 90% success, which works out to about 0.47 bits of entropy.
So entropy is measuring uncertainty in the sense of the predictability of results, but NOT the predictability of a preferred result, such as a successful attack roll.
* It might help to think of 10% or 90% success as the flip of an unbalanced coin. Information is maximized, and the result is most uncertain, when the coin is fair (50% success).
This post hasn't gone the direction I thought it would - I was thinking (incorrectly) I could describe Entropy as a measure of the uncertainty in the outcome, but this is rather different. Consider a player making three to-hit rolls in a game, at 90%, 50%, and 10% success. The first (90%) has just a little entropy (0.47 bits) and the outcome is quite likely. The player has a high degree of control, because the decision to attack is very likely to succeed. At 50% the entropy of this roll is maximized at 1 bit, and the player will be most uncertain of the result either way. At 10% entropy is again 0.47 bits, but the player is very likely NOT to succeed. Now the player has very little control, or very little influence on the outcome of the game (with this single roll).
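Those three numbers are easy to check for yourself. Here is a quick Python sketch of the binary-entropy calculation (nothing Battletech-specific, just a hit-or-miss roll with success probability p):

```python
from math import log2

def binary_entropy(p):
    """Entropy in bits of a hit-or-miss roll with success probability p."""
    if p in (0.0, 1.0):
        return 0.0  # a certain outcome carries no information
    return -(p * log2(p) + (1 - p) * log2(1 - p))

for p in (0.9, 0.5, 0.1):
    print(f"{p:.0%} to-hit: {binary_entropy(p):.2f} bits")
# 90% and 10% both come out to about 0.47 bits; 50% gives the full 1 bit
```

Note the symmetry: 90% and 10% success give identical entropy, even though they feel completely different to the player making the roll — which is exactly the gap between entropy and control.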
Back to the drawing board? The paragraph above hits a few rough spots, because Entropy and player control over outcomes in a game are maybe two different things. This gives me something new to think about.
And before I can finish hammering the bugs out of this post, Ashley has her response up over at Paint-it-Pink. At risk of quoting Ashley out-of-context ...
Ashley: Now when one applies modifiers to a 2D6 roll, if one knows that it only encodes 3.27 bits, then a modifier of one is equivalent to one bit. If I'm correct then a modifier of three is equal to three bits of information, which has the effect of reducing surprise in the diced for result? Such modification of the base 2D6 roll is therefore highly significant, which seems to me to support my proposition about the modifiers for targeting computers and pulse lasers in the Battletech game being too coarse.

Not quite right, but Ashley's intuition is basically correct. We need to sort this out a bit. First, the modifiers she mentions are for a Battletech "To-Hit" roll with Hit-or-Miss outcomes, and this is the same situation as flipping an unbalanced coin. As in my "battle chess" example above, modifying this roll doesn't necessarily decrease the information. Second, the 3.27 bits for a 2d6 hit-location roll** is separate from the To-Hit roll. However, we might combine the To-Hit and Hit-Location rolls by considering a "miss" to be a no-location result and grouping it with actual location results.
** Irrelevant quibble: this is actually about 3.0, because there are multiple ways to roll "Arm" hits.
Here is a new table, similar to my earlier table where I calculated the Entropy of a 2d6 result, but now including the possibility of a miss on a 2d6 roll of 7 or less.
Now suppose the To-Hit roll is more difficult. Ashley's intuition says the entropy ought to decrease. Here's another table with a "12" needed to-hit.
Sure enough, the entropy has decreased. Unlike the battle chess example, here the Entropy of hit-location (including no-location) will usually decrease with more difficult to-hit rolls (it hits maximum entropy with a to-hit roll of 3+).
Lesson learned: When thinking about entropy, it is important to include all possible outcomes of the random result.
Somehow I think this topic is not done yet, but that is all I have time for today.
A small update (12/12/2010, 4 PM): I just made the following comment on Ashley's blog, and I'm copying it here so I might remember to come back to the idea later.
Now here is a brain bender - Suppose you could roll one set of dice to resolve a whole turn of Battletech play, or a whole game - How much information would be in that? I don't know myself, but I'll think on it. My intuition is a simpler game should have less total information than a complex one, but I'm not sure yet what it would mean to compare them.