10 December 2010

Dice and Information ... So What?

After I finished my previous post on Dice and Information I had the thought, "These are some pretty neat numbers, but So What? What's it good for?"

One way to look at information is as a measure of uncertainty in the game, or at least the uncertainty in the outcome of a single move that is part of a larger game (the information in an entire game, start to finish, is a matter for another day). Consider the game of Chess, which has nothing random about it. There is no information at all in a chess move: you just make your move, perhaps taking another piece, and it always works. Suppose now we change Chess so that it requires an attack roll if you want to take another piece, with a 50% chance of success (put the piece back where it started if the attack fails). Now there is uncertainty with each move (1 bit of entropy) and the outcome of any move is far from certain. We might change this to 90% success, which works out to about 0.47 bits* of entropy per attack (an earlier version of this post said 0.08; thanks Bradley!). This means that any attack is quite likely to succeed. If the chance of success is 10%, then the entropy is again 0.47 bits*, and the attack is likely to fail.
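Those numbers are easy to check. Here is a minimal sketch (in Python, my own choice; the post itself has no code) of the entropy of a two-outcome attack roll, where `attack_entropy` is a name I made up for the binary entropy function:

```python
import math

def attack_entropy(p_success):
    """Shannon entropy (in bits) of a two-outcome attack roll:
    success with probability p, failure with probability 1 - p."""
    entropy = 0.0
    for p in (p_success, 1.0 - p_success):
        if p > 0:
            entropy -= p * math.log2(p)
    return entropy

# A fair 50% attack carries a full bit of uncertainty;
# 90% and 10% success are equally (and less) uncertain.
print(round(attack_entropy(0.5), 2))   # 1.0
print(round(attack_entropy(0.9), 2))   # 0.47
print(round(attack_entropy(0.1), 2))   # 0.47
```

Note the symmetry: 90% and 10% give the same entropy, because the roll is just as predictable either way; only which result is predictable changes.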

So entropy is measuring uncertainty in the sense of the predictability of results, but NOT the predictability of a preferred result, such as a successful attack roll.

* It might help to think of 10% or 90% success as the flip of an unbalanced coin. Information is maximized, and the result is most uncertain, when the coin is fair (50% success).


This post hasn't gone the direction I thought it would - I was thinking (incorrectly) I could describe Entropy as a measure of the uncertainty in the outcome, but this is rather different. Consider a player making three to-hit rolls in a game, at 90%, 50%, and 10% success. The first (90%) has relatively little entropy (0.47 bits), and success is quite likely. The player has a high degree of control, because the decision to attack is very likely to succeed. At 50% the entropy of this roll is maximized at 1 bit, and the player will be most uncertain of the result either way. At 10% the entropy is again 0.47 bits, but the player is very likely NOT to succeed. Now the player has very little control, or very little influence on the outcome of the game (with this single roll).

Back to the drawing board? The paragraph above hits a few rough spots, because Entropy and player control over outcomes in a game may be two different things. This gives me something new to think about.

...

And before I can finish hammering the bugs out of this post, Ashley has her response up over at Paint-it-Pink. At risk of quoting Ashley out-of-context ...

Ashley: Now when one applies modifiers to a 2D6 roll, if one knows that it only encodes 3.27 bits, then a modifier of one is equivalent to one bit. If I'm correct then a modifier of three is equal to three bits of information, which has the effect of reducing surprise in the diced for result? Such modification of the base 2D6 roll is therefore highly significant, which seems to me to support my proposition about the modifiers for targeting computers and pulse lasers in the Battletech game being too coarse.
Not quite right, but Ashley's intuition is basically correct. We need to sort this out a bit. First, the modifiers she mentions are for a Battletech "To-Hit" roll with Hit-or-Miss outcomes, and this is the same situation as flipping an unbalanced coin. As in my "battle chess" example above, modifying this roll doesn't necessarily decrease the information. Second, the 3.27 bits for a 2d6 hit-location roll** is separate from the To-Hit roll. However, we might combine the To-Hit and Hit-Location rolls by considering a "miss" to be a no-location result and grouping it with the actual location results.

** Irrelevant quibble: this is actually about 3.0, because there are multiple ways to roll "Arm" hits.

Here is a new table, similar to my earlier table where I calculated the Entropy of a 2d6 result, but now including the possibility of a miss on a 2d6 roll of 7 or less.



Now suppose the To-Hit roll is more difficult. Ashley's intuition says the entropy ought to decrease. Here's another table with a "12" needed to-hit.


Sure enough, the entropy has decreased. Unlike the "battle chess" example, here the Entropy of hit-location (including no-location) will usually decrease with more difficult to-hit rolls (it hits maximum entropy with a to-hit roll of 3+).
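The combined roll can be sketched in a few lines. This is my own illustration (Python again), and for simplicity it treats each of the eleven 2d6 sums as a distinct location - the 3.27-bit version, not the grouped Battletech table from the quibble - with a miss folded in as one more outcome:

```python
from fractions import Fraction
import math

def two_d6():
    """Exact probability of each 2d6 sum, 2 through 12."""
    probs = {}
    for a in range(1, 7):
        for b in range(1, 7):
            probs[a + b] = probs.get(a + b, Fraction(0)) + Fraction(1, 36)
    return probs

def entropy(probs):
    """Shannon entropy (in bits) of a collection of probabilities."""
    return -sum(float(p) * math.log2(float(p)) for p in probs if p > 0)

def combined_entropy(to_hit):
    """Entropy of the combined To-Hit + Hit-Location result.
    A miss (to-hit roll below the target number) is one outcome;
    each hit-location sum is a separate outcome, weighted by the
    chance to hit."""
    sums = two_d6()
    p_hit = sum(p for s, p in sums.items() if s >= to_hit)
    outcomes = [1 - p_hit] + [p_hit * p for p in sums.values()]
    return entropy(outcomes)

# Entropy falls as the to-hit number rises.
for target in (4, 8, 12):
    print(target, round(combined_entropy(target), 2))
# 4 3.42
# 8 2.34
# 12 0.27
```

A to-hit target of 8+ corresponds to the "miss on 7 or less" table above, and 12 to the second table; with no miss possible at all, the entropy is just the 3.27 bits of the location roll.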

Lesson learned: When thinking about entropy, it is important to include all possible outcomes of the random result.

Somehow I think this topic is not done yet, but that is all I have time for today.

A small update (12/12/2010, 4 PM): I just made the following comment on Ashley's blog, and I'm copying it here so I might remember to come back to the idea later.
Now here is a brain bender - Suppose you could roll one set of dice to resolve a whole turn of Battletech play, or a whole game - How much information would be in that? I don't know myself, but I'll think on it. My intuition is a simpler game should have less total information than a complex one, but I'm not sure yet what it would mean to compare them.

5 comments:

 Ashley said...

Okay, I've just read that and now my brain hurts again. I'll need to sleep on it and hopefully it will all sink in at a subconscious level, allowing me to gain a better understanding of the topic.

DevianID said...

Quick question, how did you compute that a 90% chance of success was .08 bits. I figured 90%'s reciprocal was 1.111, and crunched log(1.111)/log(2) and got .152 bits of info.

I am working with only the microsoft calc hence it only works log in base 10 AFAIK, which is why I used the divisor conversion. Does it not work for surprise in this way?

Dan Eastwood said...

You are correct! Thanks for pointing out my error. In my example I originally used 99% and 1% chance of success, for which the entropy is 0.08. I later changed the example not to be quite so extreme, but didn't update my calculations (Bad statistician! No Biscuit!!).

I corrected the post, and changed some of the wording describing those probabilities.

DevianID said...

My example of .152 bits was wrong too. I forgot that there is not .9 events, there are 2 events with a 90% and a 10% chance of probability, which need to be added to find the total entropy. Such a learning experience this entropy is!

As for your brain bender, I answered over on Pink's site, but basically if the battletech result was to be determined by a flip of a coin, then the information in the battletech game would only have 1 bit. That one bit might carry a lot of consequences, but it's still 1 bit. Kind of like every coin toss in football only has 1 bit, but that one bit of information will represent different things for an Eagles/Giants game compared to a Steelers/Patriots game.

Dan Eastwood said...

You weren't wrong, you just weren't done yet. :-)

Now to go check out Paint-it-Pink.