Game Theory (Part 4)

John Baez

Last time we talked about Nash equilibria for a 2-player normal form game. We saw that sometimes a Nash equilibrium doesn't exist! Sometimes there's more than one! But suppose there is at least one Nash equilibrium. How do we find one? Sometimes it's hard.

Strict domination

But there's a simple trick to rule out some possibilities. Sometimes player A will have a choice \( i\) that gives them a bigger payoff than some choice \( i'\) no matter what choice player B makes. In this case we say choice \( i\) strictly dominates choice \( i'.\) And in this case, there's no way that \( i'\) could be A's choice in a Nash equilibrium, because player A can always improve their payoff by switching to choice \( i.\)

For example, look at this game:

             1         2
1     (0,0)     (-1,1)
2     (2,-1)    (0,0)
3     (-2,1)    (1,-1)
4     (0,1)     (2,0)

For player A, choice 4 strictly dominates choice 3. No matter what player B does, player A gets a bigger payoff if they make choice 4 than if they make choice 3. Stare at the first number in each pair in rows 3 and 4 of the table until you see this.

You can also see it using the payoff matrix for player A:

\( A = \left( \begin{array}{rr} 0 & -1 \\ 2 & 0 \\ -2 & 1 \\ 0 & 2 \end{array} \right) \)

Each number in row 4 here is bigger than the number directly above it in row 3. In other words:

\( 0 = A_{41} > A_{31} = -2\)

and

\( 2 = A_{42} > A_{32} = 1\)
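If you like, you can check this sort of thing by computer. Here is a short Python sketch (the code and variable names are mine, not part of the post) verifying that every entry of row 4 of \( A\) is bigger than the entry directly above it in row 3:

```python
# Player A's payoff matrix from the game above (rows = A's choices 1-4,
# columns = B's choices 1-2).  Python indices are one less than the labels.
A = [[ 0, -1],
     [ 2,  0],
     [-2,  1],
     [ 0,  2]]

def strictly_dominates(M, i, i_prime):
    """Does row i of payoff matrix M strictly dominate row i_prime?"""
    return all(M[i][j] > M[i_prime][j] for j in range(len(M[i])))

# Choice 4 (index 3) strictly dominates choice 3 (index 2):
print(strictly_dominates(A, 3, 2))  # True
```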

Puzzle. Are there any other examples of a choice for player A that strictly dominates another choice?

And of course the same sort of thing works for player B. If player B has a choice \( j\) that gives them a better payoff than some choice \( j'\) no matter what player A does, then \( j'\) can't show up as B's choice in a Nash equilibrium.

We can make our remarks more precise:

Definition. A choice \( i\) for player A strictly dominates a choice \( i'\) if

\( A_{ij} > A_{i'j} \)

for all \( j.\) Similarly, a choice \( j\) for player B strictly dominates a choice \( j'\) if

\( B_{ij} > B_{ij'} \)

for all \( i.\)

Theorem 1. If a choice \( i\) for player A strictly dominates a choice \( i',\) then no choice \( j\) for player B gives a Nash equilibrium \( (i',j).\) Similarly, if a choice \( j\) for player B strictly dominates a choice \( j',\) then no choice \( i\) for player A gives a Nash equilibrium \( (i,j').\)

Proof. We'll only prove the first statement since the second one works the exact same way. Suppose \( i\) strictly dominates \( i'.\) This means that

\( A_{i j} > A_{i' j}\)

for all \( j.\) In this case, there is no way that the choice \( i'\) can be part of a Nash equilibrium. After all, if \( (i',j)\) is a Nash equilibrium we have

\( A_{i' j} \ge A_{i'' j}\)

for all \( i''\). But this contradicts the previous inequality — just take \( i'' = i.\) █
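To see Theorem 1 in action, here is a brute-force search for pure-strategy Nash equilibria, run on the example game above (again a Python sketch of mine, not from the post). It finds no equilibrium at all, which is consistent both with Theorem 1 (the strictly dominated choice 3 could never appear) and with the observation from last time that some games have no Nash equilibrium:

```python
# Payoff matrices for the example game above: entry (i,j) of the table
# is the pair (A[i][j], B[i][j]).
A = [[0, -1], [2, 0], [-2, 1], [0, 2]]
B = [[0,  1], [-1, 0], [1, -1], [1, 0]]

def nash_equilibria(A, B):
    """All pure-strategy Nash equilibria (i, j), using 0-based indices.

    (i, j) is an equilibrium when neither player can improve by a
    unilateral switch: A[i][j] is maximal in column j of A, and
    B[i][j] is maximal in row i of B.
    """
    m, n = len(A), len(A[0])
    return [(i, j)
            for i in range(m) for j in range(n)
            if all(A[i][j] >= A[k][j] for k in range(m))
            and all(B[i][j] >= B[i][l] for l in range(n))]

print(nash_equilibria(A, B))  # [] -- this game has no pure Nash equilibrium
```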

Strict dominance

Definition. We say a choice is strictly dominant if it strictly dominates all other choices for that player.

Let me remind you again that a pure strategy is a way for one player to play a game where they always make the same choice. So, there is one pure strategy for each choice, and one choice for each pure strategy. This means we can be a bit relaxed about the difference between these words. So, what we're calling strictly dominant choices, most people call strictly dominant pure strategies. You can read more about these ideas here:

Strategic dominance, Wikipedia.

We've seen that if one choice strictly dominates another, that other choice can't be part of a Nash equilibrium. So if one choice strictly dominates all others, that choice is the only one that can be part of a Nash equilibrium! In other words:

Theorem 2. If a choice \( i\) for player A is strictly dominant, then any Nash equilibrium \( (i',j')\) must have \( i' = i\). Similarly, if a choice \( j\) for player B is strictly dominant, then any Nash equilibrium \( (i',j')\) must have \( j' = j\).

Proof. We'll only prove the first statement since the second one works the exact same way. Theorem 1 says that if a choice \( i\) for player A strictly dominates a choice \( i',\) then no choice \( j\) for player B gives a Nash equilibrium \( (i',j).\) So, if \( i\) strictly dominates all other choices for player A, it is impossible to have a Nash equilibrium \( (i',j)\) unless \( i' = i.\) █

We can go even further:

Theorem 3. If choice \( i\) for player A is strictly dominant and choice \( j\) for player B is strictly dominant, then \( (i, j)\) is a Nash equilibrium, and there is no other Nash equilibrium.

Proof. Suppose choice \( i\) for player A is strictly dominant and choice \( j\) for player B is strictly dominant. Then by Theorem 2 any Nash equilibrium \( (i',j')\) must have \( i' = i\) and \( j' = j\). So, there is certainly no Nash equilibrium other than \( (i, j).\)

But we still need to check that \( (i, j)\) is a Nash equilibrium! This means we need to check that:

1) For all \( 1 \le i' \le m,\) \( A_{i'j} \le A_{ij}.\)

2) For all \( 1 \le j' \le n,\) \( B_{ij'} \le B_{ij}.\)

Part 1) is true because choice \( i\) is strictly dominant. Part 2) is true because choice \( j\) is strictly dominant. █
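Here is Theorem 3 checked on a concrete game, the famous Prisoner's Dilemma (my own choice of example, with one standard set of payoffs). For both players the second choice is strictly dominant, so \( (2,2)\) should be the unique Nash equilibrium:

```python
# Prisoner's Dilemma payoffs: choice 1 = cooperate, choice 2 = defect.
A = [[-1, -3],
     [ 0, -2]]   # player A's payoffs
B = [[-1,  0],
     [-3, -2]]   # player B's payoffs

def strictly_dominant_row(M, i):
    """Is row i of M strictly bigger than every other row, entrywise?"""
    return all(all(M[i][j] > M[k][j] for j in range(len(M[0])))
               for k in range(len(M)) if k != i)

def nash_equilibria(A, B):
    """All pure-strategy Nash equilibria (i, j), 0-based indices."""
    m, n = len(A), len(A[0])
    return [(i, j) for i in range(m) for j in range(n)
            if all(A[k][j] <= A[i][j] for k in range(m))
            and all(B[i][l] <= B[i][j] for l in range(n))]

# B's strict dominance is about columns of B; transposing B turns its
# columns into rows, so we can reuse the same check.
B_T = [list(col) for col in zip(*B)]

print(strictly_dominant_row(A, 1))    # True: defecting dominates for A
print(strictly_dominant_row(B_T, 1))  # True: defecting dominates for B
print(nash_equilibria(A, B))          # [(1, 1)], i.e. (2,2) in 1-based labels
```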

We can restate Theorem 3 in a less precise but rather pretty way:

Corollary. If both players have a strictly dominant pure strategy, there exists a unique Nash equilibrium.

Here I'm saying 'pure strategy' instead of 'choice' just to prepare you for people who talk that way!

Domination

I realize that the terminology here has a kind of S&M flavor to it, with all this talk of 'strict domination' and the like. But there's nothing I can do about that—it's standard!

Anyway, there's also a less strict kind of dominance:

Definition. A choice \( i\) for player A dominates a choice \( i'\) if

\( A_{ij} \ge A_{i'j} \)

for all \( j.\) Similarly, a choice \( j\) for player B dominates a choice \( j'\) if

\( B_{ij} \ge B_{ij'} \)

for all \( i.\)
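The difference between the two notions is just that dominance allows ties. Here is a tiny illustration in code (a sketch of mine, matching the definitions above), where one row dominates another without strictly dominating it:

```python
def dominates(M, i, i_prime):
    """Row i dominates row i_prime: >= in every column."""
    return all(M[i][j] >= M[i_prime][j] for j in range(len(M[i])))

def strictly_dominates(M, i, i_prime):
    """Row i strictly dominates row i_prime: > in every column."""
    return all(M[i][j] > M[i_prime][j] for j in range(len(M[i])))

# A payoff matrix with a tie in the second column:
A = [[1, 0],
     [2, 0]]

print(dominates(A, 1, 0))           # True: 2 >= 1 and 0 >= 0
print(strictly_dominates(A, 1, 0))  # False: the tie in column 2 fails
```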

We can't do as much with this definition. Why? Here's why:

Puzzle 1. Find a normal-form two player game with 2 choices for each player, where \( (i,j)\) is a Nash equilibrium even though \( i\) is dominated by the other choice for player A and \( j\) is dominated by the other choice for player B.

However, we can still prove this:

Theorem 4. If choice \(i\) for player A is dominant and choice \(j\) for player B is dominant, then \((i,j)\) is a Nash equilibrium.

I'll make you prove this in your homework. Notice the difference between this theorem and Theorem 3! If either choice \(i\) or choice \(j\) is dominant but not strictly dominant, \((i,j)\) may not be the only Nash equilibrium.

Puzzle 2. Find a 2-player normal form game where choice \(i\) for player A is strictly dominant, choice \(j\) for player B is dominant, but \((i,j)\) is not the only Nash equilibrium.




© 2013 John Baez