Key Takeaways
1. Game Theory Analyzes Strategic Interdependence.
At its core, game theory is the study of strategic interdependence—that is, situations where my actions affect both my welfare and your welfare and vice versa.
Interdependence is key. Game theory models situations where players' outcomes depend not just on their own actions, but also on the actions of others. This requires players to anticipate, act, and react strategically, moving beyond simple decision-making in isolation.
The Prisoner's Dilemma. The classic example is the Prisoner's Dilemma, where two arrested thieves must decide whether to confess or stay quiet. Each prisoner's sentence depends on the other's choice, and the individually rational choice (confess) produces a worse collective outcome than if both had stayed quiet.
Applications abound. This fundamental concept applies to diverse scenarios:
- Countries deciding whether to attack or defend.
- Firms choosing whether to advertise.
- States engaging in arms races or trade tariffs.
Understanding strategic interdependence is the first step to analyzing such complex interactions.
2. Strict Dominance Simplifies Choices.
We say that a strategy x strictly dominates strategy y for a player if strategy x provides a greater payoff for that player than strategy y regardless of what the other players do.
Always a better option. A strictly dominated strategy is one that a rational player would never choose, because another strategy always yields a higher payoff, no matter what the other players do. Identifying and eliminating these strategies simplifies the game.
Prisoner's Dilemma example. In the Prisoner's Dilemma, confessing strictly dominates staying quiet for each player. Regardless of the other player's choice, confessing leads to a shorter jail sentence. This makes the decision straightforward for a self-interested prisoner.
Rational players avoid. By definition, playing a strictly dominated strategy is irrational. A rational player will always choose the dominating strategy, as it guarantees a better outcome regardless of the opponent's move.
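A minimal sketch of the dominance check, using illustrative jail terms rather than the book's exact numbers. Payoffs are negative years in prison, so a larger payoff means less time served:

```python
# Illustrative Prisoner's Dilemma payoffs: (row player's payoff, column player's payoff).
payoffs = {
    ("quiet",   "quiet"):   (-1,  -1),
    ("quiet",   "confess"): (-12,  0),
    ("confess", "quiet"):   ( 0, -12),
    ("confess", "confess"): (-8,  -8),
}

def strictly_dominates(a, b, opponent_actions, payoffs):
    """True if row action `a` beats row action `b` against every opponent action."""
    return all(payoffs[(a, o)][0] > payoffs[(b, o)][0] for o in opponent_actions)

opponent_actions = ["quiet", "confess"]
print(strictly_dominates("confess", "quiet", opponent_actions, payoffs))  # True
print(strictly_dominates("quiet", "confess", opponent_actions, payoffs))  # False
```

Confessing dominates, yet mutual confession leaves both players worse off (-8 each, in these assumed numbers) than mutual silence (-1 each), which is exactly the dilemma described above.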
3. Iterated Elimination Refines Strategy Sets.
Iterated elimination of strictly dominated strategies simplifies games by removing strictly dominated strategies—strategies that players would never play.
Unraveling complexity. In games where no single strategy strictly dominates all others, players can still make inferences based on what others won't play. By iteratively removing strictly dominated strategies, the game can be reduced, sometimes to a single outcome.
Club game example. In the game between two dance clubs, one club has a strictly dominant strategy (salsa). The other club, knowing this, can eliminate the dominated strategy and then make its optimal choice (disco) in the reduced game.
Order doesn't matter. A key property of iterated elimination of strictly dominated strategies is that the final outcome does not depend on the order in which the dominated strategies are removed. This provides a robust method for solving certain games.
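The elimination procedure itself is short. The sketch below assumes illustrative payoffs for the club game, since the summary does not reproduce the original numbers:

```python
def strictly_dominated(strategies, opp_strategies, payoff):
    """Return a strategy that some other remaining strategy strictly dominates, if any."""
    for worse in strategies:
        for better in strategies:
            if better != worse and all(payoff[(better, o)] > payoff[(worse, o)]
                                       for o in opp_strategies):
                return worse
    return None

def iesds(rows, cols, u_row, u_col):
    """Iteratively delete strictly dominated pure strategies from a two-player game.
    u_row[(r, c)] and u_col[(r, c)] are the row and column player's payoffs at (r, c)."""
    rows, cols = list(rows), list(cols)
    while True:
        doomed_row = strictly_dominated(rows, cols, u_row)
        if doomed_row is not None:
            rows.remove(doomed_row)
            continue
        # flip the indexing so the column player's own action comes first
        u_col_flipped = {(cc, rr): u_col[(rr, cc)] for rr in rows for cc in cols}
        doomed_col = strictly_dominated(cols, rows, u_col_flipped)
        if doomed_col is not None:
            cols.remove(doomed_col)
            continue
        return rows, cols

# Assumed club-game payoffs: salsa is strictly dominant for club 1, and once
# club 2 knows club 1 plays salsa, disco becomes its best remaining choice.
u1 = {("salsa", "salsa"): 10, ("salsa", "disco"): 8, ("disco", "salsa"): 6, ("disco", "disco"): 4}
u2 = {("salsa", "salsa"): 4,  ("salsa", "disco"): 8, ("disco", "salsa"): 6, ("disco", "disco"): 2}
print(iesds(["salsa", "disco"], ["salsa", "disco"], u1, u2))  # (['salsa'], ['disco'])
```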
4. Nash Equilibrium: The "No Regrets" Outcome.
A Nash equilibrium is a set of strategies, one for each player, such that no player has incentive to change his or her strategy given what the other players are doing.
Mutual best response. In a Nash Equilibrium, each player's chosen strategy is the best possible response to the strategies chosen by the other players. If players play according to a Nash Equilibrium, no individual player would regret their choice after seeing what others did.
Stag Hunt example. The Stag Hunt game has two pure strategy Nash Equilibria: both hunt the stag, or both hunt hares. In either case, neither player can unilaterally switch strategies and improve their outcome, given the other's choice.
Not always efficient. Nash Equilibria do not guarantee the best collective outcome. In the Stag Hunt, both hunting hares is a Nash Equilibrium, but both hunting the stag yields a higher payoff for both players. This highlights potential coordination failures.
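A quick way to confirm the two equilibria is to check every cell for profitable unilateral deviations. The Stag Hunt payoffs below are assumptions chosen only to preserve the usual ordering (a shared stag beats hares, but hunting the stag alone fails):

```python
actions = ["stag", "hare"]
u1 = {("stag", "stag"): 4, ("stag", "hare"): 0, ("hare", "stag"): 3, ("hare", "hare"): 3}
u2 = {(a, b): u1[(b, a)] for a in actions for b in actions}   # symmetric game

def is_nash(a, b):
    """Neither player can gain by unilaterally deviating from the profile (a, b)."""
    best_for_1 = all(u1[(a, b)] >= u1[(d, b)] for d in actions)
    best_for_2 = all(u2[(a, b)] >= u2[(a, d)] for d in actions)
    return best_for_1 and best_for_2

print([(a, b) for a in actions for b in actions if is_nash(a, b)])
# [('stag', 'stag'), ('hare', 'hare')]: two equilibria, one better for both players
```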
5. Mixed Strategies Handle Uncertainty and Indifference.
As it turns out, every finite game has at least one Nash equilibrium.
Nash's Theorem. This fundamental theorem guarantees that every game with a finite number of players and strategies has at least one Nash Equilibrium, which may involve players randomizing their choices (mixed strategies). Games without pure strategy Nash Equilibria, like Matching Pennies, must have mixed strategy equilibria.
Randomization is key. A mixed strategy involves a player choosing between their pure strategies with certain probabilities. In a mixed strategy Nash Equilibrium, these probabilities make the opponent indifferent between their own pure strategies, preventing them from exploiting any predictable pattern.
Calculating probabilities. Finding mixed strategy Nash Equilibria involves setting the expected utilities of a player's pure strategies equal to each other, given the opponent's mixed strategy, and solving for the probabilities. This ensures the necessary indifference for mixing.
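For a 2x2 game the indifference condition can be solved in one line. The sketch below applies it to Matching Pennies; note that each player's mixing probability is computed from the opponent's payoffs, since it is the opponent who must be made indifferent:

```python
from fractions import Fraction

def mixing_probability(a, b, c, d):
    """Probability I put on my FIRST pure strategy so that my opponent is
    indifferent between theirs. a, b, c, d are the OPPONENT's payoffs:
      a: (I play 1st, opp plays 1st)   b: (I play 1st, opp plays 2nd)
      c: (I play 2nd, opp plays 1st)   d: (I play 2nd, opp plays 2nd)
    Solves p*a + (1-p)*c = p*b + (1-p)*d for p."""
    return Fraction(d - c, a - b - c + d)

# Matching Pennies: the row player wins (+1) on a match, the column player
# wins (+1) on a mismatch; the loser gets -1 in each case.
p_row_heads = mixing_probability(-1, 1, 1, -1)   # built from the COLUMN player's payoffs
q_col_heads = mixing_probability(1, -1, -1, 1)   # built from the ROW player's payoffs
print(p_row_heads, q_col_heads)                  # 1/2 1/2
```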
6. Sequential Games Use Game Trees and Backward Induction.
We call these types of games sequential games, since the order of play follows a sequence.
Order matters. Unlike simultaneous move games represented by matrices, sequential games involve players moving in a specific order, with later players often observing earlier moves. These are best represented by game trees (extensive form).
Game trees visualize flow. A game tree shows decision nodes, branches representing choices, and terminal nodes with final payoffs. This structure allows for analyzing the game step-by-step, considering the information available to players at each turn.
Backward induction solves. The primary method for solving sequential games without simultaneous moves is backward induction. This involves starting at the end of the game, determining the optimal moves for the last players, and using that information to determine optimal moves for players earlier in the sequence.
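Backward induction is naturally recursive. Here is a minimal sketch on a made-up two-stage tree (the tree and payoffs are illustrative, not from the book):

```python
def backward_induction(node):
    """Return the payoff vector reached when every player moves optimally.
    A node is either a terminal payoff tuple or (player_index, {action: subtree})."""
    if isinstance(node, tuple):                      # terminal node: payoffs
        return node
    player, branches = node
    # the mover picks the branch whose continuation payoff is best for them
    return max((backward_induction(subtree) for subtree in branches.values()),
               key=lambda payoffs: payoffs[player])

# Player 0 chooses Left or Right; player 1 observes the choice and responds.
tree = (0, {
    "Left":  (1, {"Up": (3, 1), "Down": (0, 0)}),
    "Right": (1, {"Up": (2, 2), "Down": (1, 3)}),
})
print(backward_induction(tree))   # (3, 1): player 0 goes Left, player 1 answers with Up
```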
7. Subgame Perfect Equilibrium Filters Incredible Threats.
Subgame perfection ensures that players only believe threats that others have incentive to carry out when it is time to execute those threats.
Credibility is crucial. While a simultaneous move game might have multiple Nash Equilibria, some may rely on threats that are not credible in a sequential setting. A Subgame Perfect Equilibrium (SPE) is a Nash Equilibrium that remains a Nash Equilibrium in every subgame of the original game.
Selten's Game example. In Selten's game (or the firm entry game), one Nash Equilibrium involves a firm threatening a price war if a competitor enters. However, if entry actually occurs, the firm would prefer to accede rather than start a costly war. The threat is incredible, and that Nash Equilibrium is not subgame perfect.
Backward induction finds SPE. Backward induction naturally identifies SPE because it forces players to consider their optimal move at every decision node, regardless of whether that node is reached in equilibrium. This eliminates strategies based on non-credible threats.
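The point can be checked directly with illustrative payoffs for the entry game (the numbers below are assumptions chosen only to preserve the ordering described above):

```python
# Payoffs are (entrant, incumbent): the incumbent prefers no entry, and
# prefers acceding to fighting once entry has happened.
payoffs = {
    ("out",   "accede"): (0, 4),
    ("out",   "war"):    (0, 4),     # the threat costs nothing if never executed
    ("enter", "accede"): (2, 2),
    ("enter", "war"):    (-1, -1),
}

# In the strategic form, "stay out / threaten war" is a Nash equilibrium:
# neither player gains from a unilateral deviation.
no_gain_entrant   = payoffs[("out", "war")][0] >= payoffs[("enter", "war")][0]
no_gain_incumbent = payoffs[("out", "war")][1] >= payoffs[("out", "accede")][1]
print(no_gain_entrant and no_gain_incumbent)          # True

# But in the subgame after entry, war is not optimal for the incumbent,
# so backward induction (subgame perfection) throws this equilibrium out.
best_after_entry = max(["accede", "war"], key=lambda a: payoffs[("enter", a)][1])
print(best_after_entry)                               # "accede", so the entrant enters
```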
8. Commitment Problems Hinder Mutually Beneficial Outcomes.
An important element of a commitment problem is the time inconsistency issue a player faces.
Inability to commit. A commitment problem arises when players cannot credibly commit to a future course of action, even if that commitment would lead to a better outcome for everyone involved. The player's incentives change over time, making their initial promise unreliable.
Police search example. In the police search game, the officer wants you to allow a quick search (mutually beneficial). However, once you consent, his incentive shifts to conducting a more extensive search. Knowing this, you cannot trust his promise and opt for the less desirable K-9 unit outcome.
Contracts and burning bridges. Solutions to commitment problems often involve external enforcement (like contracts in the Wild West example) or strategically limiting future options (like burning bridges in the military example) to make threats or promises credible.
9. Generalized Games Reveal Universal Strategic Patterns.
If we are going to encounter many different versions of battle of the sexes, it would help if we could derive a simple formula for the mixed strategy Nash equilibrium.
Beyond specific numbers. Replacing specific numerical payoffs with variables allows for analyzing entire classes of games simultaneously. This reveals underlying strategic structures and how equilibria depend on the relative values of payoffs, not just their absolute numbers.
Generalized Battle of the Sexes. By using variables (A, B, C, a, b, c) to represent preferences, a single calculation can derive the mixed strategy formula applicable to any version of Battle of the Sexes, as long as the preference ordering holds.
Identifying contradictions. Generalized games also help prove the non-existence of certain equilibria. In the generalized Prisoner's Dilemma or Deadlock, attempting to solve for a mixed strategy using variables leads to mathematical contradictions, confirming that no such equilibrium exists for any valid payoff configuration.
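The same derivation can be done symbolically. This sketch assumes a particular labeling of the payoffs (player 1 earns A when both attend player 1's favorite event, B when both attend player 2's favorite, C on miscoordination, with A > B > C), which may differ from the book's exact notation:

```python
from sympy import symbols, solve, simplify

A, B, C, q = symbols("A B C q")

# Player 1's expected utilities when player 2 attends player 1's favorite
# event with probability q.
eu_favorite = q * A + (1 - q) * C     # player 1 goes to their own favorite event
eu_other    = q * C + (1 - q) * B     # player 1 goes to player 2's favorite event

# The indifference condition pins down player 2's equilibrium mix for ANY
# payoffs with A > B > C, not just one numerical example.
q_star = solve(eu_favorite - eu_other, q)[0]
print(simplify(q_star))               # (B - C)/(A + B - 2*C), up to rearrangement
```

Because the result involves only payoff differences, every game with the same preference ordering shares the same equilibrium structure, which is what the generalized analysis is after.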
10. Comparative Statics Analyze How Strategic Changes Impact Outcomes.
At its core, comparative statics is the study of altering the strategic dimensions of an environment.
Measuring sensitivity. Comparative statics examine how changes in exogenous variables (like costs, values, or probabilities) affect the game's equilibria and outcomes. This helps predict how manipulating the strategic environment impacts player behavior and welfare.
Penalty kicks example. Analyzing the penalty kick game with a variable representing kicker accuracy shows a counterintuitive result: as the kicker's weak side improves, they kick to that side less often in equilibrium, because the goalie adjusts their strategy.
Policy implications. Comparative statics are crucial for policy design. Understanding how changing parameters (e.g., the cost of conflict in the Hawk-Dove game, the cost of calling in the Volunteer's Dilemma) influences equilibrium behavior allows for predicting the effects of interventions.
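The penalty kick result can be reproduced with a small model. The scoring probabilities below are illustrative assumptions: the kicker always scores on the strong side when the goalie dives the wrong way, scores with probability x < 1 on the weak side when the goalie dives the wrong way, and never scores when the goalie guesses correctly:

```python
from fractions import Fraction

def kicker_weak_side_probability(x):
    """Equilibrium probability sigma of aiming at the weak side in this model.
    Goalie indifference: P(save | dive weak) = sigma must equal
    P(save | dive strong) = sigma*(1 - x) + (1 - sigma), so sigma = 1 / (1 + x)."""
    x = Fraction(x)
    return 1 / (1 + x)

for x in ["1/4", "1/2", "3/4", "1"]:
    print(x, "->", kicker_weak_side_probability(x))
# 1/4 -> 4/5, 1/2 -> 2/3, 3/4 -> 4/7, 1 -> 1/2:
# as the weak side improves, the kicker aims there LESS often, because the
# goalie starts covering that side more.
```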
11. Mixed Strategy Support Requires Indifference.
In a general game, suppose the players mix in equilibrium. Then we immediately know something about the pure strategies in the support of the players’ mixed strategies: they all yield the same expected utility in equilibrium.
Indifference is necessary. For a player to be willing to randomize between multiple pure strategies in a mixed strategy Nash Equilibrium, each of those strategies must yield the exact same expected utility, given the opponent's strategy. If one strategy offered a higher payoff, the player would choose it with certainty.
Support definition. The "support" of a mixed strategy refers to the set of pure strategies that are played with positive probability. Strategies not in the support are played with zero probability.
Weak dominance shortcut. This principle provides a shortcut: if an opponent is mixing among all their strategies, a player cannot include a weakly dominated strategy in their own mixed strategy support, because the dominating strategy would yield a strictly higher expected utility.
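A short check makes the indifference property concrete. The 2x2 coordination game below uses assumed payoffs (the A = 2, B = 1, C = 0 case of the generalized example above), with the opponent playing their equilibrium mix:

```python
from fractions import Fraction

def expected_utilities(my_payoffs, opponent_mix):
    """my_payoffs[s][o]: my payoff from pure strategy s when the opponent plays o.
    opponent_mix[o]: probability the opponent plays o."""
    return {s: sum(opponent_mix[o] * u for o, u in row.items())
            for s, row in my_payoffs.items()}

payoffs = {"favorite": {"X": 2, "Y": 0},    # I go to my favorite event, X
           "other":    {"X": 0, "Y": 1}}    # I go to the other event, Y
mix = {"X": Fraction(1, 3), "Y": Fraction(2, 3)}   # the opponent's equilibrium mix

print(expected_utilities(payoffs, mix))
# {'favorite': Fraction(2, 3), 'other': Fraction(2, 3)}: both pure strategies
# in the support earn the same expected utility, as required.
```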
12. Rock-Paper-Scissors Highlights the Need for Formal Solutions.
Although you can likely guess that equilibrium, even slight changes to the payoffs quickly make guessing the solution prohibitively difficult.
Intuition vs. rigor. The basic Rock-Paper-Scissors game has an intuitive mixed strategy Nash Equilibrium (randomize evenly). However, this intuition breaks down with even minor changes to the payoffs, demonstrating the need for formal methods to derive equilibria.
Zero-sum symmetry shortcut. For symmetric, zero-sum games like Rock-Paper-Scissors, a useful shortcut is that each player's expected utility in equilibrium must be zero. This helps rule out potential mixed strategies that don't meet this condition.
Generalized solution. By using variables for payoffs and applying the indifference principle, a general formula for the mixed strategy Nash Equilibrium of any Rock-Paper-Scissors variant can be derived, providing a rigorous solution beyond simple guessing.
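Here is a symbolic sketch of one such variant, assuming the winner of each pairing gains the listed stake and the loser loses it, with ties worth zero (this payoff structure is an assumption, not necessarily the book's):

```python
from sympy import symbols, solve, simplify

a, b, c, r, p, s = symbols("a b c r p s", positive=True)

# My expected payoff from each pure strategy when the opponent plays rock,
# paper, scissors with probabilities r, p, s. By the zero-sum symmetry
# shortcut, each must equal the equilibrium value of zero.
eu_rock     = -b * p + a * s      # lose b to paper, win a against scissors
eu_paper    =  b * r - c * s      # win b against rock, lose c to scissors
eu_scissors = -a * r + c * p      # lose a to rock, win c against paper

solution = solve([eu_rock, eu_paper, r + p + s - 1], [r, p, s], dict=True)[0]
print(solution)                              # r: c/(a+b+c), p: a/(a+b+c), s: b/(a+b+c)
print(simplify(eu_scissors.subs(solution)))  # 0: scissors is indifferent too
```

With a = b = c the formula collapses to the familiar one-third on each action, and any asymmetric variant follows from the same three indifference lines rather than guesswork.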
Review Summary
Game Theory 101: The Complete Textbook receives mixed reviews, with an average rating of 3.74 out of 5. Readers appreciate it as a good introduction to game theory, praising its accessibility and clear explanations. The book is commended for its step-by-step approach and numerous examples. However, some criticize it for lack of practice problems, errors in the text, and occasional oversimplification. Many recommend using it alongside the author's YouTube series for a better understanding. Despite its flaws, it's generally considered a valuable resource for those new to game theory.