Key Takeaways
1. Groups Often Fail, Amplifying Individual Errors.
In fact, individual errors are not merely replicated but actually amplified in many group decisions—a process of “some garbage in, much garbage out.”
Individual biases persist. Behavioral science shows that individuals make predictable mistakes such as unrealistic optimism, the planning fallacy, and overconfidence, often because they rely on fast, intuitive "System 1" thinking. Surprisingly, groups don't reliably correct these errors. Instead, they often make them worse.
Groups amplify errors. When individuals are prone to biases, group discussion frequently amplifies those same biases. For example, groups are often more susceptible to the sunk-cost fallacy and framing effects than individuals are, and they can be even more prone to the planning fallacy, producing still more unrealistically optimistic project timelines.
System 1 dominates. Within groups, the rapid, emotional, and intuitive System 1 thinking often holds sway. This prevents the slower, deliberative System 2 from acting as a safeguard, leading groups to double down on individual mistakes rather than correcting them through rational analysis.
2. Informational & Social Pressures Drive Group Failures.
As a result of these two types of influences, groups run into four independent problems.
Informational signals influence. People often silence their private knowledge because they infer from what others say or do that their own information must be wrong or less valuable. If early speakers favor a view, others may defer, assuming those who spoke first have good reasons, even if they don't.
Social pressures silence. Group members also stay quiet to avoid disapproval or penalties from leaders or peers. They don't want to seem foolish, disagreeable, or not part of the team, especially if the dominant view seems clear. This pressure is stronger when leaders express strong opinions early or when the group is cohesive.
Combined effect is powerful. Both informational and social pressures lead individuals to withhold crucial, unshared information. This prevents the group from accessing the full pool of knowledge held by its members, undermining the potential benefits of collective intelligence and leading to poor decisions.
3. Cascades Lead Groups Astray by Silencing Private Knowledge.
It is no exaggeration to say that herding is the fundamental behavior of human groups.
Following the herd. Cascades occur when people follow the statements or actions of those who came before, ignoring their own private information. This can happen due to informational signals (believing others are right) or reputational pressures (fearing disapproval).
Information is lost. In a cascade, individuals don't reveal what they truly know or think. This means the group's final decision may not reflect the aggregate knowledge of its members, even if that knowledge would point to a better outcome. Early, potentially erroneous, signals can set the group on a bad path.
Examples of cascades:
- Jury members following early votes despite private doubts.
- Companies launching products based on early positive buzz, ignoring internal skepticism.
- Political decisions influenced by initial public or expert opinions.
Cascades prevent groups from accessing the dispersed information they need, often leading to confident but mistaken consensus.
4. Group Polarization Pushes Like-Minded People to Extremes.
As a general rule, deliberating groups tend to end up adopting a more extreme position in line with their inclinations before they started to talk, and a major effect of deliberation is to squelch internal diversity—and thus to push different groups apart.
Moving to extremes. When a group of like-minded people deliberates, their average opinion tends to become more extreme in the direction of their initial inclination. Groups initially favoring risk become riskier; those favoring caution become more cautious.
Why polarization happens:
- Informational: Arguments presented in the group are skewed towards the initial predisposition, exposing members to more reasons supporting that view.
- Social: Members adjust their positions slightly to align with the perceived group norm or to present themselves favorably.
- Confidence: Agreement from others increases individual confidence, leading people to abandon tentative positions and adopt more extreme ones.
Increased consensus & division. Polarization increases consensus within like-minded groups but widens the gap between groups with different initial leanings. This can be seen in political deliberation, where discussion makes liberals more liberal and conservatives more conservative, increasing societal division.
5. Groups Overlook Crucial Unshared Information (Hidden Profiles).
Unfortunately, countless studies demonstrate that this regrettable result is highly likely.
Shared knowledge dominates. Groups tend to focus disproportionately on information that is already known by all or most members, neglecting crucial information held by only one or a few. This is known as the common-knowledge effect.
Hidden profiles missed. As a result, groups often fail to uncover "hidden profiles"—accurate understandings or optimal solutions that could be reached if all members' information were pooled and considered. This happens even when the unshared information is sufficient to point to a clearly better decision.
Why unshared info is ignored:
- Statistically less likely to be mentioned.
- Less likely to be repeated or discussed if mentioned.
- Individuals may be reluctant to emphasize unique information, especially if it contradicts shared views or if they have low status.
- Discussing shared information makes members seem more competent and likable.
This failure to leverage unique knowledge means groups often perform no better, and sometimes worse, than the average of their individual members before discussion.
6. Anxious & Inquisitive Leaders Improve Group Decisions.
Anxious people like DeParle and Zients are indispensable in business and government, because they cut through, and overcome, the risk of groupthink.
Leaders set the tone. Leaders play a crucial role in counteracting group failures. They can signal that they value diverse information and perspectives, even if it means hearing bad news or dissenting opinions.
Anxiety is productive. Leaders who are anxious about potential problems and worst-case scenarios are more likely to ask probing questions and push for critical evaluation. This anxiety can be contagious, encouraging others to voice concerns and look for flaws.
Self-silencing leaders help. Leaders can also improve information flow by speaking last or tentatively, rather than stating a firm view at the outset. This prevents their status from inadvertently suppressing dissenting opinions or unique information held by lower-status members.
7. Assigning Roles & Changing Perspectives Unlocks Information.
This is a profound story, because Grove was able to shock himself out of his routine thought processes by a purely hypothetical role assignment, in which he asked Moore, and himself, what a new CEO would do.
Division of labor. Assigning specific roles or areas of expertise to group members can significantly improve information sharing. When members know that others have unique, relevant information, they are more likely to seek it out and value it.
Public identification matters. Studies show that publicly identifying who holds what expertise before deliberation begins is more effective than simply having individuals privately know their own expertise. This makes the group aware of its collective knowledge pool.
Perspective shifts break traps. Asking group members to adopt a different perspective, such as imagining what a new leader or an outsider would do, can help break through cognitive biases and groupthink. This creates critical distance and encourages fresh thinking.
8. Real Dissent (Not Just Devil's Advocates) is Vital.
The lesson is that if devil’s advocacy is to work, it will be because the dissenter actually means what she is saying.
Authentic dissent is powerful. Genuine dissenting views, even from a minority, can significantly improve group performance by challenging assumptions, introducing new information, and stimulating critical thinking. Diversity of thought is key.
Devil's advocates have limits. While appointing a devil's advocate seems like a good idea, research shows it's often less effective than real dissent. Assigned dissent can be seen as artificial or a game, and the advocate may not argue as zealously as someone with a sincere belief.
Encourage genuine challenge. Groups should foster a culture that encourages authentic disagreement and critical questioning, rather than relying on formal, potentially insincere, roles. Red teams, composed of members genuinely tasked with finding flaws, are often more effective than single devil's advocates.
9. Separate Idea Generation from Critical Selection.
For almost any problem-solving system, it is better to separate selection, with its emphasis on critical evaluations, from identification, which is best served by an uncritical, open-minded attitude.
Two distinct stages. Effective problem-solving involves two different processes: identifying potential solutions (divergent thinking) and selecting the best one (convergent thinking). Mixing these stages, especially by being overly critical during identification, hinders creativity.
Identification needs openness. The identification stage requires broad, open-minded exploration and generation of many ideas without immediate judgment. Methods like brainstorming, searching existing solutions, or seeking external input are valuable here.
Selection needs rigor. The selection stage requires critical evaluation based on predefined criteria. This is where analytical thinking, data analysis (like cost-benefit analysis), and rigorous testing are essential to choose the optimal solution from the generated set.
10. Statistical Crowds Can Be Wise, But Only Under Specific Conditions.
The central point is that groups will do better than individuals, and big groups better than little ones, so long as two conditions are met: majority rule is used and each group member is more likely than not to be correct.
Averaging can be accurate. The Condorcet Jury Theorem and the Law of Large Numbers show that the average or majority opinion of a group can be surprisingly accurate, especially for factual questions. This works best when individual judgments are independent and, on average, better than random chance.
Diversity improves accuracy. The accuracy of statistical averages is enhanced by diversity among individuals, whether in information, perspective, or approach. Diverse errors tend to cancel each other out when averaged, leading to a more accurate collective estimate.
The dark side of averages. However, if most individuals are systematically biased or more likely to be wrong than right, the average or majority opinion of a large group will also be systematically wrong. Relying on averages is dangerous when the crowd is misinformed or prejudiced.
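To make the Condorcet logic concrete, here is a minimal simulation sketch (not from the book) assuming each member votes independently and is correct with probability p. It shows both sides of the argument: with p above 0.5, majority accuracy rises with group size; with p below 0.5, it collapses.

```python
import random

def majority_vote_accuracy(p, group_size, trials=10_000):
    """Estimate how often a simple majority of independent voters,
    each correct with probability p, reaches the right answer."""
    correct_majorities = 0
    for _ in range(trials):
        correct_votes = sum(random.random() < p for _ in range(group_size))
        if correct_votes > group_size / 2:  # strict majority is correct
            correct_majorities += 1
    return correct_majorities / trials

if __name__ == "__main__":
    for p in (0.6, 0.4):          # individually better vs. worse than chance
        for n in (1, 11, 101):    # odd sizes avoid ties
            acc = majority_vote_accuracy(p, n)
            print(f"p={p}, group size={n}: majority correct ~{acc:.2f}")
```

With p = 0.6 the majority is right roughly 75% of the time at 11 members and nearly always at 101; with p = 0.4 the large group is almost always wrong, illustrating why averaging a biased crowd is dangerous.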
11. Don't Chase the Expert; Aggregate Diverse Expert Views.
Unfortunately, very few so-called experts can make good predictions.
Expertise is rare. True expertise, defined as the ability to make consistently accurate predictions about uncertain future events, is much rarer than commonly believed. Many "experts" are better at telling convincing stories than forecasting accurately.
Aggregate expert opinions. When dealing with domains where verified expertise exists (like weather forecasting or some financial analysis), it is better to aggregate the judgments of multiple experts rather than relying on a single "best" expert. Statistical groups of experts outperform individuals.
Weighting by track record. The most effective way to combine expert forecasts is often to weight them based on their objective track record of past accuracy on similar problems, rather than relying on their self-confidence or perceived status.
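The book doesn't prescribe a specific weighting formula; as one simple illustration, the sketch below (with hypothetical numbers) weights each expert's forecast by the inverse of their historical mean absolute error, so forecasters with better track records count for more.

```python
def weighted_forecast(forecasts, past_abs_errors):
    """Combine expert forecasts, weighting each expert by the inverse of
    their historical mean absolute error (lower past error -> more weight)."""
    weights = [1.0 / (err + 1e-9) for err in past_abs_errors]
    total = sum(weights)
    return sum(w * f for w, f in zip(weights, forecasts)) / total

# Hypothetical example: three experts forecast next-quarter growth (%).
forecasts = [4.0, 6.5, 5.0]
past_abs_errors = [2.0, 0.5, 1.0]   # the second expert has the best record
print(round(weighted_forecast(forecasts, past_abs_errors), 2))
```

The combined estimate lands much closer to the historically accurate expert than a simple average would, while still drawing on the full panel.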
12. Prediction Markets & Public Comment Harness Dispersed Knowledge.
By and large, the outcomes of Google’s prediction markets turned out to be stunningly accurate.
Markets aggregate information. Prediction markets, where people bet on future outcomes, are powerful tools for aggregating dispersed information. Prices in these markets often accurately reflect the collective knowledge and predict probabilities of events.
Incentives for truth-telling. Prediction markets provide strong incentives for individuals to reveal their private information through their investments, overcoming the self-silencing issues in deliberation. Traders profit from being right, regardless of social pressures.
External input is valuable. Processes like government notice-and-comment rule-making or companies seeking public feedback can tap into the vast, dispersed knowledge held by people outside the immediate group. This external input can significantly improve decisions and increase legitimacy.
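The book does not describe the mechanics of any particular market, but as one standard illustration, the sketch below uses a logarithmic market scoring rule (LMSR), a common automated market maker for yes/no prediction markets, with hypothetical trades. Each purchase of YES shares costs the buyer real money and pushes the implied probability upward, which is how private beliefs get revealed in the price.

```python
import math

def lmsr_prices(q_yes, q_no, b=100.0):
    """Implied probabilities under a logarithmic market scoring rule."""
    e_yes, e_no = math.exp(q_yes / b), math.exp(q_no / b)
    return e_yes / (e_yes + e_no), e_no / (e_yes + e_no)

def cost(q_yes, q_no, b=100.0):
    """LMSR cost function; a trade costs cost(after) - cost(before)."""
    return b * math.log(math.exp(q_yes / b) + math.exp(q_no / b))

# Hypothetical trades: three traders with optimistic private information
# each buy YES shares, moving the market's implied probability upward.
q_yes = q_no = 0.0
for shares in (20, 30, 10):
    trade_cost = cost(q_yes + shares, q_no) - cost(q_yes, q_no)
    q_yes += shares
    p_yes, _ = lmsr_prices(q_yes, q_no)
    print(f"bought {shares} YES for {trade_cost:.2f}; implied P(yes) = {p_yes:.2f}")
```

After the three purchases the implied probability rises from 0.50 to about 0.65: the price, not any group discussion, carries the aggregated signal, and traders only profit if their information turns out to be right.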
Review Summary
Wiser explores group decision-making, examining why groups often fail to correct individual errors and how they can improve. Readers found the book insightful but sometimes dry and academic. Many appreciated the analysis of group dynamics and biases, though some felt the practical advice was limited. The book's strengths include its examination of groupthink, polarization, and information cascades. While some found it repetitive or theoretical, others praised its relevance to organizational and political contexts. Overall, readers valued the book's contribution to understanding collective wisdom and group performance.