The Precipice

Existential Risk and the Future of Humanity
by Toby Ord · 2020 · 480 pages
Philosophy · Science · Politics

Key Takeaways

1. Humanity stands at a pivotal moment in history, facing unprecedented existential risks

If all goes well, human history is just beginning. Humanity is about two hundred thousand years old. But the Earth will remain habitable for hundreds of millions more—enough time for millions of future generations; enough to end disease, poverty and injustice forever; enough to create heights of flourishing unimaginable today.

Unprecedented power and peril. In the span of human history, our species has made remarkable progress, developing from small bands of hunter-gatherers to a global, technologically advanced civilization. This trajectory has accelerated dramatically in recent centuries, bringing tremendous gains in health, prosperity, and knowledge. However, our growing power has also created new existential risks - threats that could permanently destroy humanity's potential.

A critical juncture. We now find ourselves at a unique and precarious point in history. For the first time, we have the capacity to destroy ourselves through nuclear war, engineered pandemics, or other technologies. Yet we also have the potential to secure an incredibly bright future, spreading throughout the cosmos and flourishing for billions of years. The choices we make in the coming decades may determine which path we take.

The stakes are astronomical. If we navigate the current risks successfully, humanity could have an incredibly long and prosperous future ahead. We could spread to other planets, end scarcities, and achieve heights of flourishing far beyond our current imagination. But if we fail, we could cut off that entire future - trillions of lives that could have been lived, and cosmic-scale achievements that will never come to pass. Our generation thus bears an immense responsibility.

2. Natural risks to humanity are dwarfed by anthropogenic threats we've created

While there is still real risk, it has been studied in great detail and shown to be vanishingly low. It is a famous risk, but a small one. If humanity were to go extinct in the next century, it would almost certainly be from something other than an asteroid or comet impact.

Natural risks are small. Through careful study, scientists have determined that the risk of human extinction from natural causes like asteroid impacts, supervolcanoes, or stellar explosions is very low - likely less than 0.1% per century. While such events could cause major regional devastation, our global presence and technological capabilities make humanity quite resilient to these threats.

Human-caused risks dominate. The truly significant existential risks now come from our own actions and creations. Nuclear weapons, climate change, engineered pandemics, and artificial intelligence all pose much greater threats to humanity's long-term survival and flourishing. Unlike natural risks, these anthropogenic risks are increasing over time as our technological power grows.

We control our fate. The shift from primarily natural to primarily anthropogenic existential risks is actually cause for cautious optimism. While we can do little to prevent a supervolcanic eruption, we have much more control over the development and use of powerful technologies. Our fate is largely in our own hands - if we can learn to wield our growing powers responsibly.

3. Nuclear weapons and climate change pose significant but manageable existential risks

A war that would leave behind a dark age lasting centuries, before the survivors could eventually rebuild civilization to its former heights; humbled, scarred—but undefeated.

Nuclear winter is the key threat. While the immediate effects of a nuclear war would be devastating, the greater existential risk comes from the potential for nuclear winter. Firestorms in burning cities could loft enough soot into the atmosphere to block sunlight for years, causing global cooling and crop failures that could lead to mass starvation. However, human extinction from this scenario appears unlikely.

Climate risks are uncertain. The existential risks from climate change are less clear, but potentially severe in worst-case scenarios. While current warming projections are unlikely to directly cause human extinction, more extreme warming could trigger cascading effects or push us past dangerous tipping points. Key uncertainties remain around potential feedback loops and the impacts of warming beyond 6°C.

Both require global action. Addressing these risks requires international cooperation to reduce nuclear arsenals, take weapons off hair-trigger alert, and rapidly transition to clean energy. While challenging, these are achievable goals if the public and policymakers prioritize them. Compared to future risks, nuclear war and climate change are relatively well-understood and manageable threats.

4. Future risks from engineered pandemics and artificial intelligence could be catastrophic

Humanity lacks the maturity, coordination and foresight necessary to avoid making mistakes from which we could never recover. As the gap between our power and our wisdom grows, our future is subject to an ever-increasing level of risk.

Engineered pandemics. Advances in biotechnology may soon allow the creation of pathogens far more deadly than anything in nature. An engineered pandemic combining the lethality of Ebola with the contagiousness of the flu could potentially cause billions of deaths. Unlike natural pandemics, engineered ones could be optimized for maximum lethality and spread.

Artificial General Intelligence (AGI). The development of AGI - AI systems with human-level general intelligence - could be an incredible boon for humanity. But it also poses possibly the largest existential risk. An AGI system that is not perfectly aligned with human values could rapidly become extremely powerful and pursue goals destructive to humanity, potentially even causing our extinction.

Unprecedented challenge. These emerging risks are especially dangerous because we have no historical experience in managing them. They require foresight and coordinated global action before the threats fully materialize. Our systems of governance and our moral wisdom are not yet up to the task of reliably navigating these challenges.

5. Existential risk is severely neglected despite its paramount importance

Humanity spends more on ice cream every year than on ensuring that the technologies we develop do not destroy us.

Massive scale, little attention. Despite the astronomical stakes involved, existential risk receives surprisingly little attention and resources. Annual global spending on existential risk reduction is likely less than $1 billion - orders of magnitude less than we spend on other global priorities, and less than we spend on ice cream.

Structural neglect. Several factors contribute to this neglect:

  • Scope insensitivity: We struggle to emotionally grasp the difference between millions and billions of lives
  • Incentive problems: Much of the benefit of reducing existential risk accrues to future generations who can't reward us
  • Unprecedented nature: Many risks have no historical precedent, making them easy to dismiss

Reversing the neglect. Increasing attention to existential risk is one of the most impactful things we can do. Even small increases in resources devoted to these issues could have an outsized positive impact on humanity's long-term future.

6. Safeguarding humanity requires global cooperation and institutional change

To survive these challenges and secure our future, we must act now: managing the risks of today, averting those of tomorrow, and becoming the kind of society that will never pose such risks to itself again.

Global challenges require global solutions. Many existential risks, like climate change or unaligned AI, affect all of humanity and require coordinated global action to address. We need to develop new international institutions focused on long-term risks and future generations.

Institutional innovations needed. Some promising ideas include:

  • A UN body focused on existential risk reduction
  • Including representatives for future generations in governance structures
  • Making reckless endangerment of humanity an international crime
  • Improving democratic institutions' ability to handle long-term, low-probability risks

Cultural shift required. Beyond institutional changes, we need a cultural shift to take these risks seriously and think on longer timescales. We must develop the civilizational virtues of patience, prudence, and global cooperation.

7. Our potential future is vast in scale, duration, and quality if we navigate current risks

If we can reach other stars, then the whole galaxy opens up to us. The Milky Way alone contains more than 100 billion stars, and some of these will last for trillions of years, greatly extending our potential lifespan.

Cosmic timescales. If we avoid existential catastrophe, humanity could potentially survive for hundreds of millions or even billions of years. We could outlast the Earth, spreading to other planets and star systems as our current home becomes uninhabitable in the very far future.

Galactic potential. Our galaxy alone contains hundreds of billions of stars, many with potentially habitable planets. If we develop the capability for interstellar travel, we could spread throughout the Milky Way, harnessing the resources of countless worlds and star systems.

Heights of flourishing. Given such vast timescales and resources, we could achieve incredible things:

  • End scarcities of all kinds, eliminating poverty and want
  • Develop technologies far beyond our current imagination
  • Create new art forms and modes of experience
  • Gain a deep understanding of the fundamental nature of reality
  • Achieve heights of wellbeing and flourishing far beyond our current peaks

8. Preserving humanity's long-term potential is our generation's defining challenge

We need to gain this wisdom; to have this moral revolution. Because we cannot come back from extinction, we cannot wait until a threat strikes before acting—we must be proactive. And because gaining wisdom or starting a moral revolution takes time, we need to start now.

A pivotal moment. Our generation lives at arguably the most important time in human history. Our actions over the coming decades could determine whether humanity achieves its vast potential or succumbs to existential catastrophe. We bear a profound responsibility.

The great project of our time. Reducing existential risk should be one of humanity's top priorities. This includes:

  • Direct work on specific risks like climate change, nuclear war, engineered pandemics, and AI alignment
  • Improving our global capacity to anticipate and address novel threats
  • Developing the wisdom and institutional structures to manage powerful new technologies
  • Spreading concern about humanity's long-term future

A call to action. While the challenge is immense, there is reason for cautious optimism. We have the power to navigate these risks if we take them seriously. Everyone can contribute, whether through their career choices, donations, or simply by spreading concern about these issues. Our descendants may remember us as the generation that safeguarded humanity's vast potential.


Review Summary

4.01 out of 5
Average of 4k+ ratings from Goodreads and Amazon.

The Precipice receives mostly positive reviews for its thorough examination of existential risks facing humanity. Readers appreciate Ord's clear arguments, accessible writing, and thought-provoking ideas about humanity's future potential. Many find the book inspiring and consider it an important contribution to discussions on existential risk. Some critics argue that certain risks are exaggerated or that the focus on distant future generations is misplaced. Overall, reviewers praise the book's scholarly approach and its call to action for safeguarding humanity's long-term future.

About the Author

Toby Ord is an Australian philosopher and researcher based at Oxford University. He is a prominent figure in the Effective Altruism movement and a Senior Research Fellow at the Future of Humanity Institute. Ord's work focuses on global priorities, existential risk, and the long-term future of humanity. He completed his graduate studies under the supervision of Derek Parfit, a renowned moral philosopher. Ord is known for his commitment to using rational analysis to determine how to do the most good in the world. His research has influenced many in the fields of philosophy, ethics, and global catastrophic risk reduction.
