The Systems Bible

The Beginner's Guide to Systems Large and Small
by John Gall, 1977, 316 pages

Key Takeaways

1. Systems Inherently Malfunction

SYSTEMS IN GENERAL WORK POORLY OR NOT AT ALL.

Universal observation. John Gall's "Systemantics" begins with the "Primal Scenario," a fundamental observation that "Things Don't Work Very Well." This isn't a new phenomenon, but an ongoing fact of the human condition, often attributed to special circumstances rather than inherent systemic flaws. The book posits that this widespread malfunction is not due to human incompetence or malevolence, but rather to the intrinsic nature of systems themselves.

Unexpected behavior. The "Generalized Uncertainty Principle" states that complex systems exhibit unexpected behavior. Our plans often go awry, producing results we never intended, sometimes even the opposite. This unpredictability is a core characteristic, whether in biological systems (the Harvard Law of Animal Behavior: "Under precisely controlled experimental conditions, a test animal will behave as it damn well pleases") or man-made ones.

Failure is the rule. This principle is most evident in "Climax Design," where the largest and most complex creations often fail in uncanny ways. For instance, the Space Vehicle Preparation Shed generates its own weather, and the ocean liner Queen Elizabeth 2's three boiler sets failed simultaneously. This suggests that malfunction is the norm and flawless operation the rare exception, a reality often denied by those immersed in system-building.

2. New Systems Create New Problems

NEW SYSTEMS MEAN NEW PROBLEMS

Problem multiplication. When a system is established to achieve a goal, it immediately introduces a new set of problems associated with its own functioning or mere existence. For example, setting up a garbage collection system doesn't eliminate the garbage problem; it adds issues like union negotiations, truck maintenance, and funding.

Anergy conservation. This phenomenon is encapsulated in the "Law of Conservation of Anergy," which states that the total amount of anergy (effort required to align reality with human desires) in the universe is constant. Systems merely redistribute this anergy into different forms and sizes. In very large systems, a "Relativistic Shift" can even cause the total anergy to increase exponentially.

Unintended consequences. The original problem may persist, while a multitude of new, festering problems arise. Large metropolitan garbage collection systems often fail to collect garbage, while simultaneously creating new issues like striking workers, disease outbreaks, and transportation breakdowns due to refuse. The system itself becomes a new, often worse, problem.

3. Systems Grow and Encroach Relentlessly

SYSTEMS TEND TO EXPAND TO FILL THE KNOWN UNIVERSE

Inherent growth. Systems, like babies, once acquired, tend to persist, grow, and encroach. Parkinson observed that administrative staffs grow at 5-6% per annum regardless of the work to be done. Gall generalizes this: "The System Itself... Tends to Grow at 5-6% Per Annum." This expansion is a fundamental drive, checked only by external forces.
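
A back-of-the-envelope sketch (mine, not the book's) of what steady 5-6% annual growth implies: compounding at that rate roughly doubles a system's size every 12-14 years, so an administrative apparatus quadruples within a generation.

```python
# Illustrative arithmetic only: doubling time of a system growing
# at the 5-6% per annum rate Parkinson observed.
import math

for rate in (0.05, 0.06):
    doubling_years = math.log(2) / math.log(1 + rate)
    print(f"growth {rate:.0%}: size doubles in ~{doubling_years:.1f} years")

# growth 5%: size doubles in ~14.2 years
# growth 6%: size doubles in ~11.9 years
```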

Encroachment examples. This growth often manifests as encroachment, where the system pushes its work or costs onto others. Examples include:

  • "Do it yourself" movements, where consumers assemble products to save the system work.
  • The Internal Revenue Service making taxpayers compute their own taxes.
  • Self-service gas stations and supermarkets shifting labor to the consumer.

Loss of function. As systems expand and encroach, they paradoxically tend to lose basic functions. Supertankers, for instance, are too large to dock in most ports, and the U.S. Postal Service struggles to deliver a simple letter. This relentless growth often leads to a bizarre loss of the very capabilities they were designed to provide.

4. Systems Develop Their Own Selfish Goals

SYSTEMS DON’T WORK FOR YOU OR FOR ME. THEY WORK FOR THEIR OWN GOALS

Inner directives. Any system, the moment it comes into being, develops its own intrinsic goals, which often bear little relation to its stated purpose. These "intrasystem goals" always take precedence. For example, the United Nations might suspend critical global discussions to debate employee travel class.

Will to live. Fundamentally, a system behaves "as if it has a will to live," instinctively striving to maintain itself. This self-preservation drive makes dismantling even moribund systems incredibly difficult, as "The Ghost of the Old System Continues to Haunt the New." Nuclear power plants, after a few decades of use, require guarding for half a million years due to their radioactive byproducts.

Stated purpose vs. reality. The stated purpose of a system is often merely the "wish of the designer or manager," serving primarily to reveal the "delusion system" within which they operate. True understanding comes from recognizing that "the Purpose of the System is—whatever it can be used for," regardless of initial intentions.

5. Reality is Distorted Within Systems

THE SYSTEM ITSELF DOES NOT DO WHAT IT SAYS IT IS DOING

Operational fallacy. A core concept is the "Operational Fallacy": the function performed by a system is not operationally identical to the function of the same name performed by a person or a smaller system. For example, a "fresh apple" from a supermarket chain, picked green and chemically ripened, bears little resemblance to one picked directly from a tree.

Functionary's falsity. Similarly, "People in Systems Do Not Do What the System Says They Are Doing." A ship-builder in a large shipyard is actually writing contracts or planning budgets, not physically building ships. The larger and more complex the system, the less resemblance there is between a function and its name.

The F.L.A.W. The "Fundamental Law of Administrative Workings" (F.L.A.W.) states: "THINGS ARE WHAT THEY ARE REPORTED TO BE." This means the real world, for those within a system, is a filtered, distorted, and censored version. This sensory deprivation can lead to "Functionary's Fault," a state where system personnel develop "Hireling's Hypnosis," becoming oblivious to glaring errors, like a computer printing 50,000 identical bills for $111.11.

6. Systems Inevitably Kick Back

SYSTEMS TEND TO OPPOSE THEIR OWN PROPER FUNCTIONS

Le Chatelier's Principle. Borrowed from chemistry, this principle states that any natural process tends to set up conditions opposing its further operation. In systems, this translates to "The System Always Kicks Back" or "Systems Get In The Way." They actively resist their intended purpose.

Administrative encirclement. A prime example is the "Goals and Objectives mania" in organizations. When a botanist is forced to formalize his research goals, he becomes committed to a program that can be "objectively assessed," often leading to his "failure" if his actual, spontaneous research deviates from the rigid plan. The system, designed to improve efficiency, ends up hindering genuine scientific work.

Self-defeating outcomes. The very mechanisms designed to improve or control a system can become its greatest impediment. The administrators, originally meant to support professors, gain power over them, demonstrating how the system, in its attempt to function, opposes its own proper function. This "kickback" is a universal law, not a mere anomaly.

7. Beware of Positive Feedback Traps

ESCALATING THE WRONG SOLUTION DOES NOT IMPROVE THE OUTCOME

Runaway sequences. While negative feedback causes systems to oppose their function, uncontrolled positive feedback is dangerous. It can lead to a "loud squeal and loss of function" in electronic systems, or "things vibrating out of control" in political rallies. Positive feedback encourages a system's tendency to "Expand to Fill the Known Universe."

The Nasal Spray Effect. This phenomenon illustrates how applying more of a failing remedy only worsens the problem. Just as nasal spray users experience worse rebound stuffiness, reformers caught in a "positive feedback trap" escalate their efforts, making things worse. This leads to a "Runaway Sequence" where the "Solution has become part of the Problem."
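
To make the trap concrete, here is a minimal toy simulation (my own sketch; the numbers and the `simulate` function are illustrative, not from the book): a remedy that relieves the symptom now but amplifies it next round runs away whenever the rebound exceeds the relief, while a weaker rebound lets the problem subside.

```python
# Toy model of a positive-feedback trap (the "Nasal Spray Effect").
# Everything here is illustrative; nothing comes from the book.

def simulate(rebound: float, steps: int = 8) -> list[float]:
    """Track a 'symptom' when each dose clears it immediately
    but raises the baseline by rebound * dose the next round."""
    symptom, history = 1.0, []
    for _ in range(steps):
        dose = symptom                      # escalate to match the symptom
        symptom = max(0.0, symptom - dose)  # immediate relief
        symptom += rebound * dose           # delayed kickback
        history.append(round(symptom, 2))
    return history

print(simulate(rebound=1.3))  # loop gain > 1: each round worse (runaway)
print(simulate(rebound=0.5))  # loop gain < 1: kickback fades away
```

The only point of the sketch is that once the loop gain exceeds one, escalating the same remedy is precisely what keeps the runaway going.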

Strangeness as a signal. When "things are acting very strangely" and common-sense solutions fail, it's a sign you're in a feedback situation. This "Strangeness" indicates a "thermostat" or self-referential point in the system. For example, the Green Revolution, by increasing food, allowed populations to grow to higher densities, leading to more starvation—the food supply became a thermostat for population growth.

8. Intervention is Complex and Often Fails

PUSHING ON THE SYSTEM DOESN’T HELP

Futility of force. A fundamental axiom states: "BIG SYSTEMS EITHER WORK ON THEIR OWN OR THEY DON’T. IF THEY DON’T, YOU CAN’T MAKE THEM." Trying to force a non-functioning system to work better, known as "Administrator's Anxiety," is futile. It's like trying to call an elevator by repeatedly pounding the button.

The Observer Effect. Any attempt to intervene or even observe a system alters it. The "Observer Effect" means "THE SYSTEM IS ALTERED BY THE PROBE USED TO TEST IT," and "THE PROBE IS ALTERED ALSO." Winnie-the-Pooh, probing his honey-pot, ended up with no honey and his head stuck. This makes objective intervention incredibly difficult, as the system reacts to the very act of being scrutinized.

Problem in the solution. The common adage "If you are not part of the solution, you are part of the problem" is a "pseudodoxy." The correct form is: "THE SOLUTION IS OFTEN PART OF THE PROBLEM." Systems don't solve problems; they are attempted solutions. When a solution becomes self-referential, it can perpetuate the very problem it was meant to fix, locking participants into perpetual failure, as seen in college "How to Study" courses where students fail the course itself.

9. Cherish Failures and Study Your Bugs

CHERISH YOUR BUGS. STUDY THEM.

Bugs are inevitable. All complex systems, especially software, will contain "Bugs," "Glitches," or "Gremlins." Improving component reliability merely shifts the problem: "IF IT DOESN’T FAIL HERE, IT WILL FAIL THERE." Intermittent failures are the hardest to diagnose, and a bug's full effects are rarely known.

Information from failure. Despite their nuisance, bugs offer invaluable information. Every malfunction reveals "one more way in which our System can fail." Since success often means "Avoiding the Most Likely Ways to Fail," studying bugs is crucial. This transforms a negative into a learning opportunity.

Bug or bonanza? Sometimes a "bug" is an "unsuspected behavioral capability" or a "spontaneous offering of unsuspected capabilities." Alexander Fleming's "bug" (a Penicillium fungus contaminating his bacterial cultures) led to antibiotics. Charles Messier's "bugs" (fuzzy objects that were not the comets he was hunting) became the famous catalogue of star clusters and galaxies. The wise student asks: "BUG OR BONANZA?"

10. Design for Simplicity and Looseness

LOOSE SYSTEMS LAST LONGER AND FUNCTION BETTER

Avoid new systems. The first principle of systems design is negative: "DO IT WITHOUT A NEW SYSTEM IF YOU CAN." Systems are seductive, promising efficiency, but they consume time and effort in their own care and feeding. Once set up, they grow, encroach, and break down unexpectedly.

Design principles:

  • Occam's Razor: "AVOID UNNECESSARY SYSTEMS." Use existing, small systems if possible.
  • Agnes Allen's Law: "ALMOST ANYTHING IS EASIER TO GET INTO THAN OUT OF." Dismantling is often harder than setting up.
  • Systems Law of Gravity (S.L.O.G.): "SYSTEMS RUN BEST WHEN DESIGNED TO RUN DOWNHILL." Work with human tendencies, not against them.
  • Internal Friction Theorem: "LOOSE SYSTEMS LAST LONGER AND FUNCTION BETTER." Tight systems seize up or fly apart.

Bad design persists. "BAD DESIGN CAN RARELY BE OVERCOME BY MORE DESIGN, WHETHER GOOD OR BAD." Adding features to a flawed system is like "Pushing On The System." Frederick Brooks' "Bitter Bidding" advises: "PLAN TO SCRAP THE FIRST SYSTEM: YOU WILL ANYWAY."

11. Change Your Frame, Not Just the System

IF YOU CAN’T CHANGE THE SYSTEM, CHANGE THE FRAME—IT COMES TO THE SAME THING

Mental models. People's behavior within a system is shaped by their "Mental Model" of that system. If the system itself is intractable, changing this mental model can be equally effective. A jet pilot facing a seemingly short, wide runway resolves his problem by simply rotating his mental model 90 degrees.

Creative reframing. This is the art of "substituting useful metaphors for limiting metaphors." Talleyrand, at the Congress of Vienna, reframed France not as an aggressor but as a victim of Napoleon, shifting the focus from punishment to restoring rights. Bismarck embraced socialism when it was reframed as a "Standing Army in disguise."

Problem dissolution. A successful reframing doesn't "solve" the problem; it makes it "not even exist any more." Labels like "crime" or "socialism" are revealed as "artifacts of terminology," not permanent truths. This frees us from models that offer no solution, allowing us to seek more appropriate "Models of the Universe."

12. Avoid Grandiosity and Problem-Solving Fixation

A LITTLE GRANDIOSITY GOES A LONG WAY

Limits to change. The "Limit Theorems" state: "YOU CAN’T CHANGE JUST ONE THING" and "YOU CAN’T CHANGE EVERYTHING." Attempting to correct more than three variables at once is "Grandiosity" and will fail. This applies to political plans, scientific research, and personal endeavors.

Perfectionism's paradox. "IN DEALING WITH LARGE SYSTEMS, THE STRIVING FOR PERFECTION IS A SERIOUS IMPERFECTION." The belief that "THE FINAL TRUTH IS JUST AROUND THE CORNER" or that "THE SYSTEM WILL BE PERFECT" after the current revision is a delusion. Perfectionism leads to tunnel vision, diverting energy from better approaches.

Focus on what works. Instead of grand, all-encompassing solutions, the seasoned Systems-student embraces "Survivors' Souffle": "IF IT’S WORTH DOING AT ALL, IT’S WORTH DOING POORLY." Great advances rarely come from systems designed to produce them; they "TAKE PLACE BY FITS AND STARTS," often from outsiders seeing the problem as a simple puzzle.


Review Summary

3.99 out of 5
Average of 1.0K ratings from Goodreads and Amazon.

The Systems Bible receives mixed but generally positive reviews. Readers appreciate its humorous and insightful approach to systems theory, praising its concise writing and relevant examples. Many find it thought-provoking and applicable to various fields. Critics argue it lacks scientific rigor and can be overly cynical. The book's age is noted, with some examples feeling dated. Despite this, many readers consider it a valuable resource for understanding complex systems and their inherent challenges.

About the Author

John Gall was an American author and pediatrician known for his critique of systems theory. Born in 1925, he studied at St. John's College, George Washington University, and Yale before completing his pediatric training at the Mayo Clinic. Gall practiced pediatrics in Ann Arbor, Michigan, and was a faculty member at the University of Michigan for over 40 years. He authored "General Systemantics" in 1975, which introduced Gall's Law. After retiring in 2001, he moved to Minnesota and continued writing, publishing seven more books. Gall was a Fellow of the American Academy of Pediatrics and passed away in 2014.
