Here’s a concise summary of each of the 18 points from "How Complex Systems Fail" by Richard I. Cook, as presented on
https://how.complexsystems.fail/:
1. **Complex systems are intrinsically hazardous systems**: Complex systems (e.g., power grids, hospitals) are inherently risky because of their scale and the density of their interactions; a simple artifact such as a bicycle does not carry hazard of the same kind.
2. **Complex systems are heavily and successfully defended against failure**: Multiple layers of defense—technology, procedures, and human oversight—make catastrophic failures rare.
3. **Catastrophe requires multiple failures – single point failures are not sufficient**: A disaster occurs only when several small faults align; isolated failures are routinely caught by the system’s remaining defenses (see the sketch at the end of this summary).
4. **Complex systems contain changing mixtures of failures latent within them**: Hidden flaws persist in components and processes, evolving as the system operates.
5. **Complex systems run in degraded mode**: Systems routinely operate with known flaws and minor failures present, kept running by redundancy and by people working around the problems.
6. **Catastrophe is always just around the corner**: The potential for sudden, major failure is constant due to the systems' complexity and latent issues.
7. **Post-accident attribution to a ‘root cause’ is fundamentally wrong**: Failures result from multiple interacting factors, not a single cause, though society prefers simple explanations.
8. **Hindsight biases post-accident assessments of human performance**: After a failure, actions seem obviously wrong, but they were reasonable decisions given the uncertainty at the time.
9. **Human operators have dual roles: as producers and as defenders against failure**: Operators both drive the system’s output and actively prevent its collapse.
10. **All practitioner actions are gambles**: Every decision carries risk, with outcomes only clear in retrospect, not when made.
11. **Actions at the sharp end resolve all ambiguity**: Organizations are ambiguous about how much production, efficiency, and risk to trade against one another; the concrete choices of frontline operators are what resolve that ambiguity in practice.
12. **Human practitioners are the adaptable element of complex systems**: People adjust to flaws and changes, keeping systems functional despite imperfections.
13. **Human expertise in complex systems is constantly changing**: Skills evolve as systems and conditions shift, requiring continuous learning.
14. **Change introduces new forms of failure**: Modifications to systems (e.g., upgrades) can create unexpected vulnerabilities.
15. **Views of ‘cause’ limit the effectiveness of defenses against future events**: Remedies aimed at a single identified cause overlook broader systemic weaknesses and do little to prevent the next, differently shaped failure.
16. **Safety is an emergent property of systems**: It arises from the interplay of all components, not from any single part or safeguard.
17. **People continuously create safety**: Operators’ ongoing adjustments and vigilance maintain safety in dynamic, flawed systems.
18. **Failure free operations require experience with failure**: Operators learn where the boundaries of safe operation lie only through firsthand contact with failure and near-failure; paradoxically, very low failure rates deprive them of that experience.
These points collectively emphasize that failure in complex systems is multifaceted and ever-present, and that safety comes from continuous adaptation rather than from eliminating failure outright.
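
The reasoning behind points 2 and 3 can be made concrete with a toy calculation. The following is a minimal sketch, not part of Cook’s essay, assuming four hypothetical independent defensive layers that each fail on 5% of trials: individual faults show up constantly, while an alignment in which every layer fails at once is orders of magnitude rarer.

```python
import random

# Hypothetical toy model (not from Cook's essay): a system guarded by several
# independent defensive layers. Individual layer faults are common, but a
# catastrophe that requires every layer to fail at once (point 3) is rare.
LAYERS = 4            # assumed layers, e.g., alarms, procedures, redundancy, oversight
P_LAYER_FAIL = 0.05   # assumed per-layer failure probability per trial
TRIALS = 100_000

any_fault = 0         # trials with at least one layer fault (degraded mode, point 5)
catastrophes = 0      # trials where all layers failed together

for _ in range(TRIALS):
    failed = [random.random() < P_LAYER_FAIL for _ in range(LAYERS)]
    any_fault += any(failed)
    catastrophes += all(failed)

print(f"Fraction of trials with at least one fault: {any_fault / TRIALS:.3f}")    # about 0.19
print(f"Fraction with all layers failing at once:   {catastrophes / TRIALS:.6f}") # about 6e-6
```

With these assumed numbers, roughly 1 − 0.95⁴ ≈ 19% of trials contain some fault, while joint failure has probability 0.05⁴ ≈ 6 × 10⁻⁶ and may never be observed at all in 100,000 trials. Real defenses are not fully independent, which is one reason the latent failures of point 4 matter so much.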