Amplification: The act of consistently calling out problems so that help is generated and swarms the problem to contain and investigate it, so causes can be found and corrective actions can be created to prevent recurrence.
Coherence: The quality of having a unified whole, which requires that elements that interact frequently and intensely are included in the same grouping so their interactions can be well managed, and that those that are not are excluded. This is necessary for the behavior of the whole to be logical and consistent.
Control system characteristics: In control theory, the control system (in our case, the management system) must have a frequency, speed, accuracy, and detail of control greater than the underlying system being controlled (as per the Nyquist-Shannon Theorem). Otherwise, the system being controlled will tend to instability or even chaos. (In system dynamics parlance, the structure is the extent to which there is isomorphism among Layers 1, 2, and 3, and the dynamics are the stability or instability of the system.)
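The instability this entry describes can be seen in a toy simulation (an illustrative sketch, not from the book): a proportional controller on a simple first-order process stays stable when it samples fast enough, and the very same controller diverges when it samples too slowly.

```python
# Illustrative sketch: the same controller gain is stable with a fast
# control loop and unstable with a slow one. Process: dx/dt = u, with
# correction u = -gain * x recomputed once per sample period.

def simulate(gain: float, sample_period: float, steps: int = 20) -> float:
    """Return |x| after `steps` control intervals."""
    x = 1.0  # initial deviation from the setpoint
    for _ in range(steps):
        # Zero-order hold: x_{n+1} = (1 - gain * T) * x_n
        x = x - gain * sample_period * x
    return abs(x)

fast = simulate(gain=1.0, sample_period=0.5)  # factor |1 - 0.5| = 0.5 per step: decays
slow = simulate(gain=1.0, sample_period=3.0)  # factor |1 - 3.0| = 2.0 per step: blows up
```

With the same gain, only the sampling period changes; once gain × period exceeds 2, each correction overshoots by more than it corrects, and the controlled system tends to instability, as the entry says.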
Coupled: Two entities are coupled when a change in the state of one changes the state (the condition) of the other.
Decoupled: Two entities are decoupled when a change in the state of one does not change the state (the condition) of the other.
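In software terms (an illustrative sketch; both classes are invented), coupling looks like one component reaching into another's internal state, while decoupling means depending only on an agreed interface:

```python
# Coupled: the report function reads Counter's internals, so any change to
# how Counter stores its state changes (or breaks) the report's behavior.
class Counter:
    def __init__(self):
        self._events = []          # internal representation
    def record(self, event):
        self._events.append(event)

def coupled_report(counter: Counter) -> int:
    return len(counter._events)    # reaches into another component's state

# Decoupled: the report depends only on the count() interface; CounterV2 is
# free to change its internals without changing the report's state at all.
class CounterV2:
    def __init__(self):
        self._total = 0            # different internals, same interface
    def record(self, event):
        self._total += 1
    def count(self) -> int:        # the agreed interface
        return self._total

def decoupled_report(counter: CounterV2) -> int:
    return counter.count()
```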
Functionally organized: In functional organizations, experts are responsible for ensuring people within that function can do work according to the standards expected of that profession. However, when functional managers also try to determine the timing of work, they risk interference between functions that haven’t been adequately synchronized.
Incrementalization: A technique within simplification of partitioning a large problem-solving effort (a great leap) into small, incremental steps. This involves establishing a stable base and then iterating and testing changes in a few factors at a time as opposed to testing the effect of changing many factors all at once.
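A minimal sketch of incrementalization in code (the configuration factors and the validation rule are hypothetical): start from a stable, tested base, then fold in one change at a time, keeping only the changes that pass.

```python
# Illustrative sketch: change one factor at a time against a stable base,
# rather than testing the effect of changing many factors all at once.

BASELINE = {"batch_size": 10, "retries": 3, "timeout_s": 30}

def passes_test(config: dict) -> bool:
    # Stand-in for a real validation suite.
    return (config["batch_size"] <= 50
            and config["retries"] >= 1
            and config["timeout_s"] >= 5)

def apply_incrementally(baseline: dict, proposed: dict) -> dict:
    config = dict(baseline)
    assert passes_test(config), "establish a stable base before iterating"
    for factor, value in proposed.items():
        candidate = dict(config)
        candidate[factor] = value       # change one factor...
        if passes_test(candidate):      # ...test it...
            config = candidate          # ...and keep it only if it works
    return config

result = apply_incrementally(BASELINE, {"batch_size": 40, "retries": 0, "timeout_s": 10})
# retries=0 fails in isolation and is rejected; the other two changes land.
```

Had all three changes been made in one great leap, the combined test failure would not reveal which factor was at fault.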
Isomorphism: The quality of related items having similar structures so they can fit and operate together (e.g., “hand in glove”). In our context, we use isomorphic most frequently to describe the extent to which the Layer 3 social circuitry supports and enables the work being done in Layer 1 (technical object) and Layer 2 (tooling and instrumentation). When Layer 3 is not sufficiently isomorphic, the organization is in the danger zone.
(Isomorphism can also apply to other layers. An example of Layers 1 and 2 not being isomorphic: the tools aren’t available at the right time for work to occur. An example of Layers 2 and 3 not being isomorphic: parts, materials, information, etc. are not in the right place at the right time for work to occur.)
Joint problem-solving: The activity in which solving a problem requires two or more people to identify, describe, characterize, investigate, and resolve it, actively exchanging ideas, information, perspectives, etc. in a real-time, nuanced, noncoded fashion. (See also: Moving a couch.)
Knowledge capture and knowledge sharing: The deliberate commitment to (a) codify what’s discovered when problems are seen and solved, so similar experiences don’t recur locally and (b) share what has been discovered, so similar experiences can be avoided elsewhere throughout the system. How knowledge can be usefully shared varies, depending on what has to be conveyed from whom to whom, and about what. It could be as visually simple as directions on assembling an IKEA® cabinet; more complex instructions like in a cookbook; more elaborate like in a journal article, a physical part such as a jig, or code or automated tests in software; all the way to the sophistication of a simulation or virtualization, or recreated, shared problem-solving experience.*
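For the software case mentioned above, one minimal form of knowledge capture is a regression test that codifies a solved problem so it cannot silently recur (the parsing bug here is hypothetical):

```python
# Illustrative sketch: the test captures what was discovered when the
# problem was seen and solved, so the same failure doesn't recur locally.

def parse_quantity(text: str) -> int:
    """Parse a quantity field. Once failed on padded input like ' 7 '."""
    return int(text.strip())  # .strip() was the fix

def test_parse_quantity_handles_padding():
    # Codifies the discovery: upstream systems send whitespace-padded fields.
    assert parse_quantity(" 7 ") == 7

test_parse_quantity_handles_padding()
```

Sharing that test across teams that consume the same upstream data is the "share what has been discovered" half of the commitment.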
Layer 1 problem: A problem with the object on which work is being done (e.g., “I don’t understand the design or the function of this thing.”).
Layer 2 problem: A problem with the instrumentation or equipment used in the work (e.g., “I’m having problems with the equipment needed to make the part.”).
Layer 3 problem: A problem with the social circuitry or organizational wiring (e.g., “I don’t even know what part I’m supposed to be making right now.”).
Linearization: A technique within simplification of sequencing tasks associated with completing a larger set of work so that they flow successively, like a baton being passed from one person to the next. What follows is standardization for those sequences, for exchanges at partition boundaries, and for how individual tasks are performed. This creates opportunities to introduce stabilization, so when a problem occurs, that triggers a reaction that contains the problem and prevents it from enduring and from its effects spreading.
These standards allow for self-synchronization, so the system is self-pacing without requiring top-down monitoring and direction.
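A sketch of linearization with stabilization in code (the workshop steps are invented): tasks run in a fixed sequence with a standardized handoff, and a problem at any step stops the line there rather than passing the defect downstream.

```python
# Illustrative sketch: a linearized flow, like a baton passed person to person.

def sand(part):
    if part.get("defect"):
        raise ValueError("surface defect found")  # problem surfaces immediately
    return dict(part, sanded=True)

def prime(part):
    if not part.get("sanded"):
        raise ValueError("priming an unsanded part")
    return dict(part, primed=True)

def paint(part):
    return dict(part, painted=True)

LINE = [sand, prime, paint]  # the baton passes in this fixed order

def run_line(part):
    for step in LINE:
        try:
            part = step(part)  # standardized handoff: each step returns the part
        except ValueError as problem:
            # Stop the line: contain the problem and keep its effects from spreading.
            return {"stopped_at": step.__name__, "problem": str(problem)}
    return part

good = run_line({})               # flows through all three steps
bad = run_line({"defect": True})  # stops at the first step, defect contained
```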
Limitations of expediting: A reaction to a failed schedule when someone attempts to, on the fly, redirect people and products and reassign processes to “keep things moving.” For large, complex, fast-moving systems, expeditors cannot keep pace. The problem is, they’re making decisions that seem to make sense immediately and locally but might actually make matters worse—like Gene and Steve pulling a “spare painter or mover” from one room to help in another, only to realize they’ve now got compromises in both.
Limitations of scheduling: In a system that is too scheduled, it is assumed that the antidote for failures of a functionally managed system is building complex schedules that determine who does what, when, and where. The failure mode for that is trying to arrive at a “solution” that is comprehensive across all the work and all the workers. It turns out that arriving at a solution requires so many computations and calculations that it borders on impossible to generate. So, even with the best of intentions to adhere to a schedule, generating a schedule that is precise enough to solve the coordination problem often cannot be done.†
Modularization: A technique within simplification of partitioning a system that is unwieldy in its size, complexity, and inter-wiredness of relationships among its many component pieces into more, smaller, simpler, coherent pieces.
Moving a couch: An example of a situation in which those tasked with solving a problem and completing a task are coupled in their undertaking and have to engage in joint problem-solving and so must be grouped in a coherent fashion. (See also: Joint problem-solving.)
Moving-and-painting: An example of a situation that starts out poorly managed, with a chaotic and frustrating experience for the participants, resulting in a disappointing performance. The scarce and precious resource of the participants’ time and creativity is exhausted on figuring out what to do, when to do it, and with whom coordination has to occur (organizational wiring issues in Layer 3), leaving too little of those resources left to solve the actual problems of moving furniture and painting rooms (Layers 1 and 2 of technical objects and tooling and instrumentation).
However, the systematic application of slowification, simplification, and amplification mechanisms reduces the distractions of figuring out how to fit into the larger enterprise and makes it quicker and easier to solve practical problems and do the actual work for which people have been engaged.
Simplification: Reducing the number of interactions one component of the system has with other components of the same system (e.g., technical interactions between component parts in an engineered system or among people in a working group). Simplification contains three techniques: incrementalization, modularization, and linearization.
Slowification: Shifting problem-solving from performance (operation, execution) back to practice (preparation) and planning with forceful backup, stress testing, and other deliberate ways of finding flaws in thinking before they become flaws in doing.
Social circuitry (organizational wiring): The connections by which ideas, information, services, and support can flow from where they are to where they are needed so that effective, collaborative problem-solving and value creation can occur. It is the overlay of processes, procedures, routines, and norms by which individual efforts are integrated through collective action toward a common purpose.
Some of the highlights from the book are thought-provoking, and some are bitter lessons to take in. Some things are very familiar as well: we do some of the foundational processes mentioned, though we used to do them a great deal more, and better. We have become more bureaucratic over the years; doubling our size around the turn of the century has led to a slow crushing of the proverbial soul of the institution. Even with the size increase, it appears we could still pull together and find our way back to our original geek foundations. There is a great deal of knowledge and wisdom to be taken from this book. Andrew is a great writer and a very intelligent thinker. He has shone an enormous light on the most difficult challenges we face in understanding ourselves and harnessing our own creativity and energy.
The Four Norms of the Geek Way
McAfee argues that “geek companies” succeed by embracing four core cultural norms that people in the organization expect of one another:
Speed – favoring rapid experimentation and iteration over slow, hierarchical decision-making.
Ownership – pushing responsibility and autonomy down to small, empowered teams.
Science – using data, testing, and evidence to guide decisions instead of authority or intuition.
Openness – encouraging transparency, information-sharing, and free flow of ideas.
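The "Science" norm in miniature (an illustrative sketch; the conversion counts are invented, and the normal approximation stands in for a full analysis): decide between two variants with an experiment and a simple two-proportion z-test rather than by rank or gut feeling.

```python
# Illustrative sketch of evidence-driven decisions: compare conversion rates
# from an A/B test and ship the variant only if the difference is significant.
from math import erf, sqrt

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Two-sided p-value for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Normal approximation to the two-sided p-value.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Invented counts: variant A converted 480 of 10,000 users, B converted 560.
z, p = two_proportion_z(conv_a=480, n_a=10_000, conv_b=560, n_b=10_000)
decision = "ship B" if p < 0.05 else "keep testing"
```

A real experimentation program would also weigh effect size, statistical power, and multiple-testing corrections; the point is that the decision rule is written down and data-driven, not settled by seniority.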
"If you want to help win the war, inflict standard corporate operating procedure on the other side. That’s a conclusion you’re likely to reach after reading the Simple Sabotage Field Manual, which was produced in 1944 by the US Office of Strategic Services (the predecessor of the CIA). When I first heard about this document I thought it was an urban legend, but it’s real. It was declassified in 2008 and is now discussed on the CIA’s website."
The manual was aimed at people living in Norway, France, and other countries occupied by the Nazis during World War II. It offers advice to “citizen saboteurs” on how to vex their occupiers by doing everything from starting fires to clogging up toilets. But not all the recommended damage is physical; some of it is organizational. As the manual states, “[One] type of simple sabotage requires no destructive tools whatsoever… It is based on universal opportunities to make faulty decisions, to adopt a noncooperative attitude, and to induce others to follow suit.” Here are some of its “universal opportunities” for making things worse:
Insist on doing everything through “channels.” Never permit shortcuts to be taken in order to expedite decisions…
Be worried about the propriety of any decision—raise the question of whether such action as is contemplated lies within the jurisdiction of the group or whether it might conflict with the policy of some higher echelon…
Multiply the procedures and clearances involved in issuing instructions, paychecks, and so on. See that three people have to approve everything where one would do.
In 2011 Microsoft made its bad situation worse by introducing a performance review system called stack ranking, also known as the vitality curve. It required managers to rank the members of their teams and to designate the lowest-ranking ones as “below average” or worse. This had the immediate effect of making status rivalries both more common and more intense. No matter how successful or prestigious they were, people were forced into competition with their colleagues. Stack ranking created a clear human pecking order. Those who wound up highest on it got raises and promotions; those at the bottom often got fired. The resulting fights weren’t literally bloody, but they were otherwise as vicious as anything seen in chicken pens. And they were essentially never-ending, since new rankings were required twice a year. Eichenwald found that “every current and former Microsoft employee I interviewed—every one—cited stack ranking as the most destructive process inside of Microsoft.” It led to infighting, Machiavellianism, and warring factions. One thing it didn’t lead to was better results for the business.
| Norm | Meaning | Practical Example | Application Beyond Tech |
| --- | --- | --- | --- |
| Speed | Move fast; favor rapid iteration over analysis paralysis | Amazon runs thousands of small A/B tests daily | Hospitals trial new patient check-in flows in weeks, not years |
| Ownership | Push autonomy to small teams; responsibility stays local | Netflix engineers own their code from design to production | Manufacturing cells manage their own output quality |
| Science | Base decisions on data, not rank or gut feeling | Google Search rankings evolve via constant testing | Cities use pilot programs + data to guide policy (e.g., traffic calming) |
| Openness | Radical transparency; ideas > hierarchy | Open-source software projects thrive on contribution and critique | Consulting firms share dashboards & internal metrics openly |
Core practices
Run many small experiments → accept most will fail, learn fast.
Flatten hierarchy → data beats seniority.
Cultural discipline → strong norms prevent chaos.
Psychological safety → dissent is welcomed if backed by evidence.
Why it works
Handles uncertainty better than “predict & control.”
Unlocks innovation by lowering the cost of trying new things.
Builds resilience through rapid feedback and adaptation.
Failure modes it avoids
Bureaucratic drag → paralysis in fast-moving environments.
Over-planning → high cost of wrong bets.
Rule-heavy cultures → stifle initiative, slow response.
How to apply it
Start small: empower a few pilot teams with autonomy.
Measure relentlessly: replace opinion battles with data.
Encourage openness: share info across silos.
Institutionalize speed: shorten cycle times everywhere.