Matthew Syed’s second book, Black Box Thinking, followed on from his first on the science of high performance, Bounce. In Black Box Thinking, Syed uses the lenses of the healthcare and aviation industries to examine why a failure to learn from mistakes has been one of the greatest obstacles to human progress. To him, the healthcare profession has been stymied by an inherent inability to forge successes from the mould of previous failures.
Syed uses the example of bloodletting, whereby blood is drawn from a sick patient to prevent or cure illnesses, to introduce the concept of ‘closed loops’. This approach to patient care survived into the 19th century, since no one thought to properly examine the efficacy of the method. Instead, if someone died after bloodletting, they were pronounced too ill to be saved in the first place. If they survived, it was bloodletting that had saved their life. All it took was for someone to examine the evidence properly to discover that you were far likelier to die after bloodletting than without it.
Using aviation to prove the same point, Syed turns to a much more recent calamity: United Flight 173 in 1978. On an otherwise routine flight from New York’s JFK Airport to Portland, the crew felt a sudden vibration as the landing gear was lowered, and the gear’s indicator light failed to illuminate. The pilot and crew became so fixated on this problem, which they spent the next hour trying to resolve, that none of the three flight crew monitored the fuel levels. Having run entirely out of fuel, the plane crashed into suburban Portland, six miles from the airport. Attention is a scarce resource – if you focus on one thing, you lose awareness of everything else.
The difference between the healthcare and aviation professions is highlighted here: United Flight 173 was thoroughly investigated, and the way in which airline crew members were trained was altered as a result. The paradox of success is that it is built upon failure – indeed, Syed calls success “a mountain of necessary failure”.
The next part of the book focuses on the psychology of failure. Syed writes that when we are confronted with evidence that challenges our deeply held beliefs, we are more likely to reframe the evidence than to alter our beliefs. Think back to the earlier example of bloodletting: when patients died shortly after the procedure, surgeons reframed the evidence to confirm their existing beliefs, rather than let that evidence challenge their practice.
This is called ‘cognitive dissonance’: the discomfort of being faced with facts that contradict your beliefs. During the Iraq War, Tony Blair reframed the narrative every time new information came to light, in order to keep arguing that there were Weapons of Mass Destruction (WMDs) in Iraq. This touches upon another phenomenon, the ‘narrative fallacy’: we are so eager to impose patterns upon what we see, so hardwired to provide explanations, that we are capable of explaining opposite outcomes with the same cause – without noticing the inconsistency. Again, think back to bloodletting. Syed risks restating his point too often here, but the myriad examples he provides hammer home the argument that humans are, by nature, hardwired to avoid learning from failure.
The final and most useful section of Black Box Thinking concerns the business case for Syed’s earlier points and suggestions. He begins by urging us not to be afraid of early failure. In fact, many of today’s largest and most successful companies – Facebook and Amazon, for instance – released minimum viable products (MVPs) early in their development. Syed argues that there are distinct benefits to going to market with an imperfect product, rather than sheltering your product from the market until it’s ‘perfect’.
Another lesson to take away is appreciating the benefits of a bottom-up, as well as a top-down, approach. Syed had earlier given the example of Virginia Mason, a hospital in Seattle, Washington, which put in place Patient Safety Alerts that enabled junior members of staff to speak up immediately if they identified systemic problems or instances of professional negligence. This not only saved lives but served to improve and streamline the hospital’s processes. The same principles can be applied in the workplace: it is imperative to foster an environment in which junior members of staff can, without fear, suggest ways to improve existing and oftentimes long-established processes.
Syed ends with two further actionable takeaways. The first concerns marginal gains: breaking down one large goal into smaller goals is an effective way to tackle cumbersome tasks. The second is about how to brainstorm. Brainstorming is already ubiquitous in the workplace, and research suggests that encouraging debate and welcoming criticism stimulates more creative ideas. Syed, however, introduces the ‘pre-mortem’, whereby you think about why your ideas may fail before you implement them. By picking away at their shortcomings early, nascent ideas are made more robust and more likely to succeed.
Overall, Syed is strongest when making lofty comparisons between the healthcare and aviation professions; when it comes to providing the reader with clear and actionable insights, he is somewhat lacking. Nevertheless, Black Box Thinking is well worth the read, as the examples Syed employs to illustrate his points on the need for failure to create success are more than enough to get you thinking – and to start changing the fundamentals of how you work.
You can buy the book here: https://www.amazon.co.uk/Black-Box-Thinking-Surprising-Success/dp/1473613779