Problem solving may be the single capability most critical to the success of the Lean organization. Whether you call it PDCA/PDSA, DMAIC, scientific thinking, or otherwise, problem solving is the heartbeat that pumps life into the continuous improvement effort. However, if we are to recruit everybody, every day, it’s important that we understand the factors that impede our ability to solve problems effectively. There has been extensive discussion in the Lean annals of the complex philosophical, managerial, and social inhibitors that underlie improvement efforts. All the while, a much more basic and fundamental piece of the puzzle has gone relatively unexplored: our human brains.
In this three-part article, we will take an in-depth look at the ways in which our human brains, specifically our intuitive patterns of thought, impede our ability to solve problems effectively. In part one, we will explore three common biases in our innate patterns of thought that give us a distorted view of the world and inhibit our ability to accurately grasp the current condition when investigating a problem. In the subsequent two parts, we’ll develop a practical process for overcoming these cognitive biases, one that builds on the go and see approach to recruit underutilized portions of the brain into the problem solving process.
Go and see. Genchi genbutsu, if you are so inclined. It’s such a simple concept that it’s amazing we have to spend so much time stressing its importance. To grasp the current condition, simply go and see the process. Do it well enough, and the problem will practically solve itself. Sounds easy, right? Wrong. Grasping the current condition is about separating truth from fiction. It’s about distilling the probable out of the plausible. It’s about the gemba – the real place – where the real facts must be determined. But what is real? It’s not a rhetorical question.
The human brain does not perceive the world as it is, but rather as it might be. Our success as a species has been due in part to our brain’s ability to rapidly assess a situation based on limited and incomplete information, for the ultimate purpose of ensuring our survival. Our view of the world is the one that keeps us safe, which is not necessarily the one that makes us correct. To our brains, it’s acceptable that the rustling in the tall grass behind us wasn’t really a saber-toothed tiger ready to pounce, as long as we live to flee another day. Grasping the current condition seems much more difficult when you consider that at any given moment of your life, it is highly likely that your brain is lying to you.
Luckily, researchers in a wide variety of fields like psychology, neurobiology, and behavioral economics have identified very specific patterns of thought which may lead to our self-deception. Although we can’t prevent our brains from subjecting us to these mental illusions, we can improve our perception of the world by being aware of how and when we are likely to fall victim to their effects. Based on the pioneering work of Nobel Prize-winning researcher and author Daniel Kahneman, below are three common cognitive biases to which we fall victim when seeking to grasp the current condition.
- WYSIATI. No, it’s not some strange language and it’s not science-speak. It’s an acronym that stands for What You See Is All There Is. An important factor in our survival as a species is our brains’ ability to jump to conclusions based on limited evidence. Because our survival once depended on it, speed trumps fidelity on the cognitive priority list, and this bias remains the mental norm whether or not we are dealing with matters of life and death. The problems this causes are twofold. First, we believe that what we see (or can recall from memory) is truly all that there is; if we do not see it (or cannot recall it), it might as well not exist to us. Second, we are generally unaware of the quality and quantity of the information our brains have used to form an impression. That is to say, our subconscious mind compels us to jump to conclusions rapidly without first considering the extent to which we have accurately assessed the situation.
- Confirmation Bias. When our brain jumps to a conclusion, we automatically assume that the assessment of the world that led to the decision was perfect. Consequently, when evaluating the decision after the fact, we tend to seek evidence that supports our decisions while discounting (or even ignoring) evidence that refutes our understanding. What’s more, we tend to interpret ambiguous evidence in a way that favors our positions. In other words, once our minds are made up, we are quick to validate our thinking without challenging the strength or accuracy of our stance. This intuitive approach runs counter to scientific thinking, in which both supporting and falsifying evidence are sought in order to either confirm or refute a hypothesis.
- Implied Causality. An important factor in our survival was the ability to very quickly determine the causes of events in the world around us; we jumped to conclusions about causality very readily in order to keep ourselves safe. However, in a problem solving environment, this approach leaves us prone to several different types of errors in judgment. We often infer causality where there is only correlation or coincidence; hence the belief that playing classical music to babies will make them smarter. Another unfortunate outcome of this bias is that we tend to assign cause to random events; in the absence of relevant statistical information, we are very keen to see patterns and assign causes to our world where no such patterns exist.
If we take these three biases into account, a common pattern emerges that inhibits our ability to accurately grasp the current condition during the problem solving process. First, we subconsciously form an assessment of the situation based on limited and incomplete information from the world around us. Then, we quickly look for information that will substantiate our conclusions, while ignoring or discounting any evidence to the contrary. Finally, we infer a causal relationship to explain our understanding, often when only correlative or random relationships are present. The result is a poor understanding of our problems and the inability to achieve the desired pace of improvement.
In part two, we’ll continue our discussion of cognitive biases, taking a step back to understand the underlying biological causes of the flaws in our thinking. In doing so, we’ll introduce the Go See DaT technique, a method for grasping the current condition that builds on the traditional Go and See approach to protect us against the effects of our cognitive biases.