I’ve been spending a lot of time recently researching, reading, and presenting on human cognitive biases. For the uninitiated, cognitive biases are defined as
“…a systematic pattern of deviation from norm or rationality in judgment, whereby inferences about other people and situations may be drawn in an illogical fashion. Individuals create their own ‘subjective social reality’ from their perception of the input.” (Wikipedia Definition)
In other words, cognitive biases exist when there is a gap between our perception of reality and objective reality. For example, there is “confirmation bias,” the human tendency to seek out or interpret information in ways that confirm one’s existing opinions.
While the term “cognitive bias” is relatively new (it was coined in 1972 by Amos Tversky and Daniel Kahneman), researchers have already uncovered well over a hundred cognitive biases. Some are relatively tame, like the “Google effect” (or digital amnesia), the tendency to forget information that can easily be looked up. Others can lead to more disastrous consequences, like the Sunk Cost Fallacy, in which people justify increased investment in a decision based on prior investment instead of looking only at future efficacy. The Sunk Cost Effect, along with the Overconfidence Effect and the Recency Effect, played a role in the May 1996 mountain climbing tragedy, made famous in the movie Everest, that resulted in the deaths of five experienced climbers.
A great number of cognitive biases have been found through the work of behavioral economics researchers like Dan Ariely, who wrote the wonderful books Predictably Irrational and The Upside of Irrationality. Underlying all of classical economics is the concept of homo economicus, or “economic man,” who behaves rationally, maximizes individual returns, and acts in his own self-interest. Unfortunately, this is not how people actually behave: humans often act irrationally (and predictably so) because of their inherent cognitive biases. We all have a bias toward loss aversion, for example, and will choose to avoid a loss over a larger corresponding potential gain. In other words, we act as the “homo irrationalis” discovered by behavioral economics rather than the “homo economicus” predicted by classical economics.
It is our cognitive biases that cause us to make irrational decisions. Since behavioral economists found many of these cognitive biases, it was not a great leap to see how cognitive biases would be a paramount concern for the economics of software development. In my coaching practice, a great deal of my time and effort goes into helping organizations make better decisions about software development. Many times the optimal decisions are counterintuitive to people’s inherent biases, so my job (and my passion) is helping companies see the world of software development differently so that, when it comes time to make a decision, they have all the knowledge necessary to make the optimal economic choice.
One of the most prevalent biases in software development is seeing the world in a mechanistic, Tayloristic manner. Taylor’s viewpoint was fine for the old world of physical work, but it does not hold up in the complex knowledge work done by software development professionals today. Unfortunately, most of the people making software development decisions are predominantly influenced by this old, less optimal way of viewing the world and, as a result, make sub-optimal decisions. For example, in the mechanistic worldview, adding more people to an effort results in a corresponding increase in output. If there is an existing team of seven people and we add seven more, then we would (if we hold this mechanistic bias) expect the work to go roughly twice as fast. However, just as the behavioral economists found the real world to be counterintuitive to homo economicus, actual studies have found that the increased communication overhead of knowledge work nearly outpaces any incremental increase in individual productivity (see Top Performing Projects Use Small Teams). I have always said that if you want to double the productivity of a fourteen-person team, all you need to do is create two teams of seven.
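One quick way to see why communication overhead grows so fast is to count the pairwise communication channels in a team. This is only a back-of-the-envelope sketch using the familiar n(n-1)/2 channel count, with the team sizes above as illustrative inputs; it is not a productivity model from any of the studies cited.

```python
def communication_channels(team_size: int) -> int:
    """Pairwise communication channels in a team of n people: n * (n - 1) / 2."""
    return team_size * (team_size - 1) // 2

# One team of fourteen versus two independent teams of seven (illustrative only).
one_big_team = communication_channels(14)        # 91 channels
two_small_teams = 2 * communication_channels(7)  # 2 * 21 = 42 channels

print(f"One team of 14:  {one_big_team} channels")
print(f"Two teams of 7:  {two_small_teams} channels")
```

Even in this simplified picture, splitting the team cuts the number of conversations that have to happen by more than half, which is the intuition behind the joke about doubling productivity by creating two teams of seven.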
The mechanistic bias can also be seen in many of the ways the Agile philosophy is implemented. For example, the Scrum framework is often taught as a series of ceremonies and actions with little or no understanding of why those mechanistic actions are successful. “Scrum Masters” are “certified” with only two days of training and a simple test. The training deals with ideal situations, but when the Scrum Master actually has to implement Scrum, he or she is woefully unprepared. In the real world, compromises and decisions must be made. Without understanding the underlying “why” of Agile and the basic nature of software development, the decisions and compromises that are made are not optimal. In my experience, this is why project managers are tougher to train than people with no project management experience. When faced with ambiguous information and the need to make optimal decisions, project managers tend to fall back on their existing mechanistic knowledge, and the resulting decisions range from mildly irritating to completely disastrous. As I have often pointed out, to say that one was successful with waterfall reeks of confirmation bias, because it leaves open the question of whether one would have been even more successful using another methodology or framework like Lean or Scrum.
In addition to the mechanistic bias, software development suffers from another bias, the project-centric bias, which is the tendency to see all work in terms of projects. Unfortunately, the project-centric bias is so ingrained in companies that radical changes are needed in the way we view software development across all areas, including accounting. Viewing work as a project when we are actually working on a software product results in a whole raft of poor software economic decisions, like concentrating on features more than quality and security. Remember that no one washes a rental car.
As I think back on my coaching work in Agile, the blogs I have written, the many discussions I have had, and the presentations I have made, I think all of it boils down to one very simple thing – my work is about helping people understand the true nature of the software development business process and, thereby, helping them make better decisions. Understanding our cognitive biases, therefore, is extremely important for my clients and for me because, in the end, Agile is all about making better decisions.